Mar 20 06:49:36 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 06:49:36 crc restorecon[4819]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 06:49:36 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 
crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 
crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 06:49:37 crc restorecon[4819]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc 
restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 06:49:37 crc restorecon[4819]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 06:49:38 crc kubenswrapper[4971]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 06:49:38 crc kubenswrapper[4971]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 06:49:38 crc kubenswrapper[4971]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 06:49:38 crc kubenswrapper[4971]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 06:49:38 crc kubenswrapper[4971]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 06:49:38 crc kubenswrapper[4971]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.450674 4971 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458555 4971 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458586 4971 feature_gate.go:330] unrecognized feature gate: Example Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458597 4971 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458631 4971 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458640 4971 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458650 4971 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458659 4971 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458668 4971 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458679 4971 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458690 4971 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458698 4971 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458722 4971 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458730 4971 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458740 4971 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458750 4971 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458758 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458766 4971 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458774 4971 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458782 4971 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458789 4971 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458797 4971 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458805 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458812 4971 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458820 4971 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458827 4971 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458835 4971 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458843 4971 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458850 4971 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458858 4971 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458868 4971 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458878 4971 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458887 4971 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458896 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458905 4971 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458914 4971 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458923 4971 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458931 4971 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458940 4971 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458948 4971 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458956 4971 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458964 4971 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458971 4971 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458979 4971 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458988 4971 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.458996 4971 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459004 4971 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459012 4971 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459020 4971 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459028 4971 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459036 4971 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459044 4971 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459051 4971 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459060 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459068 4971 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459076 4971 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459084 4971 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459097 4971 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459106 4971 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459115 4971 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459123 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459131 4971 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459139 4971 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459148 4971 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459156 4971 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459164 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459171 4971 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459179 4971 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459186 4971 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459194 4971 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459202 4971 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.459209 4971 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460363 4971 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460387 4971 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460409 4971 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460421 4971 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460432 4971 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460442 4971 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460454 4971 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460465 4971 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460474 4971 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460483 4971 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460493 4971 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460503 4971 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460511 4971 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460521 4971 flags.go:64] FLAG: --cgroup-root=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460532 4971 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460541 4971 flags.go:64] FLAG: --client-ca-file=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460549 4971 flags.go:64] FLAG: --cloud-config=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460559 4971 flags.go:64] FLAG: --cloud-provider=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460569 4971 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460584 4971 flags.go:64] FLAG: --cluster-domain=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460595 4971 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460639 4971 flags.go:64] FLAG: --config-dir=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460651 4971 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460663 4971 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460675 4971 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460684 4971 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460693 4971 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460702 4971 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460712 4971 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460722 4971 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460731 4971 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460740 4971 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460751 4971 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460762 4971 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460772 4971 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460781 4971 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460789 4971 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460798 4971 flags.go:64] FLAG: --enable-server="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460807 4971 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460826 4971 flags.go:64] FLAG: --event-burst="100"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460835 4971 flags.go:64] FLAG: --event-qps="50"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460844 4971 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460853 4971 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460861 4971 flags.go:64] FLAG: --eviction-hard=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460872 4971 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460882 4971 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460891 4971 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460900 4971 flags.go:64] FLAG: --eviction-soft=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460910 4971 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460920 4971 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460930 4971 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460939 4971 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460948 4971 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460957 4971 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460966 4971 flags.go:64] FLAG: --feature-gates=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460977 4971 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460986 4971 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.460995 4971 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461004 4971 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461014 4971 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461023 4971 flags.go:64] FLAG: --help="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461032 4971 flags.go:64] FLAG: --hostname-override=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461041 4971 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461050 4971 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461059 4971 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461067 4971 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461076 4971 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461085 4971 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461095 4971 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461104 4971 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461112 4971 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461122 4971 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461131 4971 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461140 4971 flags.go:64] FLAG: --kube-reserved=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461149 4971 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461158 4971 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461167 4971 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461176 4971 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461185 4971 flags.go:64] FLAG: --lock-file=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461194 4971 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461203 4971 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461213 4971 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461226 4971 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461235 4971 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461244 4971 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461253 4971 flags.go:64] FLAG: --logging-format="text"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461263 4971 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461272 4971 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461281 4971 flags.go:64] FLAG: --manifest-url=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461289 4971 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461301 4971 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461310 4971 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461320 4971 flags.go:64] FLAG: --max-pods="110"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461329 4971 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461338 4971 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461347 4971 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461356 4971 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461366 4971 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461375 4971 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461384 4971 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461403 4971 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461412 4971 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461422 4971 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461431 4971 flags.go:64] FLAG: --pod-cidr=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461441 4971 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461454 4971 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461463 4971 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461480 4971 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461488 4971 flags.go:64] FLAG: --port="10250"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461498 4971 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461507 4971 flags.go:64] FLAG: --provider-id=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461516 4971 flags.go:64] FLAG: --qos-reserved=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461525 4971 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461534 4971 flags.go:64] FLAG: --register-node="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461543 4971 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461552 4971 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461566 4971 flags.go:64] FLAG: --registry-burst="10"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461575 4971 flags.go:64] FLAG: --registry-qps="5"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461584 4971 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461593 4971 flags.go:64] FLAG: --reserved-memory=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461630 4971 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461640 4971 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461650 4971 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461660 4971 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461669 4971 flags.go:64] FLAG: --runonce="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461678 4971 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461687 4971 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461697 4971 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461706 4971 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461715 4971 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461724 4971 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461733 4971 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461742 4971 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461751 4971 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461760 4971 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461768 4971 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461777 4971 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461786 4971 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461796 4971 flags.go:64] FLAG: --system-cgroups=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461805 4971 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461820 4971 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461829 4971 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461838 4971 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461849 4971 flags.go:64] FLAG: --tls-min-version=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461859 4971 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461868 4971 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461877 4971 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461887 4971 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461896 4971 flags.go:64] FLAG: --v="2"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461908 4971 flags.go:64] FLAG: --version="false"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461919 4971 flags.go:64] FLAG: --vmodule=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461929 4971 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.461939 4971 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462159 4971 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462170 4971 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462179 4971 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462188 4971 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462196 4971 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462204 4971 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462213 4971 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462222 4971 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462230 4971 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462238 4971 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462246 4971 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462254 4971 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462262 4971 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462271 4971 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462278 4971 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462286 4971 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462297 4971 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462307 4971 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462316 4971 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462325 4971 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462333 4971 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462341 4971 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462349 4971 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462358 4971 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462366 4971 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462375 4971 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462383 4971 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462390 4971 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462398 4971 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462406 4971 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462413 4971 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462421 4971 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462429 4971 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462436 4971 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462444 4971 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462451 4971 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462462 4971 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462472 4971 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462481 4971 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462490 4971 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462499 4971 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462509 4971 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462517 4971 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462525 4971 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462533 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462540 4971 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462548 4971 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462555 4971 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462563 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462573 4971 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462583 4971 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462593 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462625 4971 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462635 4971 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462643 4971 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462651 4971 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462659 4971 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462675 4971 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462683 4971 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462692 4971 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462700 4971 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462707 4971 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462715 4971 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462724 4971 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462731 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462739 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462746 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462754 4971 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462762 4971 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462770 4971 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.462779 4971 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.462807 4971 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.475156 4971 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.475217 4971 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475358 4971 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475382 4971 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475428 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475439 4971 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475451 4971 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475461 4971 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475469 4971 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475478 4971 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475487 4971 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475495 4971 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475504 4971 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475512 4971 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475522 4971 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475534 4971 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475545 4971 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475556 4971 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475567 4971 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475577 4971 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475585 4971 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475594 4971 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475631 4971 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475640 4971 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475652 4971 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475667 4971 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475678 4971 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475687 4971 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475696 4971 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475704 4971 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475713 4971 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475722 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475730 4971 feature_gate.go:330] unrecognized feature gate: 
AutomatedEtcdBackup Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475738 4971 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475747 4971 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475755 4971 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475766 4971 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475775 4971 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475783 4971 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475792 4971 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475801 4971 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475812 4971 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475822 4971 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475831 4971 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475840 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475852 4971 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475861 4971 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475870 4971 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475879 4971 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475888 4971 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475899 4971 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475910 4971 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475919 4971 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475929 4971 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475938 4971 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475947 4971 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475956 4971 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475965 4971 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475973 4971 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475982 4971 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475990 4971 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.475999 4971 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476007 4971 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476015 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476023 4971 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476032 4971 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476040 4971 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476049 4971 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476057 4971 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476065 4971 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476073 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476082 4971 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476092 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.476107 4971 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476350 4971 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476363 4971 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476375 4971 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476388 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476397 4971 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476406 4971 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476415 4971 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476424 4971 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476433 4971 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476442 4971 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476450 4971 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476460 4971 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476468 4971 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476477 4971 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476485 4971 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476494 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476502 4971 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476511 4971 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476519 4971 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476528 4971 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476536 4971 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476547 4971 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476558 4971 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476567 4971 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476578 4971 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476587 4971 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476595 4971 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476628 4971 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476637 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476646 4971 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476654 4971 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476663 4971 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476671 4971 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476680 4971 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476690 4971 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476699 4971 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476707 4971 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476716 4971 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476724 4971 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476732 4971 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476743 4971 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476752 4971 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476761 4971 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476769 4971 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476777 4971 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476787 4971 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476795 4971 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476803 4971 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476811 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476822 4971 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476833 4971 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476844 4971 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476854 4971 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476864 4971 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476874 4971 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476883 4971 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476893 4971 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476902 4971 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476910 4971 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476919 4971 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476927 4971 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476936 4971 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476944 4971 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476953 4971 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476961 4971 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476970 4971 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476978 4971 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476987 4971 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.476995 4971 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.477003 4971 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.477013 4971 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.477027 4971 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.478547 4971 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.484023 4971 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.490195 4971 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.490306 4971 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.492031 4971 server.go:997] "Starting client certificate rotation"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.492075 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.492241 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.517369 4971 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.519768 4971 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.521356 4971 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.536069 4971 log.go:25] "Validated CRI v1 runtime API"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.577230 4971 log.go:25] "Validated CRI v1 image API"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.579909 4971 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.585212 4971 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-06-40-30-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.585267 4971 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.618833 4971 manager.go:217] Machine: {Timestamp:2026-03-20 06:49:38.615013366 +0000 UTC m=+0.594887544 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e08276c0-e3e4-4e35-ad9f-5ef530b85d12 BootID:24b0987a-bded-4b95-97d3-cecbc47baa01 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:76:fb:f7 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:76:fb:f7 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:54:03:0e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:85:b4:2f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3b:46:80 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:9c:29:fa Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:e5:66:f1 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:84:36:18 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:12:0d:68:55:9a:72 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:76:d5:5d:a1:f2:74 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.619248 4971 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.619494 4971 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.621005 4971 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.621404 4971 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.621478 4971 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.621918 4971 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.621940 4971 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.622597 4971 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.622700 4971 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.623076 4971 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.623330 4971 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.628142 4971 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.628192 4971 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.628238 4971 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.628265 4971 kubelet.go:324] "Adding apiserver pod source"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.628299 4971 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.632247 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.632369 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused
Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.632409 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError"
Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.632462 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.633061 4971 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.634490 4971 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.636971 4971 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638774 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638818 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638834 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638848 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638869 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638882 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638896 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638918 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638935 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638950 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638968 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.638982 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.640501 4971 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.641293 4971 server.go:1280] "Started kubelet"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.642929 4971 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 06:49:38 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.642948 4971 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.645150 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.649856 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.649911 4971 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.650071 4971 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.650847 4971 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.650875 4971 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.650961 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.651483 4971 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.652875 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused
Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.654842 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError"
Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.653033 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e79f0dd3a3ae3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,LastTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.652763 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="200ms"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.654112 4971 factory.go:55] Registering systemd factory
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.655270 4971 factory.go:221] Registration of the systemd container factory successfully
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.656149 4971 factory.go:153] Registering CRI-O factory
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.656205 4971 factory.go:221] Registration of the crio container factory successfully
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.656360 4971 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.656400 4971 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.656415 4971 factory.go:103] Registering Raw factory
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.657244 4971 manager.go:1196] Started watching for new ooms in manager
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.659213 4971 manager.go:319] Starting recovery of all containers
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676448 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676557 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676590 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676650 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676669 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676717 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676739 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676764 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676784 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676806 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676827 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676845 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676863 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676884 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676903 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676921 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676941 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676967 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.676992 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677010 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677028 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677047 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677065 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677084 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677102 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677122 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677146 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677165 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677183 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677235 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677254 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677272 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677290 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677309 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677328 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677348 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677365 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677383 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677402 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677422 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677441 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677460 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677479 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677497 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677521 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677546 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677569 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677593 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677669 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677699 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677723 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677814 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677842 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677863 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677885 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677906 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677927 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677946 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677966 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.677987 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678006 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678024 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678042 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678060 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678078 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678096 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678114 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678131 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678149 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678167 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678185 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678203 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678221 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678242 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678261 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678278 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678297 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678315 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678332 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678351 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678370 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678388 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678407 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678433 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678458 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678482 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678506 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678531 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678554 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b"
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678573 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678591 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678641 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678668 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678695 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678721 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678741 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678769 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678856 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678882 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678902 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678921 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" 
seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678939 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678958 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.678977 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679004 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679028 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679048 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679069 4971 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679088 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679108 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679127 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679146 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679167 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679186 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679207 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679228 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679249 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679268 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679286 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679306 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679325 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679345 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679363 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679381 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679399 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679416 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679461 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679479 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679499 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679517 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679536 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679554 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679574 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679592 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679639 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679667 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679742 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679816 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" 
seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679838 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679879 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679898 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679939 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679957 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.679976 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 
06:49:38.680083 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680134 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680161 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680179 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680220 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680250 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680299 4971 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680339 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680357 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680400 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680437 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680465 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680490 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680508 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680557 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680780 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680816 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680860 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680878 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680919 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680937 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680962 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.680989 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681018 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681042 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" 
seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681063 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681080 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681099 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681117 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681137 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681189 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 
06:49:38.681210 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681228 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681246 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681264 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681283 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.681303 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.683555 4971 reconstruct.go:144] "Volume is marked device 
as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.683598 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.683724 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.683772 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.683796 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.683846 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.683924 4971 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.683965 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684011 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684030 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684051 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684069 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684113 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684154 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684173 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684211 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684266 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684309 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684349 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684372 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684414 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684435 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684451 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684527 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684568 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684587 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684628 4971 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684670 4971 reconstruct.go:97] "Volume reconstruction finished" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.684683 4971 reconciler.go:26] "Reconciler: start to sync state" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.697599 4971 manager.go:324] Recovery completed Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.714057 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.716262 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.716303 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.716316 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.717465 4971 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.717516 4971 cpu_manager.go:226] 
"Reconciling" reconcilePeriod="10s" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.717561 4971 state_mem.go:36] "Initialized new in-memory state store" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.728235 4971 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.730832 4971 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.730901 4971 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.730951 4971 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.731040 4971 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 06:49:38 crc kubenswrapper[4971]: W0320 06:49:38.731960 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.732065 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.732490 4971 policy_none.go:49] "None policy: Start" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.733884 4971 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.733913 4971 state_mem.go:35] 
"Initializing new in-memory state store" Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.752312 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.798145 4971 manager.go:334] "Starting Device Plugin manager" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.798209 4971 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.798229 4971 server.go:79] "Starting device plugin registration server" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.798875 4971 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.798903 4971 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.799412 4971 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.799588 4971 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.799660 4971 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.812882 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.831170 4971 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 06:49:38 crc kubenswrapper[4971]: 
I0320 06:49:38.831373 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.833216 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.833278 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.833302 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.833541 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.834725 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.834819 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.834931 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.834979 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.834995 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.835161 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836015 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836084 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836273 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836354 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836378 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836482 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836529 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836554 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836765 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836936 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.836997 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.837464 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.837555 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.837665 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.838699 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.838761 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.838773 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.838794 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.838811 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.838945 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.838800 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: 
I0320 06:49:38.839024 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.839062 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.840047 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.840076 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.840090 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.840182 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.840211 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.840228 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.840250 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.840278 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.841066 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.841092 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.841103 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.855948 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="400ms" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.887598 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.887663 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.887695 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.887730 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.888182 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.888253 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.888319 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.888374 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.888426 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.888704 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.888761 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.890737 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.890837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.890918 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.891018 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.899461 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.902376 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.902427 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.902447 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.902486 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 06:49:38 crc kubenswrapper[4971]: E0320 06:49:38.903199 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" 
node="crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993270 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993340 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993382 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993415 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993445 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993526 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993564 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993531 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993532 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993660 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993721 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 
06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993767 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993820 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993804 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993918 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993953 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.993940 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994044 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994096 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994108 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994194 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994327 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994406 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994480 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994568 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994487 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994675 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994693 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994514 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 06:49:38 crc kubenswrapper[4971]: I0320 06:49:38.994811 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.103793 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.105677 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.105737 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.105756 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.105790 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 06:49:39 crc kubenswrapper[4971]: E0320 06:49:39.106425 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.119:6443: connect: connection refused" node="crc" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.167306 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.197931 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.221542 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:39 crc kubenswrapper[4971]: W0320 06:49:39.224063 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f6327e2d37923522d33d5968b31401b7aef04883287c79f24db292632cf38cf8 WatchSource:0}: Error finding container f6327e2d37923522d33d5968b31401b7aef04883287c79f24db292632cf38cf8: Status 404 returned error can't find the container with id f6327e2d37923522d33d5968b31401b7aef04883287c79f24db292632cf38cf8 Mar 20 06:49:39 crc kubenswrapper[4971]: W0320 06:49:39.235913 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a8eb63770ee3e6b8a2211700ad8ed893b7e5b360f3347af12e66b503050cdc9d WatchSource:0}: Error finding container a8eb63770ee3e6b8a2211700ad8ed893b7e5b360f3347af12e66b503050cdc9d: Status 404 returned error can't find the container with id a8eb63770ee3e6b8a2211700ad8ed893b7e5b360f3347af12e66b503050cdc9d Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.236347 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.245060 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:49:39 crc kubenswrapper[4971]: W0320 06:49:39.254423 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-061ce1fadcf66eb7d683eff1d91f970cbe2a354454a01ee1954f1fe1723c0974 WatchSource:0}: Error finding container 061ce1fadcf66eb7d683eff1d91f970cbe2a354454a01ee1954f1fe1723c0974: Status 404 returned error can't find the container with id 061ce1fadcf66eb7d683eff1d91f970cbe2a354454a01ee1954f1fe1723c0974 Mar 20 06:49:39 crc kubenswrapper[4971]: E0320 06:49:39.258174 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="800ms" Mar 20 06:49:39 crc kubenswrapper[4971]: W0320 06:49:39.260479 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-851146dc0924a52b075da57e5fcc01974cf3030da04ed81bebe5e5d7874090ae WatchSource:0}: Error finding container 851146dc0924a52b075da57e5fcc01974cf3030da04ed81bebe5e5d7874090ae: Status 404 returned error can't find the container with id 851146dc0924a52b075da57e5fcc01974cf3030da04ed81bebe5e5d7874090ae Mar 20 06:49:39 crc kubenswrapper[4971]: W0320 06:49:39.266619 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-090ff87067a2668981b4bd67d8e3a0f0c1e2d1bbc371b586b2d057f4d77e0d43 
WatchSource:0}: Error finding container 090ff87067a2668981b4bd67d8e3a0f0c1e2d1bbc371b586b2d057f4d77e0d43: Status 404 returned error can't find the container with id 090ff87067a2668981b4bd67d8e3a0f0c1e2d1bbc371b586b2d057f4d77e0d43 Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.506777 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.508149 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.508200 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.508213 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.508248 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 06:49:39 crc kubenswrapper[4971]: E0320 06:49:39.508818 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.646888 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 06:49:39 crc kubenswrapper[4971]: W0320 06:49:39.678692 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 06:49:39 crc kubenswrapper[4971]: E0320 
06:49:39.678813 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.738313 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"851146dc0924a52b075da57e5fcc01974cf3030da04ed81bebe5e5d7874090ae"} Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.739774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"061ce1fadcf66eb7d683eff1d91f970cbe2a354454a01ee1954f1fe1723c0974"} Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.741048 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a8eb63770ee3e6b8a2211700ad8ed893b7e5b360f3347af12e66b503050cdc9d"} Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.742476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f6327e2d37923522d33d5968b31401b7aef04883287c79f24db292632cf38cf8"} Mar 20 06:49:39 crc kubenswrapper[4971]: I0320 06:49:39.744387 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"090ff87067a2668981b4bd67d8e3a0f0c1e2d1bbc371b586b2d057f4d77e0d43"} Mar 20 06:49:39 crc 
kubenswrapper[4971]: W0320 06:49:39.862625 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 06:49:39 crc kubenswrapper[4971]: E0320 06:49:39.863047 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 06:49:40 crc kubenswrapper[4971]: E0320 06:49:40.059072 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="1.6s" Mar 20 06:49:40 crc kubenswrapper[4971]: W0320 06:49:40.062695 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 06:49:40 crc kubenswrapper[4971]: E0320 06:49:40.062774 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 06:49:40 crc kubenswrapper[4971]: W0320 06:49:40.197875 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 06:49:40 crc kubenswrapper[4971]: E0320 06:49:40.198022 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.309223 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.311590 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.311694 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.311717 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.311802 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 06:49:40 crc kubenswrapper[4971]: E0320 06:49:40.312415 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.551934 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 06:49:40 crc kubenswrapper[4971]: E0320 06:49:40.553083 4971 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed 
certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.646892 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.749776 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc" exitCode=0 Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.749859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc"} Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.749984 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.751034 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.751074 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.751087 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.761154 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:40 crc 
kubenswrapper[4971]: I0320 06:49:40.762535 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.762599 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.762648 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.763992 4971 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b" exitCode=0 Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.764087 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b"} Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.764181 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.766497 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.766539 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.766559 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.767482 4971 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a" exitCode=0 Mar 20 06:49:40 crc 
kubenswrapper[4971]: I0320 06:49:40.767557 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.767562 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a"} Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.769207 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.769273 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.769292 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.769817 4971 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f" exitCode=0 Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.769883 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f"} Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.770013 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.770835 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.770869 4971 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.770884 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.774370 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d"} Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.774409 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524"} Mar 20 06:49:40 crc kubenswrapper[4971]: I0320 06:49:40.774425 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9"} Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.647041 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 06:49:41 crc kubenswrapper[4971]: E0320 06:49:41.659916 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="3.2s" Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.779864 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e" exitCode=0 Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.779967 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e"} Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.780014 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.781788 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.781817 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.781826 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.784085 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8f039ce86df8b26f12e0b570ebc679159993f54933090b6f74961acad0bc6ac3"} Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.784306 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.789779 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.789825 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:41 crc kubenswrapper[4971]: 
I0320 06:49:41.789845 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.793560 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda"}
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.793584 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.793667 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16"}
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.793693 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6"}
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.794586 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.794638 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.794650 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.799111 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e"}
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.799185 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.800286 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.800328 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.800345 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.808152 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd"}
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.808202 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574"}
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.808225 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d"}
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.808242 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32"}
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.913408 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.915256 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.915291 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.915300 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:41 crc kubenswrapper[4971]: I0320 06:49:41.915330 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 06:49:41 crc kubenswrapper[4971]: E0320 06:49:41.915771 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.277130 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.818288 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6254fd68aa30b54e4216ceee196f26d56e96cb708a6048a5d4639fb1b4d1f43"}
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.818651 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.820431 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.820505 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.820529 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.822371 4971 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465" exitCode=0
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.822432 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465"}
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.822555 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.822889 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.823032 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.822785 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.823384 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.824170 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.824200 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.824209 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.825455 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.825517 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.825537 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.826258 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.826310 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.826327 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.826531 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.826594 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:42 crc kubenswrapper[4971]: I0320 06:49:42.826654 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.523927 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.834199 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460"}
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.834287 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f"}
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.834311 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.834329 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf"}
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.834349 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.834389 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.836008 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.836068 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.836088 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.836645 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.836715 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.836734 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:43 crc kubenswrapper[4971]: I0320 06:49:43.870650 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.765495 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.849811 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c"}
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.849894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99"}
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.849864 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.849973 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.850006 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.849975 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851658 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851705 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851723 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851729 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851768 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851782 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851837 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851893 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:44 crc kubenswrapper[4971]: I0320 06:49:44.851915 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.115935 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.118110 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.118220 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.118252 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.118327 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.278330 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.278497 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.853553 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.855060 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.855120 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.855138 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.966332 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.966551 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.966634 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.968289 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.968354 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:45 crc kubenswrapper[4971]: I0320 06:49:45.968372 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.292902 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.293191 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.295194 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.295315 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.295353 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.304737 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.376842 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.377214 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.379076 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.379148 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.379169 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.508222 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.857345 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.857390 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.857446 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.859384 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.859462 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.859484 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.859730 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.859787 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:46 crc kubenswrapper[4971]: I0320 06:49:46.859805 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:47 crc kubenswrapper[4971]: I0320 06:49:47.860268 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:47 crc kubenswrapper[4971]: I0320 06:49:47.861727 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:47 crc kubenswrapper[4971]: I0320 06:49:47.861786 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:47 crc kubenswrapper[4971]: I0320 06:49:47.861800 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:48 crc kubenswrapper[4971]: I0320 06:49:48.579976 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 06:49:48 crc kubenswrapper[4971]: I0320 06:49:48.580220 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:48 crc kubenswrapper[4971]: I0320 06:49:48.581676 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:48 crc kubenswrapper[4971]: I0320 06:49:48.581761 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:48 crc kubenswrapper[4971]: I0320 06:49:48.581780 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:48 crc kubenswrapper[4971]: E0320 06:49:48.813513 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 06:49:49 crc kubenswrapper[4971]: I0320 06:49:49.798512 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 06:49:49 crc kubenswrapper[4971]: I0320 06:49:49.798807 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:49 crc kubenswrapper[4971]: I0320 06:49:49.800434 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:49 crc kubenswrapper[4971]: I0320 06:49:49.800487 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:49 crc kubenswrapper[4971]: I0320 06:49:49.800505 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:52 crc kubenswrapper[4971]: W0320 06:49:52.282219 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 06:49:52 crc kubenswrapper[4971]: I0320 06:49:52.282665 4971 trace.go:236] Trace[1074600314]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 06:49:42.280) (total time: 10002ms):
Mar 20 06:49:52 crc kubenswrapper[4971]: Trace[1074600314]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:49:52.282)
Mar 20 06:49:52 crc kubenswrapper[4971]: Trace[1074600314]: [10.002041818s] [10.002041818s] END
Mar 20 06:49:52 crc kubenswrapper[4971]: E0320 06:49:52.282694 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 06:49:52 crc kubenswrapper[4971]: W0320 06:49:52.635801 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 06:49:52 crc kubenswrapper[4971]: I0320 06:49:52.635925 4971 trace.go:236] Trace[1998998493]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 06:49:42.632) (total time: 10003ms):
Mar 20 06:49:52 crc kubenswrapper[4971]: Trace[1998998493]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10003ms (06:49:52.635)
Mar 20 06:49:52 crc kubenswrapper[4971]: Trace[1998998493]: [10.003751098s] [10.003751098s] END
Mar 20 06:49:52 crc kubenswrapper[4971]: E0320 06:49:52.635956 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 06:49:52 crc kubenswrapper[4971]: I0320 06:49:52.647510 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 20 06:49:52 crc kubenswrapper[4971]: E0320 06:49:52.700368 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.189e79f0dd3a3ae3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,LastTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:49:52 crc kubenswrapper[4971]: W0320 06:49:52.726716 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 06:49:52 crc kubenswrapper[4971]: I0320 06:49:52.726854 4971 trace.go:236] Trace[1728947473]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 06:49:42.724) (total time: 10001ms):
Mar 20 06:49:52 crc kubenswrapper[4971]: Trace[1728947473]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:49:52.726)
Mar 20 06:49:52 crc kubenswrapper[4971]: Trace[1728947473]: [10.001919455s] [10.001919455s] END
Mar 20 06:49:52 crc kubenswrapper[4971]: E0320 06:49:52.726889 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 06:49:52 crc kubenswrapper[4971]: W0320 06:49:52.963460 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 06:49:52 crc kubenswrapper[4971]: I0320 06:49:52.963577 4971 trace.go:236] Trace[152380446]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 06:49:42.962) (total time: 10001ms):
Mar 20 06:49:52 crc kubenswrapper[4971]: Trace[152380446]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:49:52.963)
Mar 20 06:49:52 crc kubenswrapper[4971]: Trace[152380446]: [10.001529052s] [10.001529052s] END
Mar 20 06:49:52 crc kubenswrapper[4971]: E0320 06:49:52.963622 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 06:49:53 crc kubenswrapper[4971]: E0320 06:49:53.238917 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:53Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 06:49:53 crc kubenswrapper[4971]: E0320 06:49:53.251861 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:53Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.254620 4971 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.254713 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 06:49:53 crc kubenswrapper[4971]: E0320 06:49:53.260378 4971 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.262441 4971 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.262485 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.493255 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.493540 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.495147 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.495188 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.495199 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.530819 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.649276 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:53Z is after 2026-02-23T05:33:13Z
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.886865 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.888791 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6254fd68aa30b54e4216ceee196f26d56e96cb708a6048a5d4639fb1b4d1f43" exitCode=255
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.888952 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.889817 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d6254fd68aa30b54e4216ceee196f26d56e96cb708a6048a5d4639fb1b4d1f43"}
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.889960 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.891089 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.891140 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.891150 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.891221 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.891295 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.891317 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.892244 4971 scope.go:117] "RemoveContainer" containerID="d6254fd68aa30b54e4216ceee196f26d56e96cb708a6048a5d4639fb1b4d1f43"
Mar 20 06:49:53 crc kubenswrapper[4971]: I0320 06:49:53.912004 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.651390 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:54Z is after 2026-02-23T05:33:13Z
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.895837 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.898454 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8"}
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.898631 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.898712 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.900099 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.900186 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.900207 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.900134 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.900285 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:49:54 crc kubenswrapper[4971]: I0320 06:49:54.900306 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.277705 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.277811 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.651972 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T06:49:55Z is after 2026-02-23T05:33:13Z Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.903952 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.904863 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.907875 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8" exitCode=255 Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.907950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8"} Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.908322 4971 scope.go:117] "RemoveContainer" containerID="d6254fd68aa30b54e4216ceee196f26d56e96cb708a6048a5d4639fb1b4d1f43" Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.908522 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.909903 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.909970 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.909992 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:55 crc 
kubenswrapper[4971]: I0320 06:49:55.910830 4971 scope.go:117] "RemoveContainer" containerID="f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8" Mar 20 06:49:55 crc kubenswrapper[4971]: E0320 06:49:55.911211 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:49:55 crc kubenswrapper[4971]: I0320 06:49:55.975140 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:56 crc kubenswrapper[4971]: I0320 06:49:56.654078 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:56Z is after 2026-02-23T05:33:13Z Mar 20 06:49:56 crc kubenswrapper[4971]: W0320 06:49:56.785164 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:56Z is after 2026-02-23T05:33:13Z Mar 20 06:49:56 crc kubenswrapper[4971]: E0320 06:49:56.785295 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:49:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 06:49:56 crc kubenswrapper[4971]: I0320 06:49:56.914820 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 06:49:56 crc kubenswrapper[4971]: I0320 06:49:56.918217 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:56 crc kubenswrapper[4971]: I0320 06:49:56.923922 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:56 crc kubenswrapper[4971]: I0320 06:49:56.923974 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:56 crc kubenswrapper[4971]: I0320 06:49:56.923992 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:56 crc kubenswrapper[4971]: I0320 06:49:56.925010 4971 scope.go:117] "RemoveContainer" containerID="f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8" Mar 20 06:49:56 crc kubenswrapper[4971]: E0320 06:49:56.925553 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:49:56 crc kubenswrapper[4971]: I0320 06:49:56.926341 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:57 crc kubenswrapper[4971]: W0320 06:49:57.480774 4971 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:57Z is after 2026-02-23T05:33:13Z Mar 20 06:49:57 crc kubenswrapper[4971]: E0320 06:49:57.481523 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 06:49:57 crc kubenswrapper[4971]: W0320 06:49:57.480837 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:57Z is after 2026-02-23T05:33:13Z Mar 20 06:49:57 crc kubenswrapper[4971]: E0320 06:49:57.482137 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 06:49:57 crc kubenswrapper[4971]: I0320 06:49:57.651347 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:57Z is after 2026-02-23T05:33:13Z Mar 20 06:49:57 crc kubenswrapper[4971]: I0320 06:49:57.921087 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:57 crc kubenswrapper[4971]: I0320 06:49:57.922544 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:57 crc kubenswrapper[4971]: I0320 06:49:57.922596 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:57 crc kubenswrapper[4971]: I0320 06:49:57.922642 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:57 crc kubenswrapper[4971]: I0320 06:49:57.923553 4971 scope.go:117] "RemoveContainer" containerID="f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8" Mar 20 06:49:57 crc kubenswrapper[4971]: E0320 06:49:57.923985 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:49:58 crc kubenswrapper[4971]: I0320 06:49:58.580377 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:49:58 crc kubenswrapper[4971]: I0320 06:49:58.651230 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:58Z is after 2026-02-23T05:33:13Z Mar 20 06:49:58 crc kubenswrapper[4971]: E0320 06:49:58.813720 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 06:49:58 crc kubenswrapper[4971]: I0320 06:49:58.925472 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:58 crc kubenswrapper[4971]: I0320 06:49:58.927717 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:58 crc kubenswrapper[4971]: I0320 06:49:58.927795 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:58 crc kubenswrapper[4971]: I0320 06:49:58.927822 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:58 crc kubenswrapper[4971]: I0320 06:49:58.929038 4971 scope.go:117] "RemoveContainer" containerID="f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8" Mar 20 06:49:58 crc kubenswrapper[4971]: E0320 06:49:58.929386 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:49:59 crc kubenswrapper[4971]: W0320 06:49:59.151092 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-20T06:49:59Z is after 2026-02-23T05:33:13Z Mar 20 06:49:59 crc kubenswrapper[4971]: E0320 06:49:59.151502 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 06:49:59 crc kubenswrapper[4971]: I0320 06:49:59.639381 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:49:59 crc kubenswrapper[4971]: I0320 06:49:59.641777 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:49:59 crc kubenswrapper[4971]: I0320 06:49:59.641865 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:49:59 crc kubenswrapper[4971]: I0320 06:49:59.641894 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:49:59 crc kubenswrapper[4971]: I0320 06:49:59.641955 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 06:49:59 crc kubenswrapper[4971]: E0320 06:49:59.647102 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:59Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 06:49:59 crc kubenswrapper[4971]: I0320 06:49:59.651826 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:59Z is after 2026-02-23T05:33:13Z Mar 20 06:49:59 crc kubenswrapper[4971]: E0320 06:49:59.657069 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:49:59Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 06:50:00 crc kubenswrapper[4971]: I0320 06:50:00.143418 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:50:00 crc kubenswrapper[4971]: I0320 06:50:00.143715 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:00 crc kubenswrapper[4971]: I0320 06:50:00.145249 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:00 crc kubenswrapper[4971]: I0320 06:50:00.145299 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:00 crc kubenswrapper[4971]: I0320 06:50:00.145316 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:00 crc kubenswrapper[4971]: I0320 06:50:00.146117 4971 scope.go:117] "RemoveContainer" containerID="f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8" Mar 20 06:50:00 crc kubenswrapper[4971]: E0320 06:50:00.146450 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:50:00 crc kubenswrapper[4971]: I0320 06:50:00.651235 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:00Z is after 2026-02-23T05:33:13Z Mar 20 06:50:01 crc kubenswrapper[4971]: I0320 06:50:01.550489 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 06:50:01 crc kubenswrapper[4971]: E0320 06:50:01.557008 4971 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 06:50:01 crc kubenswrapper[4971]: I0320 06:50:01.650927 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:01Z is after 2026-02-23T05:33:13Z Mar 20 06:50:02 crc kubenswrapper[4971]: I0320 06:50:02.651642 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-20T06:50:02Z is after 2026-02-23T05:33:13Z Mar 20 06:50:02 crc kubenswrapper[4971]: E0320 06:50:02.704379 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:02Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e79f0dd3a3ae3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,LastTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:03 crc kubenswrapper[4971]: I0320 06:50:03.650117 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:03Z is after 2026-02-23T05:33:13Z Mar 20 06:50:04 crc kubenswrapper[4971]: W0320 06:50:04.228810 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:04Z is after 2026-02-23T05:33:13Z Mar 20 06:50:04 crc kubenswrapper[4971]: E0320 06:50:04.228928 4971 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 06:50:04 crc kubenswrapper[4971]: I0320 06:50:04.650565 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:04Z is after 2026-02-23T05:33:13Z Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.278810 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.278955 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.279081 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.279348 4971 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.281269 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.281343 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.281389 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.282308 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.282646 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524" gracePeriod=30 Mar 20 06:50:05 crc kubenswrapper[4971]: W0320 06:50:05.601989 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 2026-02-23T05:33:13Z Mar 20 06:50:05 crc kubenswrapper[4971]: E0320 06:50:05.602096 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.652076 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 2026-02-23T05:33:13Z Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.954302 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.955149 4971 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524" exitCode=255 Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.955267 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524"} Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.955342 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f"} Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.955501 4971 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.957219 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.957284 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:05 crc kubenswrapper[4971]: I0320 06:50:05.957306 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:06 crc kubenswrapper[4971]: W0320 06:50:06.504936 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:06Z is after 2026-02-23T05:33:13Z Mar 20 06:50:06 crc kubenswrapper[4971]: E0320 06:50:06.505048 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 06:50:06 crc kubenswrapper[4971]: I0320 06:50:06.647590 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:06 crc kubenswrapper[4971]: I0320 06:50:06.649936 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:06 crc kubenswrapper[4971]: I0320 06:50:06.650003 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:06 
crc kubenswrapper[4971]: I0320 06:50:06.650028 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:06 crc kubenswrapper[4971]: I0320 06:50:06.650074 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 06:50:06 crc kubenswrapper[4971]: I0320 06:50:06.654253 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:06Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:06 crc kubenswrapper[4971]: E0320 06:50:06.659369 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:06Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 06:50:06 crc kubenswrapper[4971]: E0320 06:50:06.665249 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:06Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 06:50:07 crc kubenswrapper[4971]: I0320 06:50:07.651252 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:07Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:08 crc kubenswrapper[4971]: I0320 06:50:08.651730 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:08Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:08 crc kubenswrapper[4971]: E0320 06:50:08.813860 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 06:50:09 crc kubenswrapper[4971]: I0320 06:50:09.650207 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:09Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:10 crc kubenswrapper[4971]: I0320 06:50:10.650254 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:10Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:11 crc kubenswrapper[4971]: W0320 06:50:11.131593 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:11Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:11 crc kubenswrapper[4971]: E0320 06:50:11.131765 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 06:50:11 crc kubenswrapper[4971]: I0320 06:50:11.650965 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:11Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:12 crc kubenswrapper[4971]: I0320 06:50:12.277128 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 06:50:12 crc kubenswrapper[4971]: I0320 06:50:12.277369 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:12 crc kubenswrapper[4971]: I0320 06:50:12.279061 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:12 crc kubenswrapper[4971]: I0320 06:50:12.279124 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:12 crc kubenswrapper[4971]: I0320 06:50:12.279149 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:12 crc kubenswrapper[4971]: I0320 06:50:12.650971 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:12Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:12 crc kubenswrapper[4971]: E0320 06:50:12.710362 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:12Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e79f0dd3a3ae3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,LastTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.524026 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.524241 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.525939 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.525999 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.526019 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.651253 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:13Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.660671 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.662643 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.662702 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.662722 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:13 crc kubenswrapper[4971]: I0320 06:50:13.662766 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 06:50:13 crc kubenswrapper[4971]: E0320 06:50:13.669437 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:13Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 06:50:13 crc kubenswrapper[4971]: E0320 06:50:13.672065 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:13Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 06:50:14 crc kubenswrapper[4971]: I0320 06:50:14.656349 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:14Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:14 crc kubenswrapper[4971]: I0320 06:50:14.732042 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:14 crc kubenswrapper[4971]: I0320 06:50:14.734346 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:14 crc kubenswrapper[4971]: I0320 06:50:14.734456 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:14 crc kubenswrapper[4971]: I0320 06:50:14.734485 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:14 crc kubenswrapper[4971]: I0320 06:50:14.735789 4971 scope.go:117] "RemoveContainer" containerID="f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.278163 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.278286 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.650999 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:15Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.988671 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.989907 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.993349 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bfc43ecf5399618bd0f8a950fd2756f692cdf577f176e76079f6372c5baebb9d" exitCode=255
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.993445 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bfc43ecf5399618bd0f8a950fd2756f692cdf577f176e76079f6372c5baebb9d"}
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.993589 4971 scope.go:117] "RemoveContainer" containerID="f3b3a6f0c1bc4a75dfb46d3d4c2dbea21de579889aff0678167081af0b199bc8"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.993815 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.995314 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.995370 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.995389 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:15 crc kubenswrapper[4971]: I0320 06:50:15.996515 4971 scope.go:117] "RemoveContainer" containerID="bfc43ecf5399618bd0f8a950fd2756f692cdf577f176e76079f6372c5baebb9d"
Mar 20 06:50:15 crc kubenswrapper[4971]: E0320 06:50:15.996959 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 06:50:16 crc kubenswrapper[4971]: I0320 06:50:16.651552 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:16Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:16 crc kubenswrapper[4971]: I0320 06:50:16.998962 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 06:50:17 crc kubenswrapper[4971]: I0320 06:50:17.652802 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:17Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:18 crc kubenswrapper[4971]: I0320 06:50:18.136509 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 06:50:18 crc kubenswrapper[4971]: E0320 06:50:18.142861 4971 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 06:50:18 crc kubenswrapper[4971]: E0320 06:50:18.144121 4971 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 20 06:50:18 crc kubenswrapper[4971]: I0320 06:50:18.580800 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 06:50:18 crc kubenswrapper[4971]: I0320 06:50:18.581144 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:18 crc kubenswrapper[4971]: I0320 06:50:18.583317 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:18 crc kubenswrapper[4971]: I0320 06:50:18.583400 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:18 crc kubenswrapper[4971]: I0320 06:50:18.583420 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:18 crc kubenswrapper[4971]: I0320 06:50:18.584494 4971 scope.go:117] "RemoveContainer" containerID="bfc43ecf5399618bd0f8a950fd2756f692cdf577f176e76079f6372c5baebb9d"
Mar 20 06:50:18 crc kubenswrapper[4971]: E0320 06:50:18.584865 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 06:50:18 crc kubenswrapper[4971]: I0320 06:50:18.651183 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:18Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:18 crc kubenswrapper[4971]: E0320 06:50:18.814413 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 06:50:19 crc kubenswrapper[4971]: I0320 06:50:19.650788 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:19Z is after 2026-02-23T05:33:13Z
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.142531 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.143341 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.145381 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.145491 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.145512 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.146873 4971 scope.go:117] "RemoveContainer" containerID="bfc43ecf5399618bd0f8a950fd2756f692cdf577f176e76079f6372c5baebb9d"
Mar 20 06:50:20 crc kubenswrapper[4971]: E0320 06:50:20.147303 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.654084 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.670404 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.672726 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.672815 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.672839 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:20 crc kubenswrapper[4971]: I0320 06:50:20.672894 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 06:50:20 crc kubenswrapper[4971]: E0320 06:50:20.681998 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 06:50:20 crc kubenswrapper[4971]: E0320 06:50:20.682041 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 06:50:21 crc kubenswrapper[4971]: I0320 06:50:21.653207 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 06:50:22 crc kubenswrapper[4971]: I0320 06:50:22.653174 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.717797 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0dd3a3ae3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,LastTimestamp:2026-03-20 06:49:38.641246947 +0000 UTC m=+0.621121115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.724543 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3454d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,LastTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.731821 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3a01f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,LastTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.739771 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3cb5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716322652 +0000 UTC m=+0.696196800,LastTimestamp:2026-03-20 06:49:38.716322652 +0000 UTC m=+0.696196800,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.747325 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e6ebe18d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.803884429 +0000 UTC m=+0.783758577,LastTimestamp:2026-03-20 06:49:38.803884429 +0000 UTC m=+0.783758577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.755543 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3454d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3454d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,LastTimestamp:2026-03-20 06:49:38.83325493 +0000 UTC m=+0.813129078,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.763428 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3a01f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3a01f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,LastTimestamp:2026-03-20 06:49:38.833290769 +0000 UTC m=+0.813164917,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.771301 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3cb5c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3cb5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716322652 +0000 UTC m=+0.696196800,LastTimestamp:2026-03-20 06:49:38.833310249 +0000 UTC m=+0.813184397,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.779472 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3454d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3454d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,LastTimestamp:2026-03-20 06:49:38.834957954 +0000 UTC m=+0.814832102,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.785703 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3a01f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3a01f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,LastTimestamp:2026-03-20 06:49:38.834989963 +0000 UTC m=+0.814864121,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.793012 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3cb5c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3cb5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716322652 +0000 UTC m=+0.696196800,LastTimestamp:2026-03-20 06:49:38.835003283 +0000 UTC m=+0.814877431,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.799799 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3454d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3454d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,LastTimestamp:2026-03-20 06:49:38.836318321 +0000 UTC m=+0.816192489,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.808091 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3a01f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3a01f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,LastTimestamp:2026-03-20 06:49:38.83636807 +0000 UTC m=+0.816242238,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.815684 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3cb5c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3cb5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716322652 +0000 UTC m=+0.696196800,LastTimestamp:2026-03-20 06:49:38.83639029 +0000 UTC m=+0.816264468,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.823097 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3454d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3454d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,LastTimestamp:2026-03-20 06:49:38.836513819 +0000 UTC m=+0.816387967,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.832087 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3a01f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3a01f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,LastTimestamp:2026-03-20 06:49:38.836548099 +0000 UTC m=+0.816422257,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.839452 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3cb5c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3cb5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716322652 +0000 UTC m=+0.696196800,LastTimestamp:2026-03-20 06:49:38.836568768 +0000 UTC m=+0.816442916,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.846851 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3454d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3454d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,LastTimestamp:2026-03-20 06:49:38.837504199 +0000 UTC m=+0.817378367,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.853344 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3a01f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\"
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3a01f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,LastTimestamp:2026-03-20 06:49:38.837593399 +0000 UTC m=+0.817467567,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.859991 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3cb5c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3cb5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716322652 +0000 UTC m=+0.696196800,LastTimestamp:2026-03-20 06:49:38.837687268 +0000 UTC m=+0.817561446,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.866535 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3454d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3454d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,LastTimestamp:2026-03-20 06:49:38.838723788 +0000 UTC m=+0.818597936,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.873402 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3454d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3454d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716288333 +0000 UTC m=+0.696162491,LastTimestamp:2026-03-20 06:49:38.838784707 +0000 UTC m=+0.818658855,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.880100 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3a01f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3a01f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node 
crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,LastTimestamp:2026-03-20 06:49:38.838795657 +0000 UTC m=+0.818669805,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.888008 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3a01f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3a01f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716311583 +0000 UTC m=+0.696185731,LastTimestamp:2026-03-20 06:49:38.838805477 +0000 UTC m=+0.818679625,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.894843 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e79f0e1b3cb5c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e79f0e1b3cb5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:38.716322652 +0000 UTC 
m=+0.696196800,LastTimestamp:2026-03-20 06:49:38.838819297 +0000 UTC m=+0.818693445,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.903973 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e79f100df480d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.239266317 +0000 UTC m=+1.219140495,LastTimestamp:2026-03-20 06:49:39.239266317 +0000 UTC m=+1.219140495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.910428 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f100e3b584 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.239556484 +0000 UTC m=+1.219430632,LastTimestamp:2026-03-20 06:49:39.239556484 +0000 UTC m=+1.219430632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.917203 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f1021844c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.259778241 +0000 UTC m=+1.239652419,LastTimestamp:2026-03-20 06:49:39.259778241 +0000 UTC m=+1.239652419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.924383 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f102668cb9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.264908473 +0000 UTC m=+1.244782621,LastTimestamp:2026-03-20 06:49:39.264908473 +0000 UTC m=+1.244782621,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.931553 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f102b896ab openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.270284971 +0000 UTC 
m=+1.250159119,LastTimestamp:2026-03-20 06:49:39.270284971 +0000 UTC m=+1.250159119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.938349 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f126a3ec8d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.872910477 +0000 UTC m=+1.852784625,LastTimestamp:2026-03-20 06:49:39.872910477 +0000 UTC m=+1.852784625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.945644 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f126be31b9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created 
container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.874632121 +0000 UTC m=+1.854506269,LastTimestamp:2026-03-20 06:49:39.874632121 +0000 UTC m=+1.854506269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.952166 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f127316259 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.882181209 +0000 UTC m=+1.862055357,LastTimestamp:2026-03-20 06:49:39.882181209 +0000 UTC m=+1.862055357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.958654 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e79f127506326 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.88421303 +0000 UTC m=+1.864087178,LastTimestamp:2026-03-20 06:49:39.88421303 +0000 UTC m=+1.864087178,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.965495 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f127bee6cb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.891455691 +0000 UTC m=+1.871329839,LastTimestamp:2026-03-20 06:49:39.891455691 +0000 UTC m=+1.871329839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.972575 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f127d4edeb openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.892899307 +0000 UTC m=+1.872773455,LastTimestamp:2026-03-20 06:49:39.892899307 +0000 UTC m=+1.872773455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.979475 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f127f4bb5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.894983517 +0000 UTC m=+1.874857665,LastTimestamp:2026-03-20 06:49:39.894983517 +0000 UTC m=+1.874857665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.986373 4971 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f128008abc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.8957575 +0000 UTC m=+1.875631648,LastTimestamp:2026-03-20 06:49:39.8957575 +0000 UTC m=+1.875631648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc kubenswrapper[4971]: E0320 06:50:22.990736 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e79f1282ddba4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.898727332 +0000 UTC m=+1.878601480,LastTimestamp:2026-03-20 06:49:39.898727332 +0000 UTC m=+1.878601480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:22 crc 
kubenswrapper[4971]: E0320 06:50:22.997177 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f12889a83d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.904743485 +0000 UTC m=+1.884617623,LastTimestamp:2026-03-20 06:49:39.904743485 +0000 UTC m=+1.884617623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.005313 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f129344a34 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.915926068 +0000 UTC m=+1.895800206,LastTimestamp:2026-03-20 06:49:39.915926068 +0000 UTC m=+1.895800206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.014657 4971 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f13ac69b4b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.210727755 +0000 UTC m=+2.190601933,LastTimestamp:2026-03-20 06:49:40.210727755 +0000 UTC m=+2.190601933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.021841 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f13ba886f1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.225533681 +0000 UTC m=+2.205407859,LastTimestamp:2026-03-20 06:49:40.225533681 +0000 UTC m=+2.205407859,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.030247 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f13bc05961 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.227094881 +0000 UTC m=+2.206969059,LastTimestamp:2026-03-20 06:49:40.227094881 +0000 UTC m=+2.206969059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.038008 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f14a969f98 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.476018584 +0000 UTC m=+2.455892722,LastTimestamp:2026-03-20 06:49:40.476018584 +0000 UTC m=+2.455892722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.046378 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f14cc16339 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.512375609 +0000 UTC m=+2.492249777,LastTimestamp:2026-03-20 06:49:40.512375609 +0000 UTC m=+2.492249777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.053113 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f14cd8e5fa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.51391641 +0000 UTC m=+2.493790548,LastTimestamp:2026-03-20 06:49:40.51391641 +0000 UTC m=+2.493790548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.062104 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f15b90141c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.760802332 +0000 UTC m=+2.740676480,LastTimestamp:2026-03-20 06:49:40.760802332 +0000 UTC 
m=+2.740676480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.073015 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f15c3ac08d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.771987597 +0000 UTC m=+2.751861755,LastTimestamp:2026-03-20 06:49:40.771987597 +0000 UTC m=+2.751861755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.078484 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e79f15c6100e6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.774494438 +0000 UTC m=+2.754368586,LastTimestamp:2026-03-20 06:49:40.774494438 +0000 UTC m=+2.754368586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.085158 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f15c6a057e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.775085438 +0000 UTC m=+2.754959576,LastTimestamp:2026-03-20 06:49:40.775085438 +0000 UTC m=+2.754959576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.091238 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f15ceee021 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.783792161 +0000 UTC m=+2.763666309,LastTimestamp:2026-03-20 06:49:40.783792161 +0000 UTC m=+2.763666309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.095126 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f15e1b5a28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.8034842 +0000 UTC m=+2.783358348,LastTimestamp:2026-03-20 06:49:40.8034842 +0000 UTC m=+2.783358348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 
06:50:23.101036 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f16b33eef0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.02319896 +0000 UTC m=+3.003073098,LastTimestamp:2026-03-20 06:49:41.02319896 +0000 UTC m=+3.003073098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.107315 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f16bd0ed27 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.033487655 +0000 UTC m=+3.013361793,LastTimestamp:2026-03-20 06:49:41.033487655 +0000 UTC m=+3.013361793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 
06:50:23.111815 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f16bd5c17b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.033804155 +0000 UTC m=+3.013678293,LastTimestamp:2026-03-20 06:49:41.033804155 +0000 UTC m=+3.013678293,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.118209 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f16c1a78a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.038307497 +0000 UTC m=+3.018181635,LastTimestamp:2026-03-20 06:49:41.038307497 +0000 UTC m=+3.018181635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.123007 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f16c2b3bc1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.039406017 +0000 UTC m=+3.019280155,LastTimestamp:2026-03-20 06:49:41.039406017 +0000 UTC m=+3.019280155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.129249 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e79f16c700edd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.043916509 +0000 UTC m=+3.023790637,LastTimestamp:2026-03-20 06:49:41.043916509 +0000 UTC m=+3.023790637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.135556 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f16c7d5a89 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.044787849 +0000 UTC m=+3.024661987,LastTimestamp:2026-03-20 06:49:41.044787849 +0000 UTC m=+3.024661987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.142091 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f16c8be000 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.04573952 +0000 UTC m=+3.025613658,LastTimestamp:2026-03-20 06:49:41.04573952 +0000 UTC m=+3.025613658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.149468 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e79f16dd94235 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.067588149 +0000 UTC m=+3.047462287,LastTimestamp:2026-03-20 06:49:41.067588149 +0000 UTC m=+3.047462287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.155588 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f17a571ba5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.277162405 +0000 UTC m=+3.257036543,LastTimestamp:2026-03-20 06:49:41.277162405 +0000 UTC m=+3.257036543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.162169 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f17a679256 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.278241366 +0000 UTC m=+3.258115504,LastTimestamp:2026-03-20 06:49:41.278241366 +0000 UTC m=+3.258115504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.166712 4971 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f17b15c937 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.289658679 +0000 UTC m=+3.269532837,LastTimestamp:2026-03-20 06:49:41.289658679 +0000 UTC m=+3.269532837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.173766 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f17b312ce6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.29145367 +0000 UTC m=+3.271327818,LastTimestamp:2026-03-20 06:49:41.29145367 +0000 UTC 
m=+3.271327818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.180583 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f17b4bb5c1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.293192641 +0000 UTC m=+3.273066779,LastTimestamp:2026-03-20 06:49:41.293192641 +0000 UTC m=+3.273066779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.187491 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f17b70c68c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.295621772 +0000 UTC m=+3.275495910,LastTimestamp:2026-03-20 06:49:41.295621772 +0000 UTC m=+3.275495910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.193980 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f188d6f1d7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.520421335 +0000 UTC m=+3.500295463,LastTimestamp:2026-03-20 06:49:41.520421335 +0000 UTC m=+3.500295463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.202829 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f18955cb04 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.528734468 +0000 UTC m=+3.508608606,LastTimestamp:2026-03-20 06:49:41.528734468 +0000 UTC m=+3.508608606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.210646 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f1899840bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.53308998 +0000 UTC m=+3.512964118,LastTimestamp:2026-03-20 06:49:41.53308998 +0000 UTC m=+3.512964118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.219070 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f189c3a4fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.535933691 +0000 UTC m=+3.515807829,LastTimestamp:2026-03-20 06:49:41.535933691 +0000 UTC m=+3.515807829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.226193 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e79f18a0f85e9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.540906473 +0000 UTC m=+3.520780611,LastTimestamp:2026-03-20 06:49:41.540906473 +0000 UTC m=+3.520780611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc 
kubenswrapper[4971]: E0320 06:50:23.232955 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f18b74374f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.564282703 +0000 UTC m=+3.544156841,LastTimestamp:2026-03-20 06:49:41.564282703 +0000 UTC m=+3.544156841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.240196 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f19738eb6c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.761723244 +0000 UTC m=+3.741597382,LastTimestamp:2026-03-20 06:49:41.761723244 +0000 UTC m=+3.741597382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.248828 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f1982c864d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.777688141 +0000 UTC m=+3.757562289,LastTimestamp:2026-03-20 06:49:41.777688141 +0000 UTC m=+3.757562289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.255146 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f198425e24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.779119652 +0000 UTC m=+3.758993790,LastTimestamp:2026-03-20 06:49:41.779119652 +0000 UTC m=+3.758993790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.262083 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1987cf355 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.782958933 +0000 UTC m=+3.762833081,LastTimestamp:2026-03-20 06:49:41.782958933 +0000 UTC m=+3.762833081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.270312 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f1a582993d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:42.001432893 +0000 UTC m=+3.981307071,LastTimestamp:2026-03-20 06:49:42.001432893 +0000 UTC m=+3.981307071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.275184 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1a58d0a7c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:42.002117244 +0000 UTC m=+3.981991382,LastTimestamp:2026-03-20 06:49:42.002117244 +0000 UTC m=+3.981991382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.281561 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f1a62443b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:42.012027828 +0000 UTC m=+3.991901966,LastTimestamp:2026-03-20 06:49:42.012027828 +0000 UTC m=+3.991901966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.290157 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1a690cc21 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:42.019140641 +0000 UTC m=+3.999014779,LastTimestamp:2026-03-20 06:49:42.019140641 +0000 UTC m=+3.999014779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.298359 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1d6a593cc openshift-etcd 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:42.825808844 +0000 UTC m=+4.805682982,LastTimestamp:2026-03-20 06:49:42.825808844 +0000 UTC m=+4.805682982,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.305459 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1e6148e87 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.084740231 +0000 UTC m=+5.064614379,LastTimestamp:2026-03-20 06:49:43.084740231 +0000 UTC m=+5.064614379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.312206 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1e6ea0f9e 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.098732446 +0000 UTC m=+5.078606614,LastTimestamp:2026-03-20 06:49:43.098732446 +0000 UTC m=+5.078606614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.318803 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1e7032a57 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.100377687 +0000 UTC m=+5.080251865,LastTimestamp:2026-03-20 06:49:43.100377687 +0000 UTC m=+5.080251865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.323253 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e79f1f7644ca9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.375178921 +0000 UTC m=+5.355053069,LastTimestamp:2026-03-20 06:49:43.375178921 +0000 UTC m=+5.355053069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.329987 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1f873fac4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.392983748 +0000 UTC m=+5.372857896,LastTimestamp:2026-03-20 06:49:43.392983748 +0000 UTC m=+5.372857896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.336723 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f1f8882423 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.394305059 +0000 UTC m=+5.374179207,LastTimestamp:2026-03-20 06:49:43.394305059 +0000 UTC m=+5.374179207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.343368 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f20756309b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.642689691 +0000 UTC m=+5.622563839,LastTimestamp:2026-03-20 06:49:43.642689691 +0000 UTC m=+5.622563839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.351489 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f2085d788e 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.659944078 +0000 UTC m=+5.639818226,LastTimestamp:2026-03-20 06:49:43.659944078 +0000 UTC m=+5.639818226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.357592 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f2086eb541 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.661073729 +0000 UTC m=+5.640947907,LastTimestamp:2026-03-20 06:49:43.661073729 +0000 UTC m=+5.640947907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.365344 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f21a07cb19 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.956319001 +0000 UTC m=+5.936193139,LastTimestamp:2026-03-20 06:49:43.956319001 +0000 UTC m=+5.936193139,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.372929 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f21afe2839 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.972464697 +0000 UTC m=+5.952338835,LastTimestamp:2026-03-20 06:49:43.972464697 +0000 UTC m=+5.952338835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.378181 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f21b14dfa8 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:43.973953448 +0000 UTC m=+5.953827616,LastTimestamp:2026-03-20 06:49:43.973953448 +0000 UTC m=+5.953827616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.383473 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e79f22aa6a314 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:44.235164436 +0000 UTC m=+6.215038614,LastTimestamp:2026-03-20 06:49:44.235164436 +0000 UTC m=+6.215038614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.390279 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e79f22b7d278c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:44.249223052 +0000 UTC m=+6.229097220,LastTimestamp:2026-03-20 06:49:44.249223052 +0000 UTC m=+6.229097220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.400955 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 06:50:23 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189e79f268d58e48 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 06:50:23 crc kubenswrapper[4971]: body: Mar 20 06:50:23 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:45.278426696 +0000 UTC m=+7.258300884,LastTimestamp:2026-03-20 06:49:45.278426696 +0000 UTC m=+7.258300884,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 20 06:50:23 crc kubenswrapper[4971]: > Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.406828 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f268d950c5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:45.278673093 +0000 UTC m=+7.258547291,LastTimestamp:2026-03-20 06:49:45.278673093 +0000 UTC m=+7.258547291,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.413209 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 06:50:23 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-apiserver-crc.189e79f44441ba72 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 06:50:23 
crc kubenswrapper[4971]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 06:50:23 crc kubenswrapper[4971]: Mar 20 06:50:23 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:53.25469349 +0000 UTC m=+15.234567638,LastTimestamp:2026-03-20 06:49:53.25469349 +0000 UTC m=+15.234567638,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 06:50:23 crc kubenswrapper[4971]: > Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.418145 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f4444278c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:53.254742212 +0000 UTC m=+15.234616350,LastTimestamp:2026-03-20 06:49:53.254742212 +0000 UTC m=+15.234616350,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.423862 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e79f44441ba72\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event=< Mar 20 06:50:23 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-apiserver-crc.189e79f44441ba72 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 06:50:23 crc kubenswrapper[4971]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 06:50:23 crc kubenswrapper[4971]: Mar 20 06:50:23 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:53.25469349 +0000 UTC m=+15.234567638,LastTimestamp:2026-03-20 06:49:53.262472505 +0000 UTC m=+15.242346643,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 06:50:23 crc kubenswrapper[4971]: > Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.429038 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e79f4444278c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f4444278c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:53.254742212 +0000 UTC m=+15.234616350,LastTimestamp:2026-03-20 06:49:53.262509776 +0000 UTC m=+15.242383914,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.438937 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e79f198425e24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f198425e24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:41.779119652 +0000 UTC m=+3.758993790,LastTimestamp:2026-03-20 06:49:53.893615399 +0000 UTC m=+15.873489537,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.442887 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e79f1a582993d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f1a582993d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:42.001432893 +0000 UTC m=+3.981307071,LastTimestamp:2026-03-20 06:49:54.132278305 +0000 UTC m=+16.112152443,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.446795 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e79f1a62443b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e79f1a62443b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:42.012027828 +0000 UTC m=+3.991901966,LastTimestamp:2026-03-20 06:49:54.146403397 +0000 UTC m=+16.126277535,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.451190 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 20 06:50:23 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189e79f4bcd76d3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 06:50:23 crc kubenswrapper[4971]: body: Mar 20 06:50:23 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:55.277770046 +0000 UTC m=+17.257644224,LastTimestamp:2026-03-20 06:49:55.277770046 +0000 UTC m=+17.257644224,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 06:50:23 crc kubenswrapper[4971]: > Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.455717 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f4bcd8a232 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:55.277849138 +0000 UTC m=+17.257723316,LastTimestamp:2026-03-20 06:49:55.277849138 +0000 UTC m=+17.257723316,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.460736 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e79f4bcd76d3e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 06:50:23 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189e79f4bcd76d3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 06:50:23 crc kubenswrapper[4971]: body: Mar 20 06:50:23 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:55.277770046 +0000 UTC m=+17.257644224,LastTimestamp:2026-03-20 06:50:05.278899602 +0000 UTC m=+27.258773820,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 06:50:23 crc kubenswrapper[4971]: > Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.464393 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e79f4bcd8a232\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f4bcd8a232 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:55.277849138 +0000 UTC m=+17.257723316,LastTimestamp:2026-03-20 06:50:05.279013455 +0000 UTC m=+27.258887633,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.468536 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f7112cac79 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:50:05.282577529 +0000 UTC m=+27.262451747,LastTimestamp:2026-03-20 
06:50:05.282577529 +0000 UTC m=+27.262451747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.473886 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e79f127f4bb5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f127f4bb5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:39.894983517 +0000 UTC m=+1.874857665,LastTimestamp:2026-03-20 06:50:05.400194857 +0000 UTC m=+27.380069025,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.478812 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e79f13ac69b4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f13ac69b4b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.210727755 +0000 UTC m=+2.190601933,LastTimestamp:2026-03-20 06:50:05.63646811 +0000 UTC m=+27.616342288,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.484491 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e79f13ba886f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f13ba886f1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:40.225533681 +0000 UTC m=+2.205407859,LastTimestamp:2026-03-20 06:50:05.650034828 +0000 UTC m=+27.629908976,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.489629 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e79f4bcd76d3e\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 06:50:23 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189e79f4bcd76d3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 06:50:23 crc kubenswrapper[4971]: body: Mar 20 06:50:23 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:55.277770046 +0000 UTC m=+17.257644224,LastTimestamp:2026-03-20 06:50:15.27824334 +0000 UTC m=+37.258117508,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 06:50:23 crc kubenswrapper[4971]: > Mar 20 06:50:23 crc kubenswrapper[4971]: E0320 06:50:23.495872 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e79f4bcd8a232\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e79f4bcd8a232 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:55.277849138 +0000 UTC m=+17.257723316,LastTimestamp:2026-03-20 06:50:15.278328882 +0000 UTC m=+37.258203050,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:50:23 crc kubenswrapper[4971]: I0320 06:50:23.651140 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:24 crc kubenswrapper[4971]: I0320 06:50:24.652038 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:25 crc kubenswrapper[4971]: I0320 06:50:25.278014 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 06:50:25 crc kubenswrapper[4971]: I0320 06:50:25.278183 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 06:50:25 crc kubenswrapper[4971]: 
E0320 06:50:25.284962 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e79f4bcd76d3e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 06:50:25 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189e79f4bcd76d3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 06:50:25 crc kubenswrapper[4971]: body: Mar 20 06:50:25 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:49:55.277770046 +0000 UTC m=+17.257644224,LastTimestamp:2026-03-20 06:50:25.278139402 +0000 UTC m=+47.258013580,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 06:50:25 crc kubenswrapper[4971]: > Mar 20 06:50:25 crc kubenswrapper[4971]: W0320 06:50:25.458986 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 06:50:25 crc kubenswrapper[4971]: E0320 06:50:25.459075 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Mar 20 06:50:25 crc kubenswrapper[4971]: I0320 06:50:25.651808 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:26 crc kubenswrapper[4971]: I0320 06:50:26.384261 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 06:50:26 crc kubenswrapper[4971]: I0320 06:50:26.384470 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:26 crc kubenswrapper[4971]: I0320 06:50:26.386995 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:26 crc kubenswrapper[4971]: I0320 06:50:26.387088 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:26 crc kubenswrapper[4971]: I0320 06:50:26.387109 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:26 crc kubenswrapper[4971]: I0320 06:50:26.650588 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:27 crc kubenswrapper[4971]: W0320 06:50:27.602959 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:27 crc kubenswrapper[4971]: E0320 06:50:27.603053 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: 
failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 06:50:27 crc kubenswrapper[4971]: I0320 06:50:27.652930 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:27 crc kubenswrapper[4971]: I0320 06:50:27.683067 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:27 crc kubenswrapper[4971]: I0320 06:50:27.685150 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:27 crc kubenswrapper[4971]: I0320 06:50:27.685231 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:27 crc kubenswrapper[4971]: I0320 06:50:27.685260 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:27 crc kubenswrapper[4971]: I0320 06:50:27.685308 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 06:50:27 crc kubenswrapper[4971]: E0320 06:50:27.691495 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 06:50:27 crc kubenswrapper[4971]: E0320 06:50:27.692087 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 06:50:28 crc kubenswrapper[4971]: W0320 06:50:28.391388 
4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 06:50:28 crc kubenswrapper[4971]: E0320 06:50:28.391456 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 06:50:28 crc kubenswrapper[4971]: I0320 06:50:28.652224 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:28 crc kubenswrapper[4971]: E0320 06:50:28.815189 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 06:50:29 crc kubenswrapper[4971]: I0320 06:50:29.654070 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:30 crc kubenswrapper[4971]: I0320 06:50:30.652285 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:31 crc kubenswrapper[4971]: I0320 06:50:31.650379 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:32 crc kubenswrapper[4971]: W0320 06:50:32.242381 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 06:50:32 crc kubenswrapper[4971]: E0320 06:50:32.242443 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 06:50:32 crc kubenswrapper[4971]: I0320 06:50:32.280771 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:50:32 crc kubenswrapper[4971]: I0320 06:50:32.280947 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:32 crc kubenswrapper[4971]: I0320 06:50:32.282216 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:32 crc kubenswrapper[4971]: I0320 06:50:32.282265 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:32 crc kubenswrapper[4971]: I0320 06:50:32.282277 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:32 crc kubenswrapper[4971]: I0320 06:50:32.287938 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:50:32 crc kubenswrapper[4971]: I0320 06:50:32.650833 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.050311 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.052175 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.052222 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.052234 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.650040 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.731428 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.732964 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.733034 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.733061 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:33 crc kubenswrapper[4971]: I0320 06:50:33.733978 4971 scope.go:117] "RemoveContainer" containerID="bfc43ecf5399618bd0f8a950fd2756f692cdf577f176e76079f6372c5baebb9d" Mar 20 06:50:33 crc kubenswrapper[4971]: E0320 06:50:33.734298 4971 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:50:34 crc kubenswrapper[4971]: I0320 06:50:34.652969 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:34 crc kubenswrapper[4971]: I0320 06:50:34.691788 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:34 crc kubenswrapper[4971]: I0320 06:50:34.693566 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:34 crc kubenswrapper[4971]: I0320 06:50:34.693677 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:34 crc kubenswrapper[4971]: I0320 06:50:34.693711 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:34 crc kubenswrapper[4971]: I0320 06:50:34.693759 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 06:50:34 crc kubenswrapper[4971]: E0320 06:50:34.698114 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 06:50:34 crc kubenswrapper[4971]: E0320 06:50:34.698957 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is 
forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 06:50:35 crc kubenswrapper[4971]: I0320 06:50:35.650124 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:36 crc kubenswrapper[4971]: I0320 06:50:36.652844 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:37 crc kubenswrapper[4971]: I0320 06:50:37.652179 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:38 crc kubenswrapper[4971]: I0320 06:50:38.654370 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:38 crc kubenswrapper[4971]: E0320 06:50:38.815405 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 06:50:39 crc kubenswrapper[4971]: I0320 06:50:39.651138 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:40 crc kubenswrapper[4971]: I0320 06:50:40.651284 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:41 crc kubenswrapper[4971]: I0320 06:50:41.653320 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:41 crc kubenswrapper[4971]: I0320 06:50:41.699167 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:41 crc kubenswrapper[4971]: I0320 06:50:41.700391 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:41 crc kubenswrapper[4971]: I0320 06:50:41.700450 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:41 crc kubenswrapper[4971]: I0320 06:50:41.700467 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:41 crc kubenswrapper[4971]: I0320 06:50:41.700501 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 06:50:41 crc kubenswrapper[4971]: E0320 06:50:41.706448 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 06:50:41 crc kubenswrapper[4971]: E0320 06:50:41.707093 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 06:50:42 crc kubenswrapper[4971]: I0320 06:50:42.649918 4971 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:43 crc kubenswrapper[4971]: I0320 06:50:43.653286 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:44 crc kubenswrapper[4971]: I0320 06:50:44.650708 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:45 crc kubenswrapper[4971]: I0320 06:50:45.651122 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:46 crc kubenswrapper[4971]: I0320 06:50:46.649852 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:47 crc kubenswrapper[4971]: I0320 06:50:47.650024 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.650077 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope 
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.706533 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.711465 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.711509 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.711521 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.711549 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 06:50:48 crc kubenswrapper[4971]: E0320 06:50:48.714078 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 06:50:48 crc kubenswrapper[4971]: E0320 06:50:48.714190 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.731820 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.732635 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.732658 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.732667 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:48 crc kubenswrapper[4971]: I0320 06:50:48.733155 4971 scope.go:117] "RemoveContainer" containerID="bfc43ecf5399618bd0f8a950fd2756f692cdf577f176e76079f6372c5baebb9d"
Mar 20 06:50:48 crc kubenswrapper[4971]: E0320 06:50:48.815676 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 06:50:49 crc kubenswrapper[4971]: I0320 06:50:49.099685 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 06:50:49 crc kubenswrapper[4971]: I0320 06:50:49.102981 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706"}
Mar 20 06:50:49 crc kubenswrapper[4971]: I0320 06:50:49.103185 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:49 crc kubenswrapper[4971]: I0320 06:50:49.104857 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:49 crc kubenswrapper[4971]: I0320 06:50:49.104918 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:49 crc kubenswrapper[4971]: I0320 06:50:49.104936 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:49 crc kubenswrapper[4971]: I0320 06:50:49.652467 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.107700 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.108477 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.111294 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706" exitCode=255
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.111366 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706"}
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.111429 4971 scope.go:117] "RemoveContainer" containerID="bfc43ecf5399618bd0f8a950fd2756f692cdf577f176e76079f6372c5baebb9d"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.111655 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.112564 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.112621 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.112635 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.113235 4971 scope.go:117] "RemoveContainer" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706"
Mar 20 06:50:50 crc kubenswrapper[4971]: E0320 06:50:50.113420 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.142728 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.146388 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.163360 4971 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 06:50:50 crc kubenswrapper[4971]: I0320 06:50:50.657079 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 06:50:51 crc kubenswrapper[4971]: I0320 06:50:51.120180 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 06:50:51 crc kubenswrapper[4971]: I0320 06:50:51.122441 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:51 crc kubenswrapper[4971]: I0320 06:50:51.123931 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:51 crc kubenswrapper[4971]: I0320 06:50:51.123963 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:51 crc kubenswrapper[4971]: I0320 06:50:51.123973 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:51 crc kubenswrapper[4971]: I0320 06:50:51.124428 4971 scope.go:117] "RemoveContainer" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706"
Mar 20 06:50:51 crc kubenswrapper[4971]: E0320 06:50:51.124573 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 06:50:51 crc kubenswrapper[4971]: I0320 06:50:51.660258 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 06:50:52 crc kubenswrapper[4971]: I0320 06:50:52.653655 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 06:50:53 crc kubenswrapper[4971]: I0320 06:50:53.652158 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 06:50:54 crc kubenswrapper[4971]: I0320 06:50:54.246784 4971 csr.go:261] certificate signing request csr-l28ph is approved, waiting to be issued
Mar 20 06:50:54 crc kubenswrapper[4971]: I0320 06:50:54.256220 4971 csr.go:257] certificate signing request csr-l28ph is issued
Mar 20 06:50:54 crc kubenswrapper[4971]: I0320 06:50:54.342881 4971 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 20 06:50:54 crc kubenswrapper[4971]: I0320 06:50:54.492368 4971 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.258289 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-31 11:27:50.744123032 +0000 UTC
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.258391 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6868h36m55.485738832s for next certificate rotation
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.467416 4971 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.715077 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.716978 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.717044 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.717063 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.717202 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.729220 4971 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.729528 4971 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 20 06:50:55 crc kubenswrapper[4971]: E0320 06:50:55.729556 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.733352 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.733509 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.733534 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.733557 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.733575 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:50:55Z","lastTransitionTime":"2026-03-20T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:50:55 crc kubenswrapper[4971]: E0320 06:50:55.755211 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.759769 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.759824 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.759844 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.759866 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.759883 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:50:55Z","lastTransitionTime":"2026-03-20T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:50:55 crc kubenswrapper[4971]: E0320 06:50:55.773386 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.778651 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.778705 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.778724 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.778767 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.778786 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:50:55Z","lastTransitionTime":"2026-03-20T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:50:55 crc kubenswrapper[4971]: E0320 06:50:55.801275 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.808132 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.808216 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.808234 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.808264 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:50:55 crc kubenswrapper[4971]: I0320 06:50:55.808283 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:50:55Z","lastTransitionTime":"2026-03-20T06:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:50:55 crc kubenswrapper[4971]: E0320 06:50:55.826380 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:50:55 crc kubenswrapper[4971]: E0320 06:50:55.826656 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:50:55 crc kubenswrapper[4971]: E0320 06:50:55.826703 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:55 crc kubenswrapper[4971]: E0320 06:50:55.927303 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.027919 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.128949 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.229551 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.329990 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.431108 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.532191 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.633173 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.733986 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.835160 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:56 crc kubenswrapper[4971]: E0320 06:50:56.935661 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.035860 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.136260 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.236679 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.337245 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.438306 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.538681 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.639670 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.740138 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc kubenswrapper[4971]: E0320 06:50:57.840547 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:57 crc 
kubenswrapper[4971]: E0320 06:50:57.940745 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.040997 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.141177 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.241972 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.343085 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.443915 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.545023 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: I0320 06:50:58.580915 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:50:58 crc kubenswrapper[4971]: I0320 06:50:58.581189 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:50:58 crc kubenswrapper[4971]: I0320 06:50:58.583274 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:50:58 crc kubenswrapper[4971]: I0320 06:50:58.583330 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:50:58 crc kubenswrapper[4971]: I0320 06:50:58.583348 4971 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 06:50:58 crc kubenswrapper[4971]: I0320 06:50:58.584492 4971 scope.go:117] "RemoveContainer" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.584804 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.645568 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.745956 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.816576 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.847034 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:58 crc kubenswrapper[4971]: E0320 06:50:58.947637 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.048300 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.148952 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.249641 4971 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.349943 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.450657 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.551590 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.651961 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.752282 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.853276 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:50:59 crc kubenswrapper[4971]: E0320 06:50:59.954123 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.054453 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.154674 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.255220 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.355397 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 
06:51:00.455481 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.556017 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.656875 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.757088 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.857875 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:00 crc kubenswrapper[4971]: E0320 06:51:00.958238 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.059313 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.159809 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.260270 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.360982 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.461311 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.561886 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.662706 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.763708 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.864424 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:01 crc kubenswrapper[4971]: E0320 06:51:01.964703 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.065828 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.166546 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.266847 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.367923 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.468743 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.569442 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.670152 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.771303 4971 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.872552 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:02 crc kubenswrapper[4971]: E0320 06:51:02.973201 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.073822 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.174868 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.275242 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.375945 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.476497 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.577209 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.677601 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.778264 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.878384 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:03 crc kubenswrapper[4971]: E0320 06:51:03.979514 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.080015 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.180899 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.281247 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.382357 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.482863 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.583256 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.683461 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.784591 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.885166 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:04 crc kubenswrapper[4971]: E0320 06:51:04.986115 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.086526 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc 
kubenswrapper[4971]: E0320 06:51:05.186934 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.287594 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.387807 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.488932 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.589881 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.690376 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: I0320 06:51:05.731843 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:51:05 crc kubenswrapper[4971]: I0320 06:51:05.733685 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:05 crc kubenswrapper[4971]: I0320 06:51:05.733785 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:05 crc kubenswrapper[4971]: I0320 06:51:05.733808 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.791142 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.891661 4971 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 06:51:05 crc kubenswrapper[4971]: E0320 06:51:05.992652 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.037458 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.042425 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.042475 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.042494 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.042517 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.042536 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:06Z","lastTransitionTime":"2026-03-20T06:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.059184 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.064665 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.064735 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.064760 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.064792 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.064816 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:06Z","lastTransitionTime":"2026-03-20T06:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.080825 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.085597 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.085703 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.085723 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.085750 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.085770 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:06Z","lastTransitionTime":"2026-03-20T06:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.104507 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.110008 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.110068 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.110087 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.110115 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:06 crc kubenswrapper[4971]: I0320 06:51:06.110135 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:06Z","lastTransitionTime":"2026-03-20T06:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.132167 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.132289 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.132316 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.233093 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.334042 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.434760 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.535716 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.636495 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.737482 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 
06:51:06.838637 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:06 crc kubenswrapper[4971]: E0320 06:51:06.939787 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.040525 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.140895 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.241413 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.342645 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.443022 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.543732 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.644268 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.745535 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.845982 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:07 crc kubenswrapper[4971]: E0320 06:51:07.946531 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.047159 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.147699 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.248249 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.349077 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.449989 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.550502 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.651097 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.751965 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.817347 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.852348 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:08 crc kubenswrapper[4971]: E0320 06:51:08.953090 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.054240 4971 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.155242 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.256063 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.357113 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.457234 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.557808 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.658148 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.759105 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.859656 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:09 crc kubenswrapper[4971]: E0320 06:51:09.960752 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.061888 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.162417 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.263070 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.363839 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.464766 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.565908 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.666488 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.766672 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: I0320 06:51:10.807724 4971 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.867536 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:10 crc kubenswrapper[4971]: E0320 06:51:10.967727 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.068342 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.169103 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.269810 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc 
kubenswrapper[4971]: E0320 06:51:11.370802 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.471761 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.572071 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.672921 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.774106 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.874233 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:11 crc kubenswrapper[4971]: E0320 06:51:11.974467 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.075081 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.175973 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.276096 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.376524 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.476846 4971 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.577942 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.678728 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.779805 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.880975 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:12 crc kubenswrapper[4971]: E0320 06:51:12.981589 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.082352 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.182498 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.283644 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.384701 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.485120 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.585818 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.686238 4971 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: I0320 06:51:13.731417 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 06:51:13 crc kubenswrapper[4971]: I0320 06:51:13.733063 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:13 crc kubenswrapper[4971]: I0320 06:51:13.733135 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:13 crc kubenswrapper[4971]: I0320 06:51:13.733161 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:13 crc kubenswrapper[4971]: I0320 06:51:13.734474 4971 scope.go:117] "RemoveContainer" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.734848 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.787009 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.887970 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:13 crc kubenswrapper[4971]: E0320 06:51:13.989175 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.089551 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.190749 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.291491 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.392696 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.492943 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.593861 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.694626 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.795598 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: I0320 06:51:14.828102 4971 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.896480 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:14 crc kubenswrapper[4971]: E0320 06:51:14.996881 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc kubenswrapper[4971]: E0320 06:51:15.097244 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc 
kubenswrapper[4971]: E0320 06:51:15.197569 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc kubenswrapper[4971]: E0320 06:51:15.298700 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc kubenswrapper[4971]: E0320 06:51:15.399484 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc kubenswrapper[4971]: E0320 06:51:15.500387 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc kubenswrapper[4971]: E0320 06:51:15.601191 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc kubenswrapper[4971]: E0320 06:51:15.701947 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc kubenswrapper[4971]: E0320 06:51:15.802995 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:15 crc kubenswrapper[4971]: E0320 06:51:15.904113 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.004186 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.105149 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.206235 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.238375 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.244086 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.244141 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.244161 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.244196 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.244246 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:16Z","lastTransitionTime":"2026-03-20T06:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.259537 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.264352 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.264400 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.264413 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.264432 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.264450 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:16Z","lastTransitionTime":"2026-03-20T06:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.278933 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.282488 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.282553 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.282581 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.282667 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.282704 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:16Z","lastTransitionTime":"2026-03-20T06:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.306778 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.313039 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.313106 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.313125 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.313155 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:16 crc kubenswrapper[4971]: I0320 06:51:16.313175 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:16Z","lastTransitionTime":"2026-03-20T06:51:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.327423 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.327543 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.327572 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.428008 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.528918 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.630062 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.730809 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.831731 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:16 crc kubenswrapper[4971]: E0320 06:51:16.932650 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.033661 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.134091 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.235226 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.336053 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.447932 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.548869 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.557336 4971 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.651801 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.651889 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.651916 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.651952 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.651978 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:17Z","lastTransitionTime":"2026-03-20T06:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.685260 4971 apiserver.go:52] "Watching apiserver" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.690587 4971 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.690979 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.691741 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.691752 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.691583 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.691872 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.692069 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.692088 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.692359 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.692460 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.692828 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.697583 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.697705 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.697830 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.697917 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.699411 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.699976 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.700022 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.700030 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.701993 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.738251 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.752692 4971 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.754848 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.754993 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.755037 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.755057 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.755084 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.755105 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:17Z","lastTransitionTime":"2026-03-20T06:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.771594 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.785714 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.797249 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803140 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803207 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803245 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803291 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803328 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803364 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803400 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803433 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803475 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803507 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803541 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803585 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803630 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803669 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803723 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803766 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803797 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803828 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 06:51:17 crc 
kubenswrapper[4971]: I0320 06:51:17.803894 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803935 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.803969 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804004 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804038 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804072 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804104 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804136 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804145 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804169 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804478 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804524 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804562 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804598 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804667 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804702 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804734 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804763 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804796 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804875 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: 
"c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.804871 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805018 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805054 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805097 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805135 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805211 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805262 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805309 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805351 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805395 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805436 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805456 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805470 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805505 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805538 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805570 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805634 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805668 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805703 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805730 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805735 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805788 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805820 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805851 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805884 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805919 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.805950 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806023 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806033 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806101 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806137 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806169 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806198 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806227 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806254 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806283 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806308 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806337 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806364 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 06:51:17 crc 
kubenswrapper[4971]: I0320 06:51:17.806392 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806420 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806448 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806479 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806555 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806590 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806648 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806675 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806705 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806731 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806754 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806777 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806801 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806824 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806849 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806872 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806901 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806925 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806949 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806985 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807010 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807036 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807062 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807088 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807119 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807143 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807168 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807193 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 06:51:17 
crc kubenswrapper[4971]: I0320 06:51:17.807219 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807246 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807272 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807299 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807324 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807350 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807373 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807398 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807426 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807452 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807477 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807502 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807530 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807555 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807582 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807625 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807651 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807675 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807698 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807722 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807747 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807808 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 
20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807835 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807884 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807911 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807937 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807961 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807987 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808017 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808042 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808068 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808094 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808119 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 06:51:17 crc 
kubenswrapper[4971]: I0320 06:51:17.808145 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808170 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808195 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808220 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808243 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808266 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808290 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808315 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808340 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808367 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808392 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: 
I0320 06:51:17.808420 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808451 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808477 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808502 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808527 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808552 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808578 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808617 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808645 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808676 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808701 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 
20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808733 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808758 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808784 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808811 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808838 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808866 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808895 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808922 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808948 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808975 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809006 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809032 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809057 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809086 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809112 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809138 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809164 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809191 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809217 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809243 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809270 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809301 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809329 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809356 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809383 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809468 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809496 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809521 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809549 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809577 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809624 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809654 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809700 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809727 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809753 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809781 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809808 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809838 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809868 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809895 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809922 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809948 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.811013 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812181 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812313 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812357 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812389 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812416 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812447 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812478 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812506 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812534 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812561 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 
06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812595 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812640 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812682 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812841 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812858 4971 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812875 4971 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812890 4971 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812905 4971 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812920 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812934 4971 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.816702 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.806941 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807470 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.807752 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.808549 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809292 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809572 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.809992 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.810055 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.810523 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.811309 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.811370 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.811415 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.811835 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.811950 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812061 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.812280 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.813046 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.813471 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.813663 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.814209 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.814726 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.814934 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.815003 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.815053 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.815105 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.815715 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.816042 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.816291 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.816355 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.816410 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.816504 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.816552 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.816965 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.817006 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.817409 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.817665 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.818451 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.818644 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.818929 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.848879 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.848896 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.821224 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.821437 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.848980 4971 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.866579 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.821486 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.828262 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.821730 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.828365 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.828598 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.828916 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.828936 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.829315 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.829659 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.829965 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.830443 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.831078 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.831367 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.831458 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.831529 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.832936 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.833518 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.833753 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.834517 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.834751 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.834815 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:51:18.334768569 +0000 UTC m=+100.314642737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.835189 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.835356 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.835425 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.835448 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.835775 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.836178 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.836217 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.836630 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.836687 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.836821 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.837333 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.837769 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.838460 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.839252 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.839434 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.839797 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.840061 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.840457 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.840473 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.841134 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.841655 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.841808 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.843004 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.843059 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.843168 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.843492 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.867542 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.843760 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.844266 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.844290 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.844384 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.844498 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.844679 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.844700 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.845046 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.845099 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.845311 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.845541 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.845940 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.867675 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.846215 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.846581 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.846653 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.846673 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.846761 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.841493 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.846875 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.847000 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.847049 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.847043 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.847365 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.847387 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.847504 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.847856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.847880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.848234 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.848238 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.848526 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.820521 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849028 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849114 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849081 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849178 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849312 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849375 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849390 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849435 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.849591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.850110 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.850154 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.850151 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.850307 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.850500 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.850573 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.850893 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.851037 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.867962 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:18.367900571 +0000 UTC m=+100.347774719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.850943 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.851236 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.851252 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.851439 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.851822 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.851853 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.852264 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.852295 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.852466 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.852509 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.851113 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.866455 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.866710 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.867107 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.867108 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.867579 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.868102 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.868283 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.868327 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.868677 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.869115 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.869523 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.869782 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.869848 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.869991 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870194 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870253 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870380 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870416 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870575 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870658 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870789 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870843 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.871081 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.871138 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:18.371113526 +0000 UTC m=+100.350987804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.871446 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.872045 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.870314 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.872218 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.872427 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.872667 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.873522 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.873583 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.873887 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.874572 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.876025 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.877181 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.877222 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.877248 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.877349 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:18.377317989 +0000 UTC m=+100.357192177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.877522 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.878293 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.878314 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.878545 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.878771 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.884959 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.884993 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.885005 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.885024 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.885036 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:17Z","lastTransitionTime":"2026-03-20T06:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.889505 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.890072 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.890254 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.890397 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:17 crc kubenswrapper[4971]: E0320 06:51:17.890734 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:18.390581429 +0000 UTC m=+100.370455757 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.894659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.899323 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.903298 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.904469 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.914494 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.914677 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.914966 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.915292 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.915049 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.915324 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.915858 4971 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.916033 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.916179 4971 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.916359 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.916534 4971 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.916814 4971 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.917012 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.917189 4971 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.917394 4971 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.917677 4971 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.917874 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.918010 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node 
\"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.918130 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.918245 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.918371 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.918502 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.918800 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.918954 4971 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.919075 4971 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.919192 4971 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.919312 4971 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.919432 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.919569 4971 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.919795 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.919926 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.920075 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.920206 4971 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.920346 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.920473 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.920653 4971 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.920778 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.920905 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.921040 4971 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.921166 4971 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.921298 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.921452 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.921645 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.921788 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.921938 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.922081 4971 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.922212 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.922347 4971 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.922478 4971 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.922706 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.922902 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.923051 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.923186 4971 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.923323 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.923462 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.923672 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.923837 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.924043 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.924191 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.924323 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.924456 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.924690 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.924874 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.925163 4971 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.925327 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.925518 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.925706 4971 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.925868 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926018 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926149 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926284 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926426 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.921176 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926602 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.922992 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926725 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926747 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926765 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926782 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926797 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926814 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926836 4971 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926855 4971 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926870 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926885 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926899 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926913 4971 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926927 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926941 4971 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926956 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926970 4971 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926983 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.926997 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927011 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927024 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927037 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927051 4971 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927065 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927078 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927091 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927104 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927119 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927134 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927150 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927165 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927180 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927199 4971 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927212 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927225 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927241 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927255 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927268 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927282 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927295 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927309 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927323 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927336 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927350 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927364 4971 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927377 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927390 4971 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927404 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927416 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927429 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927442 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927456 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927468 4971 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927481 4971 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927495 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927510 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927523 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927539 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927551 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927567 4971 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927582 4971 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927596 4971 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927694 4971 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927708 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927724 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927738 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927753 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927766 4971 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927779 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927793 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927807 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927820 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927833 4971 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927845 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927861 4971 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927874 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927887 4971 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927900 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927914 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927928 4971 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927943 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927956 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927969 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927983 4971 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.927996 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928062 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928076 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928090 4971 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928104 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928116 4971 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928129 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928142 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928155 4971 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928168 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928182 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928195 4971 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928207 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928222 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928236 4971 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928249 4971 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928263 4971 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928275 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928288 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928300 4971 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928313 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928326 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928341 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928354 4971 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName:
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928367 4971 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928379 4971 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928392 4971 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928408 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928420 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928433 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928447 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 
06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928460 4971 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.928473 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.988504 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.988573 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.988586 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.988661 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:17 crc kubenswrapper[4971]: I0320 06:51:17.988681 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:17Z","lastTransitionTime":"2026-03-20T06:51:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.029760 4971 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.029800 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.029820 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.034033 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.047807 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.064172 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 06:51:18 crc kubenswrapper[4971]: W0320 06:51:18.065940 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-ebb0619e51323be98ac70c50a9793b5cde116940282c597a0bedbc8bc721ac5e WatchSource:0}: Error finding container ebb0619e51323be98ac70c50a9793b5cde116940282c597a0bedbc8bc721ac5e: Status 404 returned error can't find the container with id ebb0619e51323be98ac70c50a9793b5cde116940282c597a0bedbc8bc721ac5e Mar 20 06:51:18 crc kubenswrapper[4971]: W0320 06:51:18.086264 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-47318d91f760c3bcb0bcbc01f686f83b83a4b2ca1d37202b59ced0a9db7f9eef WatchSource:0}: Error finding container 47318d91f760c3bcb0bcbc01f686f83b83a4b2ca1d37202b59ced0a9db7f9eef: Status 404 returned error can't find the container with id 47318d91f760c3bcb0bcbc01f686f83b83a4b2ca1d37202b59ced0a9db7f9eef Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.090647 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.090698 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.090711 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.090731 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.090747 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.194393 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.194451 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.194469 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.194493 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.194514 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.199488 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ebb0619e51323be98ac70c50a9793b5cde116940282c597a0bedbc8bc721ac5e"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.201419 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"778db7b39603ad8898f04a9b0bf4f29ebf26b9796a2a6888e80816561bd811b4"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.204504 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"47318d91f760c3bcb0bcbc01f686f83b83a4b2ca1d37202b59ced0a9db7f9eef"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.298113 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.298166 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.298184 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.298210 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.298229 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.403210 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.403279 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.403298 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.403330 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.403352 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.433389 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.433554 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:51:19.433515648 +0000 UTC m=+101.413389826 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.433664 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.433718 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.433764 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.433817 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.433980 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.433994 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434077 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:19.434055642 +0000 UTC m=+101.413929830 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434113 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434171 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434187 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434142 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:19.434123684 +0000 UTC m=+101.413997862 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434278 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:19.434255178 +0000 UTC m=+101.414129316 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434007 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434523 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434625 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 20 06:51:18 crc kubenswrapper[4971]: E0320 06:51:18.434744 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:19.43472968 +0000 UTC m=+101.414603828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.507012 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.507094 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.507120 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.507155 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.507185 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.612598 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.614726 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.614767 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.614795 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.614815 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.718368 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.718430 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.718446 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.718472 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.718487 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.738460 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.739537 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.741920 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.743767 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.745106 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.746208 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.747358 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.748740 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.750960 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.752247 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.753145 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.754202 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.754931 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.756033 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.756596 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.757943 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.758536 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.759430 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.760424 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.760955 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.761479 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.762088 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.763048 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.763702 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.764759 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.765504 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.766478 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.767170 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.768456 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.769060 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.769770 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.770646 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.771155 4971 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.771259 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.773503 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.774012 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.774412 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.776762 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.778460 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.778543 4971 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.779723 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.783086 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.784936 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.785714 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.786540 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.787988 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.789423 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.790239 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.791739 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.791942 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.793302 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 06:51:18 crc 
kubenswrapper[4971]: I0320 06:51:18.795129 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.796308 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.797456 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.798575 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.799884 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.801643 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.802845 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.811260 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.831035 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.831112 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.831135 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.831180 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.831237 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.833268 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.934592 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.934696 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.934719 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.934747 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:18 crc kubenswrapper[4971]: I0320 06:51:18.934768 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:18Z","lastTransitionTime":"2026-03-20T06:51:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.037839 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.037910 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.037930 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.037960 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.037981 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.142328 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.142408 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.142432 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.142465 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.142487 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.210542 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.215181 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.215225 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.234111 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.246134 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.246230 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.246252 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc 
kubenswrapper[4971]: I0320 06:51:19.246320 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.246338 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.259350 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.282253 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.304549 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.321078 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.337485 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.349065 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.349134 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.349154 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.349183 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.349203 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.357777 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.375557 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.393154 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.410193 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.424191 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jl74k"] Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.424627 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jl74k" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.427099 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.427267 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.427382 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.429960 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.442447 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.442546 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442593 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:51:21.442571825 +0000 UTC m=+103.422445963 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.442636 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.442673 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.442699 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442706 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 
06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442736 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442756 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442789 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442803 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442814 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442822 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:21.442807812 +0000 UTC m=+103.422681960 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442845 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:21.442836082 +0000 UTC m=+103.422710220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442868 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442893 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.442979 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 06:51:21.442953485 +0000 UTC m=+103.422827643 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.443014 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:21.443000177 +0000 UTC m=+103.422874405 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.447802 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.451702 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.451750 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc 
kubenswrapper[4971]: I0320 06:51:19.451764 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.451789 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.451804 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.468168 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.484472 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.501070 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.519990 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.540998 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.543916 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmftj\" (UniqueName: \"kubernetes.io/projected/3068ae0f-0404-452c-b0e0-c422d898e6b1-kube-api-access-pmftj\") pod \"node-resolver-jl74k\" (UID: \"3068ae0f-0404-452c-b0e0-c422d898e6b1\") " pod="openshift-dns/node-resolver-jl74k" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.544063 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3068ae0f-0404-452c-b0e0-c422d898e6b1-hosts-file\") pod \"node-resolver-jl74k\" (UID: \"3068ae0f-0404-452c-b0e0-c422d898e6b1\") " pod="openshift-dns/node-resolver-jl74k" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.555350 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.555406 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.555422 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.555449 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.555466 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.559932 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.579786 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.645454 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmftj\" (UniqueName: 
\"kubernetes.io/projected/3068ae0f-0404-452c-b0e0-c422d898e6b1-kube-api-access-pmftj\") pod \"node-resolver-jl74k\" (UID: \"3068ae0f-0404-452c-b0e0-c422d898e6b1\") " pod="openshift-dns/node-resolver-jl74k" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.645525 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3068ae0f-0404-452c-b0e0-c422d898e6b1-hosts-file\") pod \"node-resolver-jl74k\" (UID: \"3068ae0f-0404-452c-b0e0-c422d898e6b1\") " pod="openshift-dns/node-resolver-jl74k" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.645631 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3068ae0f-0404-452c-b0e0-c422d898e6b1-hosts-file\") pod \"node-resolver-jl74k\" (UID: \"3068ae0f-0404-452c-b0e0-c422d898e6b1\") " pod="openshift-dns/node-resolver-jl74k" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.658114 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.658190 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.658203 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.658240 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.658269 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.679198 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmftj\" (UniqueName: \"kubernetes.io/projected/3068ae0f-0404-452c-b0e0-c422d898e6b1-kube-api-access-pmftj\") pod \"node-resolver-jl74k\" (UID: \"3068ae0f-0404-452c-b0e0-c422d898e6b1\") " pod="openshift-dns/node-resolver-jl74k" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.731954 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.732024 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.731974 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.732134 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.732226 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:19 crc kubenswrapper[4971]: E0320 06:51:19.732397 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.746190 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jl74k" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.761118 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.761761 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.761786 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.761824 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.761847 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: W0320 06:51:19.768270 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3068ae0f_0404_452c_b0e0_c422d898e6b1.slice/crio-570f5d3ab39337c4af6a37ee8b67fe4edfa5ada3074de21da8c4407f7a478dcc WatchSource:0}: Error finding container 570f5d3ab39337c4af6a37ee8b67fe4edfa5ada3074de21da8c4407f7a478dcc: Status 404 returned error can't find the container with id 570f5d3ab39337c4af6a37ee8b67fe4edfa5ada3074de21da8c4407f7a478dcc Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.818258 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qmgrn"] Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.819552 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-m26zl"] Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.820028 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lwlhn"] Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.820393 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.820573 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.820644 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.829884 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.830659 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.831135 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.833011 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.833153 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.833305 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.834697 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.833404 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.833462 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.833472 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.833539 4971 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.833880 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.857403 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.888876 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.888951 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.888967 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.889000 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.889020 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.912031 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.951309 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.953755 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-multus-certs\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.953840 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qpwlj\" (UniqueName: \"kubernetes.io/projected/4be6e520-cae7-413a-bd91-de5f2de7f0b1-kube-api-access-qpwlj\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.953894 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-etc-kubernetes\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.953944 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.953993 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-proxy-tls\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954078 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-cni-multus\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954130 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-daemon-config\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954167 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954229 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-netns\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954268 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-conf-dir\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954383 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-mcd-auth-proxy-config\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:19 crc 
kubenswrapper[4971]: I0320 06:51:19.954440 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lwzk\" (UniqueName: \"kubernetes.io/projected/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-kube-api-access-7lwzk\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-os-release\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954502 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-cnibin\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954520 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-rootfs\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954538 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-hostroot\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 
06:51:19.954555 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jcv\" (UniqueName: \"kubernetes.io/projected/f11eaf57-f83a-4974-adb5-7a59b11555b0-kube-api-access-t2jcv\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954634 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-socket-dir-parent\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954723 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-cni-dir\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954776 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f11eaf57-f83a-4974-adb5-7a59b11555b0-cni-binary-copy\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954817 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 
06:51:19.954942 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-system-cni-dir\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.954999 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-os-release\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.955039 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-system-cni-dir\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.955106 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-k8s-cni-cncf-io\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.955127 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-cni-bin\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 
crc kubenswrapper[4971]: I0320 06:51:19.955147 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-kubelet\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.955174 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cnibin\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.981187 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.991389 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.991442 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.991452 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.991471 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.991482 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:19Z","lastTransitionTime":"2026-03-20T06:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:19 crc kubenswrapper[4971]: I0320 06:51:19.995272 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.014831 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.028439 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 
06:51:20.043135 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056160 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-k8s-cni-cncf-io\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056230 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-cni-bin\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056272 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-kubelet\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056276 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-k8s-cni-cncf-io\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056298 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cnibin\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056335 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-kubelet\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056321 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-multus-certs\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056361 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cnibin\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056367 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwlj\" (UniqueName: \"kubernetes.io/projected/4be6e520-cae7-413a-bd91-de5f2de7f0b1-kube-api-access-qpwlj\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056444 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-etc-kubernetes\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056480 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056484 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-multus-certs\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056501 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-proxy-tls\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056549 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-etc-kubernetes\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056645 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-cni-multus\") 
pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056668 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-daemon-config\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056686 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056714 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-netns\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056729 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-conf-dir\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-mcd-auth-proxy-config\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056765 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lwzk\" (UniqueName: \"kubernetes.io/projected/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-kube-api-access-7lwzk\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056791 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-os-release\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056814 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-cnibin\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056837 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-rootfs\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056833 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-cni-bin\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc 
kubenswrapper[4971]: I0320 06:51:20.056866 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jcv\" (UniqueName: \"kubernetes.io/projected/f11eaf57-f83a-4974-adb5-7a59b11555b0-kube-api-access-t2jcv\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056907 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-hostroot\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056958 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-socket-dir-parent\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.056984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-cni-dir\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057012 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f11eaf57-f83a-4974-adb5-7a59b11555b0-cni-binary-copy\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057054 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057081 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-system-cni-dir\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057128 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-os-release\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057153 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-system-cni-dir\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057195 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-var-lib-cni-multus\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057238 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-system-cni-dir\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057440 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057544 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-hostroot\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057591 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-socket-dir-parent\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057723 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-cni-dir\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057740 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-system-cni-dir\") pod \"multus-lwlhn\" (UID: 
\"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057771 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4be6e520-cae7-413a-bd91-de5f2de7f0b1-os-release\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057825 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-host-run-netns\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057844 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-rootfs\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057851 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-cnibin\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057896 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-conf-dir\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.057953 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f11eaf57-f83a-4974-adb5-7a59b11555b0-os-release\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.058312 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f11eaf57-f83a-4974-adb5-7a59b11555b0-multus-daemon-config\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.058543 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f11eaf57-f83a-4974-adb5-7a59b11555b0-cni-binary-copy\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.059011 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.059456 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4be6e520-cae7-413a-bd91-de5f2de7f0b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.060453 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-mcd-auth-proxy-config\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.061898 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.083273 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.094356 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.094386 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.094399 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.094418 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.094431 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.098847 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lwzk\" (UniqueName: \"kubernetes.io/projected/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-kube-api-access-7lwzk\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.099227 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.099401 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c96bfbc-3a6a-44df-9bf6-9f78c587657c-proxy-tls\") pod \"machine-config-daemon-m26zl\" (UID: \"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\") " pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: 
I0320 06:51:20.099416 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jcv\" (UniqueName: \"kubernetes.io/projected/f11eaf57-f83a-4974-adb5-7a59b11555b0-kube-api-access-t2jcv\") pod \"multus-lwlhn\" (UID: \"f11eaf57-f83a-4974-adb5-7a59b11555b0\") " pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.099666 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpwlj\" (UniqueName: \"kubernetes.io/projected/4be6e520-cae7-413a-bd91-de5f2de7f0b1-kube-api-access-qpwlj\") pod \"multus-additional-cni-plugins-qmgrn\" (UID: \"4be6e520-cae7-413a-bd91-de5f2de7f0b1\") " pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.118128 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.134038 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.149764 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.161351 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.171137 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: W0320 06:51:20.177148 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c96bfbc_3a6a_44df_9bf6_9f78c587657c.slice/crio-57765441e0197730a80a636dd1377cc4877f875d6442794966563b9e14913231 WatchSource:0}: Error finding container 57765441e0197730a80a636dd1377cc4877f875d6442794966563b9e14913231: Status 404 returned error can't find the container with id 57765441e0197730a80a636dd1377cc4877f875d6442794966563b9e14913231 Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.200074 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.200112 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.200121 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.200136 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.200148 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.202804 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.218673 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.226459 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jl74k" event={"ID":"3068ae0f-0404-452c-b0e0-c422d898e6b1","Type":"ContainerStarted","Data":"570f5d3ab39337c4af6a37ee8b67fe4edfa5ada3074de21da8c4407f7a478dcc"} Mar 20 06:51:20 crc 
kubenswrapper[4971]: I0320 06:51:20.228211 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"57765441e0197730a80a636dd1377cc4877f875d6442794966563b9e14913231"} Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.235955 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fhlwk"] Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.237211 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.240491 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.240758 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lwlhn" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.241227 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.241282 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.243000 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.243308 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.243619 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.243881 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.251741 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 06:51:20 crc kubenswrapper[4971]: W0320 06:51:20.256295 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4be6e520_cae7_413a_bd91_de5f2de7f0b1.slice/crio-c3bf2d16aa8f565d0c3aaa46663ebf3b953fe4082eb88afa2b3c7b7ef3395ac6 WatchSource:0}: Error finding container 
c3bf2d16aa8f565d0c3aaa46663ebf3b953fe4082eb88afa2b3c7b7ef3395ac6: Status 404 returned error can't find the container with id c3bf2d16aa8f565d0c3aaa46663ebf3b953fe4082eb88afa2b3c7b7ef3395ac6 Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.262955 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.290556 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.305366 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.305401 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.305412 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.305431 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 
06:51:20.305446 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.311153 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.330689 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.348513 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.360884 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-kubelet\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.360944 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-log-socket\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.360989 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361025 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-etc-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361048 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-config\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361082 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-ovn\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361105 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-env-overrides\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361136 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-netns\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361155 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-systemd\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361203 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-var-lib-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361224 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-script-lib\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361275 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-node-log\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361294 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-bin\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361317 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-netd\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361350 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-ovn-kubernetes\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361375 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc 
kubenswrapper[4971]: I0320 06:51:20.361400 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovn-node-metrics-cert\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361444 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7vn4\" (UniqueName: \"kubernetes.io/projected/7f624a08-810b-4915-b3bf-1e19d3e6cace-kube-api-access-j7vn4\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361466 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-slash\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.361490 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-systemd-units\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.363259 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.376749 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.394638 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.407947 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.408769 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.408818 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.408834 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.408864 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.408884 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.422342 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.437009 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.457422 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462127 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-log-socket\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462194 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-etc-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462279 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-log-socket\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462306 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-etc-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-config\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462313 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462454 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-ovn\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462506 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-ovn\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462530 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-env-overrides\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.462871 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-netns\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.463239 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-config\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.463424 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-env-overrides\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.463512 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-netns\") pod \"ovnkube-node-fhlwk\" (UID: 
\"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.463582 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-systemd\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.463707 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-systemd\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.463874 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-var-lib-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.463964 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-var-lib-openvswitch\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464016 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-script-lib\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464116 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-netd\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464207 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-netd\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464289 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-node-log\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464378 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-node-log\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464485 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-bin\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464538 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-ovn-kubernetes\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464585 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464650 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-script-lib\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464692 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-ovn-kubernetes\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464716 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc 
kubenswrapper[4971]: I0320 06:51:20.464646 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovn-node-metrics-cert\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464803 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7vn4\" (UniqueName: \"kubernetes.io/projected/7f624a08-810b-4915-b3bf-1e19d3e6cace-kube-api-access-j7vn4\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464854 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-slash\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464890 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-systemd-units\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464938 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-slash\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464954 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-kubelet\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.464989 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-systemd-units\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.465103 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-kubelet\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.465101 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-bin\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.468997 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovn-node-metrics-cert\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.480894 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:20Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.487978 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7vn4\" (UniqueName: \"kubernetes.io/projected/7f624a08-810b-4915-b3bf-1e19d3e6cace-kube-api-access-j7vn4\") pod \"ovnkube-node-fhlwk\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.512939 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.512991 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.513003 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.513022 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.513035 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.613313 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.618209 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.618252 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.618262 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.618277 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.618287 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:20 crc kubenswrapper[4971]: W0320 06:51:20.633872 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f624a08_810b_4915_b3bf_1e19d3e6cace.slice/crio-8395fd76aef46e29db331cfcb866ba29a8fcb755dff4a6f9919b871fb3a5356f WatchSource:0}: Error finding container 8395fd76aef46e29db331cfcb866ba29a8fcb755dff4a6f9919b871fb3a5356f: Status 404 returned error can't find the container with id 8395fd76aef46e29db331cfcb866ba29a8fcb755dff4a6f9919b871fb3a5356f Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.720928 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.720977 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.720989 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.721005 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.721018 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.824360 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.824430 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.824441 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.824456 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.824492 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.927364 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.927428 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.927444 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.927469 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:20 crc kubenswrapper[4971]: I0320 06:51:20.927483 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:20Z","lastTransitionTime":"2026-03-20T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.030399 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.030474 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.030489 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.030520 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.030563 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.134276 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.134351 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.134370 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.134398 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.134420 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.233983 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.234059 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.235883 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd" exitCode=0 Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.235931 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.236007 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"8395fd76aef46e29db331cfcb866ba29a8fcb755dff4a6f9919b871fb3a5356f"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.236187 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.236211 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 
06:51:21.236222 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.236241 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.236253 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.237786 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jl74k" event={"ID":"3068ae0f-0404-452c-b0e0-c422d898e6b1","Type":"ContainerStarted","Data":"dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.238817 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lwlhn" event={"ID":"f11eaf57-f83a-4974-adb5-7a59b11555b0","Type":"ContainerStarted","Data":"93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.238842 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lwlhn" event={"ID":"f11eaf57-f83a-4974-adb5-7a59b11555b0","Type":"ContainerStarted","Data":"c0a94e586f4c27ad61bba60d7bbc3b60d843c7d4e6451cff76044bbfacc8a6aa"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.240748 4971 generic.go:334] "Generic (PLEG): container finished" podID="4be6e520-cae7-413a-bd91-de5f2de7f0b1" containerID="7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515" exitCode=0 Mar 20 06:51:21 crc 
kubenswrapper[4971]: I0320 06:51:21.240776 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerDied","Data":"7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.240811 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerStarted","Data":"c3bf2d16aa8f565d0c3aaa46663ebf3b953fe4082eb88afa2b3c7b7ef3395ac6"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.263790 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.291285 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.307947 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.333352 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2
281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.340491 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.340530 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.340543 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc 
kubenswrapper[4971]: I0320 06:51:21.340562 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.340573 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.359632 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.381819 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.401965 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 
06:51:21.419502 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.436391 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.443635 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.443685 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.443697 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.443718 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.443730 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.453867 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.477177 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.477326 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.477354 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:21 
crc kubenswrapper[4971]: I0320 06:51:21.477377 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.477399 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.477518 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.477582 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:25.477564384 +0000 UTC m=+107.457438522 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.478039 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.478076 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.478091 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.478166 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:25.47814621 +0000 UTC m=+107.458020358 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.478403 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:51:25.478370656 +0000 UTC m=+107.458244804 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.478874 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.479424 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:25.479051044 +0000 UTC m=+107.458925192 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.479673 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.479697 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.479714 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.479757 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:25.479743452 +0000 UTC m=+107.459617600 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.480632 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.498385 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.520248 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.540293 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.547494 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc 
kubenswrapper[4971]: I0320 06:51:21.547553 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.547568 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.547587 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.547620 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.569706 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.594382 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.608965 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.622880 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.636204 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.650772 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.651059 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.651096 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.651107 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.651122 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.651132 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.669772 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.685546 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:21Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.733889 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.734019 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.734149 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.734507 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.735236 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:21 crc kubenswrapper[4971]: E0320 06:51:21.735528 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.754215 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.754274 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.754296 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.754324 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.754345 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.858641 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.858710 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.858732 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.859076 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.859256 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.962070 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.962531 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.962550 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.962575 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:21 crc kubenswrapper[4971]: I0320 06:51:21.962591 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:21Z","lastTransitionTime":"2026-03-20T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.065098 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.065157 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.065206 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.065225 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.065236 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.167508 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.167546 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.167555 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.167569 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.167582 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.245078 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.248811 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.248864 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.248885 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.248900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.248914 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} Mar 20 06:51:22 crc 
kubenswrapper[4971]: I0320 06:51:22.254961 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerStarted","Data":"d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.261728 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.270258 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.270288 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 
06:51:22.270297 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.270326 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.270339 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.284851 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b9
4dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.307579 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.325504 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.341139 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.362839 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.373855 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.373910 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.373922 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.373944 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.373960 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.376460 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.394354 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.414241 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.430466 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.455702 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.472431 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.476649 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.476691 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.476704 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.476722 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.476738 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.507892 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.531801 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.567575 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.579897 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.579930 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.579939 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.579957 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.579967 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.582975 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.596302 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.613431 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.624845 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.636310 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.644948 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.654839 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:22Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.683734 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.683796 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.683816 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.683847 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.683868 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.750924 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.786772 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.786825 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.786835 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.786851 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.786862 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.892291 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.892372 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.892394 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.892421 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.892440 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.995366 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.995404 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.995412 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.995429 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:22 crc kubenswrapper[4971]: I0320 06:51:22.995441 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:22Z","lastTransitionTime":"2026-03-20T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.099483 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.099589 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.099650 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.099694 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.099719 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.203507 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.203599 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.203659 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.203689 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.203709 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.262091 4971 generic.go:334] "Generic (PLEG): container finished" podID="4be6e520-cae7-413a-bd91-de5f2de7f0b1" containerID="d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64" exitCode=0 Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.262227 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerDied","Data":"d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.274968 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.286351 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.303056 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.307823 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.307877 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.307894 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.307920 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.307940 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.321121 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.347229 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.363905 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.381816 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.406426 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.409802 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.409935 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.410037 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.410122 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.410201 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.422536 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.440743 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.464787 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.484886 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.501400 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:23Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.512960 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.512998 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.513010 4971 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.513029 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.513041 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.617744 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.617809 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.617829 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.617856 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.617873 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.720929 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.720994 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.721012 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.721034 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.721047 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.731212 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.731256 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:23 crc kubenswrapper[4971]: E0320 06:51:23.731355 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:23 crc kubenswrapper[4971]: E0320 06:51:23.731509 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.731704 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:23 crc kubenswrapper[4971]: E0320 06:51:23.731800 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.823954 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.824002 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.824016 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.824037 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.824052 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.927043 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.927089 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.927104 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.927123 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:23 crc kubenswrapper[4971]: I0320 06:51:23.927135 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:23Z","lastTransitionTime":"2026-03-20T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.030942 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.030985 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.030993 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.031012 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.031025 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.133396 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.133434 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.133443 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.133457 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.133469 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.236417 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.236511 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.236531 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.236566 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.236590 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.283029 4971 generic.go:334] "Generic (PLEG): container finished" podID="4be6e520-cae7-413a-bd91-de5f2de7f0b1" containerID="b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b" exitCode=0 Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.283103 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerDied","Data":"b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.299861 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.319539 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.334571 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.340194 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.340240 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.340252 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.340272 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.340285 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.352575 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.363519 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.382381 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 
06:51:24.413266 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.427021 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.443757 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.443785 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.443794 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.443810 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.443822 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.447523 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z 
is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.521061 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.537943 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.546047 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.546091 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.546103 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.546120 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.546133 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.552537 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:24Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.648886 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.648934 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.648945 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.648964 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.648978 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.751917 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.751975 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.751989 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.752006 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.752020 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.854700 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.854751 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.854758 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.854775 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.854786 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.959007 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.959131 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.959158 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.959189 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:24 crc kubenswrapper[4971]: I0320 06:51:24.959211 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:24Z","lastTransitionTime":"2026-03-20T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.063186 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.063259 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.063283 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.063310 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.063329 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.166741 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.166832 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.166855 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.166968 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.166996 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.271419 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.271477 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.271494 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.271519 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.271536 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.315329 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.319368 4971 generic.go:334] "Generic (PLEG): container finished" podID="4be6e520-cae7-413a-bd91-de5f2de7f0b1" containerID="1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf" exitCode=0 Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.319420 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerDied","Data":"1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.347937 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.372405 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.375200 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.375288 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.375309 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.375342 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.375366 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.392143 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.411767 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.438859 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.462193 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.489715 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.489808 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.489830 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.489858 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.489890 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.491627 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.517678 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.523568 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.523681 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.523706 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.523727 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.523750 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.523848 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.523895 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:33.523880628 +0000 UTC m=+115.503754766 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524184 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:51:33.524177176 +0000 UTC m=+115.504051314 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524224 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524248 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:33.524242547 +0000 UTC m=+115.504116685 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524307 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524317 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524326 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524349 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:33.52434314 +0000 UTC m=+115.504217268 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524389 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524401 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524409 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.524428 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:33.524421532 +0000 UTC m=+115.504295670 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.535813 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mu
ltus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"st
artTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.563568 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.612953 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.612992 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.613002 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.613020 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.613031 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.614512 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.627811 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:25Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.716622 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.716668 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.716680 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.716701 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.716715 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.731809 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.731929 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.732074 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.732083 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.732513 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.732589 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.746245 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.747366 4971 scope.go:117] "RemoveContainer" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706" Mar 20 06:51:25 crc kubenswrapper[4971]: E0320 06:51:25.748040 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.820418 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.820491 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.820511 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.820542 4971 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.820561 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.924754 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.924845 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.924864 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.924899 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:25 crc kubenswrapper[4971]: I0320 06:51:25.924922 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:25Z","lastTransitionTime":"2026-03-20T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.032637 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.032681 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.032693 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.032715 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.032729 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.136432 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.136488 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.136501 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.136525 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.136539 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.239897 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.240193 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.240283 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.240440 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.240538 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.328056 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerStarted","Data":"c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.329271 4971 scope.go:117] "RemoveContainer" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706" Mar 20 06:51:26 crc kubenswrapper[4971]: E0320 06:51:26.329928 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.343732 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.343783 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.343827 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.343853 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.343872 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.350927 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\
\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.378029 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kfnsf"] Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.378831 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.379372 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.381025 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.382533 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.382677 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.382951 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.401246 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.421212 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.440469 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.446203 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.446283 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.446303 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.446332 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.446350 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.457542 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.476424 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.495665 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.514381 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.539115 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-serviceca\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.539716 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d788\" (UniqueName: \"kubernetes.io/projected/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-kube-api-access-7d788\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.539815 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-host\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.541797 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpw
lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.549600 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.549666 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.549681 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.549704 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.549718 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.569900 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.569959 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.569979 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.570005 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.570023 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.580561 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: E0320 06:51:26.591321 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.603168 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.603229 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.603244 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.603281 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.603304 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.606522 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: E0320 06:51:26.622382 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.627998 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.628067 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.628082 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.628106 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.628121 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.628952 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.640640 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-host\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.640919 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-serviceca\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.640946 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-host\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.640993 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d788\" (UniqueName: \"kubernetes.io/projected/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-kube-api-access-7d788\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " 
pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.642307 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-serviceca\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.647487 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: E0320 06:51:26.650836 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.699077 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.699137 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.699150 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.699174 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.699237 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.699262 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.711717 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d788\" (UniqueName: \"kubernetes.io/projected/7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c-kube-api-access-7d788\") pod \"node-ca-kfnsf\" (UID: \"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\") " pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: E0320 06:51:26.721593 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.725172 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.727432 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.727471 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.727483 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc 
kubenswrapper[4971]: I0320 06:51:26.727505 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.727520 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.739142 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: E0320 06:51:26.745342 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: E0320 06:51:26.745669 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.748934 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.748986 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.748999 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.749025 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.749039 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.756537 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.763002 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kfnsf" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.784502 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Comp
leted\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: W0320 06:51:26.792341 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d21aee3_3c87_4b06_9a9b_f88d3acb6d3c.slice/crio-bfd3491ef32f3772dd9a498f7a062f50d40e53a165bc6490bf0a77d9120ff6a2 WatchSource:0}: Error finding container bfd3491ef32f3772dd9a498f7a062f50d40e53a165bc6490bf0a77d9120ff6a2: Status 404 returned error can't find the container with id bfd3491ef32f3772dd9a498f7a062f50d40e53a165bc6490bf0a77d9120ff6a2 Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.804183 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.832491 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpw
lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.851685 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.851733 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.851743 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.851763 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.851775 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.858778 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.883945 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.908871 4971 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.929693 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.948626 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.955388 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:26 crc 
kubenswrapper[4971]: I0320 06:51:26.955441 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.955452 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.955471 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.955492 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:26Z","lastTransitionTime":"2026-03-20T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:26 crc kubenswrapper[4971]: I0320 06:51:26.986531 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:26Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.059839 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.059896 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.059915 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.059944 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.059966 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.163166 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.163232 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.163243 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.163264 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.163276 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.266985 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.267045 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.267058 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.267084 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.267103 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.342809 4971 generic.go:334] "Generic (PLEG): container finished" podID="4be6e520-cae7-413a-bd91-de5f2de7f0b1" containerID="c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5" exitCode=0 Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.342908 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerDied","Data":"c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.356598 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kfnsf" event={"ID":"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c","Type":"ContainerStarted","Data":"20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.356683 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kfnsf" event={"ID":"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c","Type":"ContainerStarted","Data":"bfd3491ef32f3772dd9a498f7a062f50d40e53a165bc6490bf0a77d9120ff6a2"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.375371 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.375530 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.375584 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.375666 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.375750 4971 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.379870 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.381425 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.381878 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.381906 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.399769 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.412314 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.426700 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.428789 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.454712 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.470073 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.478918 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.479082 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.479185 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.479282 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.479388 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.493195 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.507890 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.526995 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.541233 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.560877 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.576429 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.581287 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.581336 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.581348 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.581366 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.581377 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.591857 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:
51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.613423 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.630259 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.646201 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.659769 4971 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.674701 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.684702 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.684780 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.684801 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.684835 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.684859 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.688334 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:
51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.723294 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.731979 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.732016 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.732090 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:27 crc kubenswrapper[4971]: E0320 06:51:27.732501 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:27 crc kubenswrapper[4971]: E0320 06:51:27.732247 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:27 crc kubenswrapper[4971]: E0320 06:51:27.732642 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.742144 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.759327 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.776528 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.788411 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.788488 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.788509 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.788545 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.788567 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.792549 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.815116 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.834795 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.849941 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.869360 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:27Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.892892 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.892935 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.892947 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.892970 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.892988 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.997039 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.997138 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.997168 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.997209 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:27 crc kubenswrapper[4971]: I0320 06:51:27.997237 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:27Z","lastTransitionTime":"2026-03-20T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.101494 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.101587 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.101659 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.101691 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.101716 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.204636 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.204684 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.204695 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.204719 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.204730 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.308489 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.308831 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.308966 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.309136 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.309265 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.389178 4971 generic.go:334] "Generic (PLEG): container finished" podID="4be6e520-cae7-413a-bd91-de5f2de7f0b1" containerID="295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae" exitCode=0 Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.389256 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerDied","Data":"295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.389774 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.412121 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.412193 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.412228 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.412256 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.412273 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.418921 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.421312 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.441811 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.467631 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.484324 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-2
0T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.500467 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.517496 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.518788 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc 
kubenswrapper[4971]: I0320 06:51:28.518837 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.518855 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.518879 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.518900 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.532327 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.544928 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.562839 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.577599 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.595577 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.610446 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.622052 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.622121 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.622152 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.622183 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.622207 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.636532 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.653089 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.669660 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.685333 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.698773 4971 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.712794 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.724054 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc 
kubenswrapper[4971]: I0320 06:51:28.724097 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.724108 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.724125 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.724139 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.742017 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.759263 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.778122 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.793408 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.806313 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.818114 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.826688 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.826732 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.826749 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.826771 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.826788 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.829642 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.848374 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.858705 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.886252 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06
:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.898576 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.909436 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.918963 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.929048 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:28 crc 
kubenswrapper[4971]: I0320 06:51:28.929089 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.929100 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.929115 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.929126 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:28Z","lastTransitionTime":"2026-03-20T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.948072 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.962440 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.975928 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.986030 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:28 crc kubenswrapper[4971]: I0320 06:51:28.995869 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.015490 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.031177 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.031874 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.031960 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.031981 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.032010 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.032030 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.046709 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.059625 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.076875 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.102190 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.134845 4971 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.134908 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.134920 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.134948 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.134962 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.238062 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.238106 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.238117 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.238136 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.238148 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.340788 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.340839 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.340850 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.340869 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.340882 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.397780 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" event={"ID":"4be6e520-cae7-413a-bd91-de5f2de7f0b1","Type":"ContainerStarted","Data":"4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.414144 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.430785 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.443256 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.443317 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.443336 4971 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.443361 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.443380 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.445748 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d7
5b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.469849 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.485855 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.500376 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.515131 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.531188 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.546035 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.546088 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.546099 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.546116 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.546131 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.553537 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.575981 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.601900 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.616025 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.631180 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.649716 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.651461 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.651531 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.651540 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.651555 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.651567 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.731821 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.731899 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:29 crc kubenswrapper[4971]: E0320 06:51:29.731951 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.731821 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:29 crc kubenswrapper[4971]: E0320 06:51:29.732161 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:29 crc kubenswrapper[4971]: E0320 06:51:29.732205 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.754451 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.754508 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.754526 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.754553 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.754573 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.856776 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.856821 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.856834 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.856850 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.856861 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.965634 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.965681 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.965691 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.965705 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:29 crc kubenswrapper[4971]: I0320 06:51:29.965717 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:29Z","lastTransitionTime":"2026-03-20T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.069352 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.069426 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.069444 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.069472 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.069491 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.173138 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.173200 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.173216 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.173240 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.173258 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.277201 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.277270 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.277286 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.277311 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.277330 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.380662 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.380741 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.380757 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.380783 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.380803 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.403461 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/0.log" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.408085 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa" exitCode=1 Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.408126 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.409348 4971 scope.go:117] "RemoveContainer" containerID="839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.428477 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.449785 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.467055 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.484003 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc 
kubenswrapper[4971]: I0320 06:51:30.484041 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.484052 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.484069 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.484081 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.500952 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06
:51:29Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900680 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:51:29.900786 6733 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:51:29.900798 6733 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900931 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:29.900978 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:29.901021 6733 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:29.901040 6733 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:29.901091 6733 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:29.901119 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:51:29.901134 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:29.901172 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:29.901524 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900857 6733 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.901796 6733 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.530164 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.552837 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.570710 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.587518 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.587583 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.587646 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.587684 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.587708 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.595167 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.635744 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.658815 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.682033 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6e
bffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.689938 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.689986 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.690000 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.690019 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.690032 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.700206 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.715728 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.738221 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:30Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.793802 4971 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.793871 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.793888 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.793912 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.793930 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.896984 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.897021 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.897032 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.897047 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.897057 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.999527 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.999582 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.999594 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.999635 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:30 crc kubenswrapper[4971]: I0320 06:51:30.999650 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:30Z","lastTransitionTime":"2026-03-20T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.103187 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.103311 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.103333 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.103384 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.103410 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.205480 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.205526 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.205538 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.205558 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.205572 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.308961 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.309009 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.309019 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.309038 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.309053 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.412776 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.412842 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.412866 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.412898 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.412921 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.421231 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/0.log" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.427313 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.428021 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.464600 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f
3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.484904 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.512091 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.515918 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.516008 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.516039 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.516076 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.516106 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.528168 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.546730 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.566686 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.587046 4971 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.610107 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.619411 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.619510 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.619542 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.619574 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.619600 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.673770 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:
51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.710260 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900680 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:51:29.900786 6733 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:51:29.900798 6733 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900931 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:29.900978 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:29.901021 6733 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:29.901040 6733 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:29.901091 6733 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:29.901119 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:51:29.901134 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:29.901172 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:29.901524 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900857 6733 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.901796 6733 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.722211 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.722250 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.722260 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.722276 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.722286 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.727060 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.731193 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.731244 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.731276 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:31 crc kubenswrapper[4971]: E0320 06:51:31.731288 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:31 crc kubenswrapper[4971]: E0320 06:51:31.731422 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:31 crc kubenswrapper[4971]: E0320 06:51:31.731514 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.741965 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.752410 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.763005 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:31Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.824114 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.824171 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.824183 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.824199 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.824210 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.927027 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.927070 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.927079 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.927095 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:31 crc kubenswrapper[4971]: I0320 06:51:31.927105 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:31Z","lastTransitionTime":"2026-03-20T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.030269 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.030317 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.030327 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.030340 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.030349 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.133959 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.134056 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.134078 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.134107 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.134129 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.236891 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.236983 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.237011 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.237046 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.237069 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.340294 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.340360 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.340378 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.340406 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.340424 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.434716 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/1.log" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.435762 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/0.log" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.440065 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998" exitCode=1 Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.440128 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.440188 4971 scope.go:117] "RemoveContainer" containerID="839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.441590 4971 scope.go:117] "RemoveContainer" containerID="8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998" Mar 20 06:51:32 crc kubenswrapper[4971]: E0320 06:51:32.441960 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.442478 4971 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.442528 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.442541 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.442563 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.442580 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.499722 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.506226 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn"] Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.507058 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.511470 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.512069 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.525685 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.547786 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.547825 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.547835 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.547855 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.547868 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.555537 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.573707 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.598808 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55
125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.613672 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.613766 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.613805 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.613838 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2ss\" (UniqueName: \"kubernetes.io/projected/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-kube-api-access-zb2ss\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.614233 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 
2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.628978 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.644076 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.658028 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.658121 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.658165 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.658205 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.658228 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.659844 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:
51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.686207 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900680 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:51:29.900786 6733 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:51:29.900798 6733 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900931 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:29.900978 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:29.901021 6733 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:29.901040 6733 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:29.901091 6733 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:29.901119 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:51:29.901134 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:29.901172 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:29.901524 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900857 6733 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.901796 6733 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 
model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 
0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4
cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.701748 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.715007 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.715388 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.715459 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.715497 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.715539 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2ss\" (UniqueName: \"kubernetes.io/projected/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-kube-api-access-zb2ss\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.716435 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.716663 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.726056 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc 
kubenswrapper[4971]: I0320 06:51:32.728115 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.737667 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2ss\" (UniqueName: \"kubernetes.io/projected/f219b30c-9597-4a2c-a6d4-33cd07c2ff11-kube-api-access-zb2ss\") pod \"ovnkube-control-plane-749d76644c-j57qn\" (UID: \"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.741852 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2
281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.756002 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2
281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.761008 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.761055 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.761070 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc 
kubenswrapper[4971]: I0320 06:51:32.761093 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.761106 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.772732 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.787772 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.808910 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.826029 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.832275 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.842291 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: W0320 06:51:32.845445 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf219b30c_9597_4a2c_a6d4_33cd07c2ff11.slice/crio-44e2726819cf1344384ab4958f7c706155fed6e10d45730b962910f0585edee6 WatchSource:0}: Error finding container 44e2726819cf1344384ab4958f7c706155fed6e10d45730b962910f0585edee6: Status 404 returned error can't find the container with id 44e2726819cf1344384ab4958f7c706155fed6e10d45730b962910f0585edee6 Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 
06:51:32.864706 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.864772 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.864788 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.864806 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.864820 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.869975 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.890960 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.911885 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.931499 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.952682 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.969947 4971 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.970036 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.970062 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.970096 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.970122 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:32Z","lastTransitionTime":"2026-03-20T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.976434 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:32 crc kubenswrapper[4971]: I0320 06:51:32.995427 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:32Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.013106 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.042234 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900680 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:51:29.900786 6733 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:51:29.900798 6733 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900931 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:29.900978 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:29.901021 6733 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:29.901040 6733 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:29.901091 6733 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:29.901119 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:51:29.901134 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:29.901172 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:29.901524 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900857 6733 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.901796 6733 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 
model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 
0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4
cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.073475 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.073523 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.073538 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.073557 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.073569 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.176062 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.176121 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.176139 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.176159 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.176172 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.242721 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6vl68"] Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.243208 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.243293 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.278525 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.278591 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.278645 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.278690 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.278716 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.286120 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.307246 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.324537 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.324626 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cql2d\" (UniqueName: \"kubernetes.io/projected/6ac992f1-bd05-471f-a101-cf14466e15e8-kube-api-access-cql2d\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.330259 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6e
bffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.345651 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.377830 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55
125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.381455 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.381494 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.381509 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.381547 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.381560 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.401130 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e2682
99d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.424410 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.425802 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.425871 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cql2d\" (UniqueName: \"kubernetes.io/projected/6ac992f1-bd05-471f-a101-cf14466e15e8-kube-api-access-cql2d\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.426244 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.426303 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs podName:6ac992f1-bd05-471f-a101-cf14466e15e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:33.926286716 +0000 UTC m=+115.906160864 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs") pod "network-metrics-daemon-6vl68" (UID: "6ac992f1-bd05-471f-a101-cf14466e15e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.439112 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.445248 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" event={"ID":"f219b30c-9597-4a2c-a6d4-33cd07c2ff11","Type":"ContainerStarted","Data":"0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.445294 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" event={"ID":"f219b30c-9597-4a2c-a6d4-33cd07c2ff11","Type":"ContainerStarted","Data":"dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.445308 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" 
event={"ID":"f219b30c-9597-4a2c-a6d4-33cd07c2ff11","Type":"ContainerStarted","Data":"44e2726819cf1344384ab4958f7c706155fed6e10d45730b962910f0585edee6"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.447102 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/1.log" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.452039 4971 scope.go:117] "RemoveContainer" containerID="8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.452248 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.452519 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.453536 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cql2d\" (UniqueName: 
\"kubernetes.io/projected/6ac992f1-bd05-471f-a101-cf14466e15e8-kube-api-access-cql2d\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.476220 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://839f28f29f3f35b4feb6b49064638cf8e0b348963fb75d9c4b8f8f38e969effa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900680 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:51:29.900786 6733 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:51:29.900798 6733 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900931 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:29.900978 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:29.901021 6733 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:29.901040 6733 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:29.901091 6733 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:29.901119 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:51:29.901134 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:29.901172 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:29.901524 6733 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.900857 6733 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:29.901796 6733 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 
model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 
0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4
cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.483913 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.483950 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.483965 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.483982 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.483994 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.491321 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc 
kubenswrapper[4971]: I0320 06:51:33.509766 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.521481 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.527823 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.527932 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.527963 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.528008 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:51:49.527969784 +0000 UTC m=+131.507843932 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.528053 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.528196 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.528222 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:49.528169359 +0000 UTC m=+131.508043507 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.528399 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.528449 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.528537 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:49.528510238 +0000 UTC m=+131.508384426 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.530456 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.530498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.530755 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.530795 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:49.530784898 +0000 UTC m=+131.510659046 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.530852 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.530867 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.530877 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.530903 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:49.530895291 +0000 UTC m=+131.510769439 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.532569 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resol
ver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.543147 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2
281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.556954 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.569601 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.586274 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.587420 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.587481 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.587501 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc 
kubenswrapper[4971]: I0320 06:51:33.587529 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.587551 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.598524 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.611530 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2
281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.624122 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127
fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.635432 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc 
kubenswrapper[4971]: I0320 06:51:33.657478 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.673575 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.690864 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.690922 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.690941 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.690968 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.690988 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.695045 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.709348 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.727871 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.731277 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.731328 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.731281 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.731419 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.731588 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.731728 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.744095 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.758642 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.773366 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.791086 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.793205 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc 
kubenswrapper[4971]: I0320 06:51:33.793266 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.793282 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.793299 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.793348 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.817852 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:33Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.896495 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.896579 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.896644 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.896674 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.896694 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.934421 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.934662 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: E0320 06:51:33.934781 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs podName:6ac992f1-bd05-471f-a101-cf14466e15e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:34.934754308 +0000 UTC m=+116.914628486 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs") pod "network-metrics-daemon-6vl68" (UID: "6ac992f1-bd05-471f-a101-cf14466e15e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.999361 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.999422 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.999441 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.999470 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:33 crc kubenswrapper[4971]: I0320 06:51:33.999491 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:33Z","lastTransitionTime":"2026-03-20T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.102081 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.102153 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.102185 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.102218 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.102283 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.206339 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.206440 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.206466 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.206497 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.206519 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.310526 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.310600 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.310650 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.310673 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.310692 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.413541 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.413597 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.413648 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.413672 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.413693 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.517750 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.517815 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.517833 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.517856 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.517874 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.621256 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.621314 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.621332 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.621357 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.621375 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.724152 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.724319 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.724351 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.724380 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.724401 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.731689 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:34 crc kubenswrapper[4971]: E0320 06:51:34.731879 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.827674 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.827730 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.827748 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.827770 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.827788 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.930876 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.930941 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.930967 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.930998 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.931019 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:34Z","lastTransitionTime":"2026-03-20T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:34 crc kubenswrapper[4971]: I0320 06:51:34.947687 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:34 crc kubenswrapper[4971]: E0320 06:51:34.947876 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:34 crc kubenswrapper[4971]: E0320 06:51:34.947944 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs podName:6ac992f1-bd05-471f-a101-cf14466e15e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:36.947926294 +0000 UTC m=+118.927800442 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs") pod "network-metrics-daemon-6vl68" (UID: "6ac992f1-bd05-471f-a101-cf14466e15e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.033668 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.033726 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.033746 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.033771 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.033788 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.137430 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.137485 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.137498 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.137515 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.137526 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.240661 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.240713 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.240732 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.240756 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.240775 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.344157 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.344233 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.344249 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.344273 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.344293 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.447377 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.447441 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.447458 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.447484 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.447501 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.550800 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.550852 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.550865 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.550886 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.550899 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.654706 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.654794 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.654821 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.654853 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.654876 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.732166 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.732286 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:35 crc kubenswrapper[4971]: E0320 06:51:35.732436 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.732305 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:35 crc kubenswrapper[4971]: E0320 06:51:35.732557 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:35 crc kubenswrapper[4971]: E0320 06:51:35.732778 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.757310 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.757374 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.757387 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.757403 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.757415 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.860871 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.860922 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.860941 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.860963 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.860981 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.963149 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.963202 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.963216 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.963236 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:35 crc kubenswrapper[4971]: I0320 06:51:35.963257 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:35Z","lastTransitionTime":"2026-03-20T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.066359 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.066407 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.066419 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.066437 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.066450 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.169302 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.169337 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.169345 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.169358 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.169367 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.271788 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.272586 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.272655 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.272675 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.272688 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.375407 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.375473 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.375491 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.375514 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.375531 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.478266 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.478329 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.478346 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.478368 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.478390 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.581512 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.581581 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.581598 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.581669 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.581688 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.684553 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.684670 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.684691 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.684721 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.684769 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.732318 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:36 crc kubenswrapper[4971]: E0320 06:51:36.732563 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.787863 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.788247 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.788383 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.788531 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.788731 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.868232 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.868804 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.869007 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.869184 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.869338 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: E0320 06:51:36.890811 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:36Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.896538 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.896639 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.896659 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.896687 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.896710 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: E0320 06:51:36.914886 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:36Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.921217 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.921286 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.921313 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.921425 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.921460 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: E0320 06:51:36.943387 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:36Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.951902 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.951979 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.952049 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.952081 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.952100 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.971268 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:36 crc kubenswrapper[4971]: E0320 06:51:36.971427 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:36 crc kubenswrapper[4971]: E0320 06:51:36.974106 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs podName:6ac992f1-bd05-471f-a101-cf14466e15e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:40.974032818 +0000 UTC m=+122.953906986 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs") pod "network-metrics-daemon-6vl68" (UID: "6ac992f1-bd05-471f-a101-cf14466e15e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:36 crc kubenswrapper[4971]: E0320 06:51:36.974665 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-c
ecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:36Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.982203 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.982429 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.982573 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.982779 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:36 crc kubenswrapper[4971]: I0320 06:51:36.982975 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:36Z","lastTransitionTime":"2026-03-20T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: E0320 06:51:37.001923 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:36Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:37 crc kubenswrapper[4971]: E0320 06:51:37.002444 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.005557 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.005709 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.005991 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.006312 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.006362 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.110285 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.110349 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.110366 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.110392 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.110412 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.213095 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.213150 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.213161 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.213181 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.213194 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.317054 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.317117 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.317133 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.317160 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.317181 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.420718 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.420775 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.420792 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.420819 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.420839 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.524899 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.524974 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.524993 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.525024 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.525043 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.628638 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.628695 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.628709 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.628736 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.628754 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.731942 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.732379 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.732453 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.732878 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.732906 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.732923 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.732948 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.732967 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: E0320 06:51:37.733573 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:37 crc kubenswrapper[4971]: E0320 06:51:37.733726 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:37 crc kubenswrapper[4971]: E0320 06:51:37.733905 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.837022 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.837092 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.837113 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.837140 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.837159 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.940028 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.940115 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.940140 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.940174 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:37 crc kubenswrapper[4971]: I0320 06:51:37.940200 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:37Z","lastTransitionTime":"2026-03-20T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.043523 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.043574 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.043585 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.043602 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.043636 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:38Z","lastTransitionTime":"2026-03-20T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.146684 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.146756 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.146776 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.146806 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.146830 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:38Z","lastTransitionTime":"2026-03-20T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.256907 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.256969 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.256988 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.257015 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.257036 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:38Z","lastTransitionTime":"2026-03-20T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.360486 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.360546 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.360566 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.360596 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.360647 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:38Z","lastTransitionTime":"2026-03-20T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.463576 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.463653 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.463666 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.463687 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.463702 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:38Z","lastTransitionTime":"2026-03-20T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.566426 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.566488 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.566534 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.566560 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.566577 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:38Z","lastTransitionTime":"2026-03-20T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:38 crc kubenswrapper[4971]: E0320 06:51:38.667758 4971 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.731369 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:38 crc kubenswrapper[4971]: E0320 06:51:38.731680 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.755237 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.774183 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.789741 4971 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.806062 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.831658 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: E0320 06:51:38.834992 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.853463 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.878094 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.896841 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.920275 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.936992 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.956321 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc kubenswrapper[4971]: I0320 06:51:38.975132 4971 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:38 crc 
kubenswrapper[4971]: I0320 06:51:38.996856 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:38Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:39 crc kubenswrapper[4971]: I0320 06:51:39.023412 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:39Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:39 crc kubenswrapper[4971]: I0320 06:51:39.041868 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568
f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:39Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:39 crc kubenswrapper[4971]: I0320 06:51:39.070199 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:39Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:39 crc kubenswrapper[4971]: I0320 06:51:39.731910 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:39 crc kubenswrapper[4971]: E0320 06:51:39.732113 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:39 crc kubenswrapper[4971]: I0320 06:51:39.733178 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:39 crc kubenswrapper[4971]: I0320 06:51:39.733237 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:39 crc kubenswrapper[4971]: E0320 06:51:39.733357 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:39 crc kubenswrapper[4971]: E0320 06:51:39.733947 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:39 crc kubenswrapper[4971]: I0320 06:51:39.735166 4971 scope.go:117] "RemoveContainer" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.480577 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.482204 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707"} Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.483000 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:51:40 
crc kubenswrapper[4971]: I0320 06:51:40.509461 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928
e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.529678 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.556832 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.573778 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.597370 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.620599 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.668439 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.709009 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.721423 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.731639 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:40 crc kubenswrapper[4971]: E0320 06:51:40.731750 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.741037 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.755550 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.770303 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.783775 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.808293 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.822385 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:40 crc kubenswrapper[4971]: I0320 06:51:40.835934 4971 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:40Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:41 crc 
kubenswrapper[4971]: I0320 06:51:41.024377 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:41 crc kubenswrapper[4971]: E0320 06:51:41.024524 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:41 crc kubenswrapper[4971]: E0320 06:51:41.024580 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs podName:6ac992f1-bd05-471f-a101-cf14466e15e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:51:49.024566623 +0000 UTC m=+131.004440761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs") pod "network-metrics-daemon-6vl68" (UID: "6ac992f1-bd05-471f-a101-cf14466e15e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:41 crc kubenswrapper[4971]: I0320 06:51:41.731288 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:41 crc kubenswrapper[4971]: I0320 06:51:41.731351 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:41 crc kubenswrapper[4971]: E0320 06:51:41.731486 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:41 crc kubenswrapper[4971]: E0320 06:51:41.731758 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:41 crc kubenswrapper[4971]: I0320 06:51:41.731886 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:41 crc kubenswrapper[4971]: E0320 06:51:41.732238 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:42 crc kubenswrapper[4971]: I0320 06:51:42.731897 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:42 crc kubenswrapper[4971]: E0320 06:51:42.732101 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:43 crc kubenswrapper[4971]: I0320 06:51:43.731663 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:43 crc kubenswrapper[4971]: I0320 06:51:43.731718 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:43 crc kubenswrapper[4971]: I0320 06:51:43.732054 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:43 crc kubenswrapper[4971]: E0320 06:51:43.732156 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:43 crc kubenswrapper[4971]: E0320 06:51:43.732251 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:43 crc kubenswrapper[4971]: E0320 06:51:43.731997 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:43 crc kubenswrapper[4971]: E0320 06:51:43.836888 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:51:44 crc kubenswrapper[4971]: I0320 06:51:44.731659 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:44 crc kubenswrapper[4971]: E0320 06:51:44.731931 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:44 crc kubenswrapper[4971]: I0320 06:51:44.742742 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 06:51:45 crc kubenswrapper[4971]: I0320 06:51:45.731586 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:45 crc kubenswrapper[4971]: E0320 06:51:45.731958 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:45 crc kubenswrapper[4971]: I0320 06:51:45.732023 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:45 crc kubenswrapper[4971]: I0320 06:51:45.732124 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:45 crc kubenswrapper[4971]: E0320 06:51:45.732210 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:45 crc kubenswrapper[4971]: E0320 06:51:45.732370 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:46 crc kubenswrapper[4971]: I0320 06:51:46.732099 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:46 crc kubenswrapper[4971]: E0320 06:51:46.732430 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:46 crc kubenswrapper[4971]: I0320 06:51:46.735032 4971 scope.go:117] "RemoveContainer" containerID="8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.217832 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.218260 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.218279 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.218307 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.218327 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:47Z","lastTransitionTime":"2026-03-20T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.235110 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.239995 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.240049 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.240062 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.240084 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.240102 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:47Z","lastTransitionTime":"2026-03-20T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.258661 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.268795 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.268849 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.268866 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.268887 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.268901 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:47Z","lastTransitionTime":"2026-03-20T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.282336 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.286572 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.286643 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.286657 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.286677 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.286690 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:47Z","lastTransitionTime":"2026-03-20T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.301370 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.306956 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.306985 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.306994 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.307012 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.307022 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:47Z","lastTransitionTime":"2026-03-20T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.321818 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.321962 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.515781 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/1.log" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.520148 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f"} Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.520907 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.545475 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.568549 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.588046 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.616863 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 
0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.633410 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc 
kubenswrapper[4971]: I0320 06:51:47.654835 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.688691 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.705543 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.725006 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.731293 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.731303 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.731404 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.731509 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.731689 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:47 crc kubenswrapper[4971]: E0320 06:51:47.731829 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.744832 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7934
26f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.778710 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.799838 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.819787 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.847198 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.864065 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.885504 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:47 crc kubenswrapper[4971]: I0320 06:51:47.908347 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:47Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.528338 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/2.log" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.529797 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/1.log" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.533335 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f" exitCode=1 Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.533386 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" 
event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f"} Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.533431 4971 scope.go:117] "RemoveContainer" containerID="8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.534449 4971 scope.go:117] "RemoveContainer" containerID="7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f" Mar 20 06:51:48 crc kubenswrapper[4971]: E0320 06:51:48.534721 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.567564 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.590092 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.611075 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.645314 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.663703 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.683022 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.699906 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.717589 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.732321 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:48 crc kubenswrapper[4971]: E0320 06:51:48.732850 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.739876 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127
fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.754929 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc 
kubenswrapper[4971]: I0320 06:51:48.777982 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.796237 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.817042 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: E0320 06:51:48.838145 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.844405 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.863811 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.888389 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.905982 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 
06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.922996 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.941172 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.960361 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:48 crc kubenswrapper[4971]: I0320 06:51:48.979117 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc 
kubenswrapper[4971]: I0320 06:51:49.001485 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:48Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.025221 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.053740 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6e
bffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.069432 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.102507 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.119386 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.124968 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " 
pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.125147 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.125243 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs podName:6ac992f1-bd05-471f-a101-cf14466e15e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:52:05.125219413 +0000 UTC m=+147.105093561 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs") pod "network-metrics-daemon-6vl68" (UID: "6ac992f1-bd05-471f-a101-cf14466e15e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.138207 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.161897 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.182539 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 
06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.218896 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c214154bbb3ce7bf59da4df92dc496f7b74c0d13a424767af2692cdef906998\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:31Z\\\",\\\"message\\\":\\\"4ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 06:51:31.421624 6909 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI0320 06:51:31.421649 6909 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 06:51:31.420261 6909 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.244232 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.265748 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.288400 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.530276 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.530519 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:52:21.530476927 +0000 UTC m=+163.510351105 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.530902 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.531032 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:52:21.531002841 +0000 UTC m=+163.510877009 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.530706 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.531458 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.531573 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.531675 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:52:21.531656939 +0000 UTC m=+163.511531117 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.531884 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.531952 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.532183 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.532246 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.532271 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.532331 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 06:52:21.532312996 +0000 UTC m=+163.512187164 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.532961 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.533006 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.533023 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.533082 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-20 06:52:21.533064996 +0000 UTC m=+163.512939164 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.542881 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/2.log" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.550389 4971 scope.go:117] "RemoveContainer" containerID="7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f" Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.551184 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.572158 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.598069 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.617165 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.637761 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.657961 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.675032 4971 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc 
kubenswrapper[4971]: I0320 06:51:49.712774 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.731643 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.731693 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.731877 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.731808 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.731949 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:49 crc kubenswrapper[4971]: E0320 06:51:49.732208 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.732499 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.754755 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.782394 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.803424 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.831518 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.856681 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.879504 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.899473 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.924549 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:49 crc kubenswrapper[4971]: I0320 06:51:49.958746 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:49Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:50 crc kubenswrapper[4971]: I0320 06:51:50.731390 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:50 crc kubenswrapper[4971]: E0320 06:51:50.731719 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:51 crc kubenswrapper[4971]: I0320 06:51:51.731601 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:51 crc kubenswrapper[4971]: I0320 06:51:51.731749 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:51 crc kubenswrapper[4971]: E0320 06:51:51.732965 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:51 crc kubenswrapper[4971]: E0320 06:51:51.733188 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:51 crc kubenswrapper[4971]: I0320 06:51:51.732133 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:51 crc kubenswrapper[4971]: E0320 06:51:51.733361 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:51 crc kubenswrapper[4971]: I0320 06:51:51.749456 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 06:51:52 crc kubenswrapper[4971]: I0320 06:51:52.731560 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:52 crc kubenswrapper[4971]: E0320 06:51:52.731858 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:53 crc kubenswrapper[4971]: I0320 06:51:53.732199 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:53 crc kubenswrapper[4971]: I0320 06:51:53.732211 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:53 crc kubenswrapper[4971]: I0320 06:51:53.732333 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:53 crc kubenswrapper[4971]: E0320 06:51:53.732784 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:53 crc kubenswrapper[4971]: E0320 06:51:53.732966 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:53 crc kubenswrapper[4971]: E0320 06:51:53.733201 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:53 crc kubenswrapper[4971]: E0320 06:51:53.839770 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:51:54 crc kubenswrapper[4971]: I0320 06:51:54.732273 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:54 crc kubenswrapper[4971]: E0320 06:51:54.732438 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:55 crc kubenswrapper[4971]: I0320 06:51:55.731238 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:55 crc kubenswrapper[4971]: I0320 06:51:55.731256 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:55 crc kubenswrapper[4971]: I0320 06:51:55.731261 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:55 crc kubenswrapper[4971]: E0320 06:51:55.731429 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:55 crc kubenswrapper[4971]: E0320 06:51:55.731813 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:55 crc kubenswrapper[4971]: E0320 06:51:55.731899 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:56 crc kubenswrapper[4971]: I0320 06:51:56.732461 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:56 crc kubenswrapper[4971]: E0320 06:51:56.732820 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.559458 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.559544 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.559562 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.559594 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.559658 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:57Z","lastTransitionTime":"2026-03-20T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.584887 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:57Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.591002 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.591063 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.591082 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.591114 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.591136 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:57Z","lastTransitionTime":"2026-03-20T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.616008 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:57Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.621483 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.621538 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.621551 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.621576 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.621592 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:57Z","lastTransitionTime":"2026-03-20T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.643419 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:57Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.649425 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.649504 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.649525 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.649555 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.649574 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:57Z","lastTransitionTime":"2026-03-20T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.671513 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:57Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.678410 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.678508 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.678571 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.678686 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.678755 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:51:57Z","lastTransitionTime":"2026-03-20T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.702854 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:57Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.703110 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.732029 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.732054 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:57 crc kubenswrapper[4971]: I0320 06:51:57.732167 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.732254 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.732392 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:57 crc kubenswrapper[4971]: E0320 06:51:57.732549 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.587355 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.609895 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.637562 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.654884 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.675692 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.695108 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.712460 4971 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc 
kubenswrapper[4971]: I0320 06:51:58.731885 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:51:58 crc kubenswrapper[4971]: E0320 06:51:58.732073 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.735289 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.756293 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.776460 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.802655 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.819689 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: E0320 06:51:58.840684 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.853843 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.875120 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a01577
9fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.893428 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.907386 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.925837 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.952983 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.975438 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:58 crc kubenswrapper[4971]: I0320 06:51:58.994336 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:58Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.011381 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.031125 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.064100 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.083458 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.107975 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.127780 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.143701 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.161238 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.181423 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.197769 4971 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc 
kubenswrapper[4971]: I0320 06:51:59.230220 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.245415 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.261136 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.274086 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.289063 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.312243 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a01577
9fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.324301 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:51:59Z is after 2025-08-24T17:21:41Z" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.732243 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.732289 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:51:59 crc kubenswrapper[4971]: I0320 06:51:59.732305 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:51:59 crc kubenswrapper[4971]: E0320 06:51:59.732421 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:51:59 crc kubenswrapper[4971]: E0320 06:51:59.732650 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:51:59 crc kubenswrapper[4971]: E0320 06:51:59.732780 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:00 crc kubenswrapper[4971]: I0320 06:52:00.731705 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:00 crc kubenswrapper[4971]: E0320 06:52:00.732026 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:00 crc kubenswrapper[4971]: I0320 06:52:00.733143 4971 scope.go:117] "RemoveContainer" containerID="7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f" Mar 20 06:52:00 crc kubenswrapper[4971]: E0320 06:52:00.733440 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:52:01 crc kubenswrapper[4971]: I0320 06:52:01.731988 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:01 crc kubenswrapper[4971]: I0320 06:52:01.732060 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:01 crc kubenswrapper[4971]: I0320 06:52:01.732108 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:01 crc kubenswrapper[4971]: E0320 06:52:01.732175 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:01 crc kubenswrapper[4971]: E0320 06:52:01.732261 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:01 crc kubenswrapper[4971]: E0320 06:52:01.732458 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:02 crc kubenswrapper[4971]: I0320 06:52:02.732133 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:02 crc kubenswrapper[4971]: E0320 06:52:02.732388 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:03 crc kubenswrapper[4971]: I0320 06:52:03.732169 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:03 crc kubenswrapper[4971]: I0320 06:52:03.732368 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:03 crc kubenswrapper[4971]: E0320 06:52:03.732573 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:03 crc kubenswrapper[4971]: E0320 06:52:03.732757 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:03 crc kubenswrapper[4971]: I0320 06:52:03.733509 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:03 crc kubenswrapper[4971]: E0320 06:52:03.733916 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:03 crc kubenswrapper[4971]: E0320 06:52:03.842461 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:04 crc kubenswrapper[4971]: I0320 06:52:04.732289 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:04 crc kubenswrapper[4971]: E0320 06:52:04.732549 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:05 crc kubenswrapper[4971]: I0320 06:52:05.213637 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:05 crc kubenswrapper[4971]: E0320 06:52:05.213783 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:52:05 crc kubenswrapper[4971]: E0320 06:52:05.213843 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs podName:6ac992f1-bd05-471f-a101-cf14466e15e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:52:37.213826729 +0000 UTC m=+179.193700857 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs") pod "network-metrics-daemon-6vl68" (UID: "6ac992f1-bd05-471f-a101-cf14466e15e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:52:05 crc kubenswrapper[4971]: I0320 06:52:05.731235 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:05 crc kubenswrapper[4971]: I0320 06:52:05.731235 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:05 crc kubenswrapper[4971]: E0320 06:52:05.731497 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:05 crc kubenswrapper[4971]: I0320 06:52:05.731571 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:05 crc kubenswrapper[4971]: E0320 06:52:05.731765 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:05 crc kubenswrapper[4971]: E0320 06:52:05.731889 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:06 crc kubenswrapper[4971]: I0320 06:52:06.732233 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:06 crc kubenswrapper[4971]: E0320 06:52:06.732393 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.642541 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/0.log" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.642624 4971 generic.go:334] "Generic (PLEG): container finished" podID="f11eaf57-f83a-4974-adb5-7a59b11555b0" containerID="93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123" exitCode=1 Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.642657 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lwlhn" event={"ID":"f11eaf57-f83a-4974-adb5-7a59b11555b0","Type":"ContainerDied","Data":"93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123"} Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.643149 4971 scope.go:117] "RemoveContainer" containerID="93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.658859 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.673476 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.694176 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.709579 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.723384 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.729260 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.729310 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.729327 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.729352 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.729369 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:07Z","lastTransitionTime":"2026-03-20T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.731818 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.731997 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.731852 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.732441 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.732748 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.733962 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.736287 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control
-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.744844 4971 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.747441 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc 
kubenswrapper[4971]: I0320 06:52:07.750513 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.750540 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.750552 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.750570 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.750582 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:07Z","lastTransitionTime":"2026-03-20T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.765847 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.770655 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.770729 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.770754 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.770786 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.770810 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:07Z","lastTransitionTime":"2026-03-20T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.771600 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.792834 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.792814 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.797986 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.798222 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.798357 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.798506 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.798680 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:07Z","lastTransitionTime":"2026-03-20T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.809704 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.818224 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.823190 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.823249 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.823269 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.823295 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.823313 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:07Z","lastTransitionTime":"2026-03-20T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.832174 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.840348 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: E0320 06:52:07.840515 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.846120 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.861131 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T0
6:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.876658 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.894781 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.907962 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.923821 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:06Z\\\",\\\"message\\\":\\\"2026-03-20T06:51:21+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9\\\\n2026-03-20T06:51:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9 to /host/opt/cni/bin/\\\\n2026-03-20T06:51:21Z [verbose] multus-daemon started\\\\n2026-03-20T06:51:21Z [verbose] Readiness Indicator file check\\\\n2026-03-20T06:52:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:07 crc kubenswrapper[4971]: I0320 06:52:07.947670 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:07Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.650169 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/0.log" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.650252 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lwlhn" event={"ID":"f11eaf57-f83a-4974-adb5-7a59b11555b0","Type":"ContainerStarted","Data":"b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec"} Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.674080 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.697715 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a01577
9fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.720254 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:06Z\\\",\\\"message\\\":\\\"2026-03-20T06:51:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9\\\\n2026-03-20T06:51:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9 to /host/opt/cni/bin/\\\\n2026-03-20T06:51:21Z [verbose] multus-daemon started\\\\n2026-03-20T06:51:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T06:52:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.732012 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:08 crc kubenswrapper[4971]: E0320 06:52:08.732308 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.756729 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.778328 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.794169 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.817058 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.834938 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: E0320 06:52:08.843634 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.861321 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1
a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.879345 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a4
2ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.898256 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc 
kubenswrapper[4971]: I0320 06:52:08.922846 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.946568 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.966909 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:08 crc kubenswrapper[4971]: I0320 06:52:08.990528 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:08Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.009065 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.042923 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.064378 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.089708 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a01577
9fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.115332 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.144007 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.166284 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.189277 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:06Z\\\",\\\"message\\\":\\\"2026-03-20T06:51:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9\\\\n2026-03-20T06:51:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9 to /host/opt/cni/bin/\\\\n2026-03-20T06:51:21Z [verbose] multus-daemon started\\\\n2026-03-20T06:51:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T06:52:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.224539 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.245929 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.267196 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.284696 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.296766 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.317105 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.335595 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.351001 4971 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc 
kubenswrapper[4971]: I0320 06:52:09.387832 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.407978 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.427545 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.450895 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.465598 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:09Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.731193 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.731464 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:09 crc kubenswrapper[4971]: I0320 06:52:09.731740 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:09 crc kubenswrapper[4971]: E0320 06:52:09.731774 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:09 crc kubenswrapper[4971]: E0320 06:52:09.731977 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:09 crc kubenswrapper[4971]: E0320 06:52:09.732278 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:10 crc kubenswrapper[4971]: I0320 06:52:10.732879 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:10 crc kubenswrapper[4971]: E0320 06:52:10.733251 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:10 crc kubenswrapper[4971]: I0320 06:52:10.747745 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 06:52:11 crc kubenswrapper[4971]: I0320 06:52:11.732230 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:11 crc kubenswrapper[4971]: E0320 06:52:11.732448 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:11 crc kubenswrapper[4971]: I0320 06:52:11.732907 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:11 crc kubenswrapper[4971]: I0320 06:52:11.733011 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:11 crc kubenswrapper[4971]: E0320 06:52:11.733080 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:11 crc kubenswrapper[4971]: E0320 06:52:11.733143 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:12 crc kubenswrapper[4971]: I0320 06:52:12.731817 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:12 crc kubenswrapper[4971]: E0320 06:52:12.732015 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:13 crc kubenswrapper[4971]: I0320 06:52:13.731850 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:13 crc kubenswrapper[4971]: I0320 06:52:13.731943 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:13 crc kubenswrapper[4971]: E0320 06:52:13.732045 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:13 crc kubenswrapper[4971]: I0320 06:52:13.732173 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:13 crc kubenswrapper[4971]: E0320 06:52:13.732379 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:13 crc kubenswrapper[4971]: E0320 06:52:13.732476 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:13 crc kubenswrapper[4971]: I0320 06:52:13.733774 4971 scope.go:117] "RemoveContainer" containerID="7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f" Mar 20 06:52:13 crc kubenswrapper[4971]: E0320 06:52:13.846579 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.679442 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/2.log" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.682328 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.682752 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.695942 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e18e6cb3-bf21-4f44-b97a-8e0a523ef45f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f039ce86df8b26f12e0b570ebc679159993f54933090b6f74961acad0bc6ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.728886 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a01577
9fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.731700 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:14 crc kubenswrapper[4971]: E0320 06:52:14.731882 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.746405 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c9
0e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.759944 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.772094 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.783558 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:06Z\\\",\\\"message\\\":\\\"2026-03-20T06:51:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9\\\\n2026-03-20T06:51:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9 to /host/opt/cni/bin/\\\\n2026-03-20T06:51:21Z [verbose] multus-daemon started\\\\n2026-03-20T06:51:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T06:52:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.802348 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.813642 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2
281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.825171 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127
fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.836982 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc 
kubenswrapper[4971]: I0320 06:52:14.851551 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.864656 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.877274 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.903189 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.918702 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.950029 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.970295 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:14 crc kubenswrapper[4971]: I0320 06:52:14.988448 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:52:14Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.010228 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.689942 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/3.log" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.690987 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/2.log" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.695069 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" exitCode=1 Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.695136 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.695288 4971 scope.go:117] "RemoveContainer" containerID="7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.695863 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 06:52:15 
crc kubenswrapper[4971]: E0320 06:52:15.696036 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.719867 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e18e6cb3-bf21-4f44-b97a-8e0a523ef45f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f039ce86df8b26f12e0b570ebc679159993f54933090b6f74961acad0bc6ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.731488 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.731530 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:15 crc kubenswrapper[4971]: E0320 06:52:15.731701 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:15 crc kubenswrapper[4971]: E0320 06:52:15.731844 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.732134 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:15 crc kubenswrapper[4971]: E0320 06:52:15.732275 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.744134 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.760309 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.776392 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.796880 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.817179 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:06Z\\\",\\\"message\\\":\\\"2026-03-20T06:51:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9\\\\n2026-03-20T06:51:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9 to /host/opt/cni/bin/\\\\n2026-03-20T06:51:21Z [verbose] multus-daemon started\\\\n2026-03-20T06:51:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T06:52:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.846864 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ec5a39eae08931433319b8734ef8180b21e3772bf8bcfce2e0599e9c229b23f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:51:47Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785226 7182 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:51:47.785730 7182 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 06:51:47.786508 7182 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:51:47.786535 7182 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:51:47.786584 7182 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:51:47.786594 7182 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:51:47.786669 7182 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:51:47.786681 7182 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:51:47.786720 7182 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:51:47.786731 7182 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:51:47.786774 7182 factory.go:656] Stopping watch factory\\\\nI0320 06:51:47.786802 7182 ovnkube.go:599] Stopped ovnkube\\\\nI0320 06:51:47.786803 7182 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:14Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:52:14.724829 7493 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:52:14.724889 7493 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 
06:52:14.724954 7493 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 06:52:14.724987 7493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:52:14.725001 7493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:52:14.725009 7493 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:52:14.725028 7493 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:52:14.725034 7493 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 06:52:14.725038 7493 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:52:14.725045 7493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:52:14.725081 7493 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:52:14.725123 7493 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:52:14.725131 7493 factory.go:656] Stopping watch factory\\\\nI0320 06:52:14.725167 7493 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:52:14.725404 7493 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.864694 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127
fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.876281 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc 
kubenswrapper[4971]: I0320 06:52:15.889516 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.911128 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.925802 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.938824 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.951512 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.975337 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:15 crc kubenswrapper[4971]: I0320 06:52:15.992950 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:15Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.010711 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.035079 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.052376 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.702158 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/3.log" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.707788 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 06:52:16 crc kubenswrapper[4971]: E0320 06:52:16.708156 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.724119 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.732168 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:16 crc kubenswrapper[4971]: E0320 06:52:16.732321 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.741712 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.765106 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:06Z\\\",\\\"message\\\":\\\"2026-03-20T06:51:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9\\\\n2026-03-20T06:51:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9 to /host/opt/cni/bin/\\\\n2026-03-20T06:51:21Z [verbose] multus-daemon started\\\\n2026-03-20T06:51:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T06:52:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.797769 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:14Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:52:14.724829 7493 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:52:14.724889 7493 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:52:14.724954 7493 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 06:52:14.724987 7493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:52:14.725001 7493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:52:14.725009 7493 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:52:14.725028 7493 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:52:14.725034 7493 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 06:52:14.725038 7493 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:52:14.725045 7493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:52:14.725081 7493 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:52:14.725123 7493 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:52:14.725131 7493 factory.go:656] Stopping watch factory\\\\nI0320 06:52:14.725167 7493 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:52:14.725404 7493 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:52:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.822507 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.847143 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.869500 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.886210 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.900431 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.915781 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.930390 4971 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc 
kubenswrapper[4971]: I0320 06:52:16.951530 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.965244 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.977548 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:16 crc kubenswrapper[4971]: I0320 06:52:16.993957 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:16Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.011341 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:17Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.028057 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e18e6cb3-bf21-4f44-b97a-8e0a523ef45f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f039ce86df8b26f12e0b570ebc679159993f54933090b6f74961acad0bc6ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:17Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.049426 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a01577
9fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:17Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.068577 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:17Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.731667 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.731674 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.731889 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:17 crc kubenswrapper[4971]: E0320 06:52:17.732077 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:17 crc kubenswrapper[4971]: E0320 06:52:17.732330 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:17 crc kubenswrapper[4971]: E0320 06:52:17.732541 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.875259 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.875303 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.875319 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.875343 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.875359 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:17Z","lastTransitionTime":"2026-03-20T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:17 crc kubenswrapper[4971]: E0320 06:52:17.896635 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:17Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.903362 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.903439 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.903459 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.903489 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.903507 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:17Z","lastTransitionTime":"2026-03-20T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:17 crc kubenswrapper[4971]: E0320 06:52:17.925225 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:17Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.931731 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.931873 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.931896 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.931921 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.931941 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:17Z","lastTransitionTime":"2026-03-20T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:17 crc kubenswrapper[4971]: E0320 06:52:17.952057 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:17Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.958381 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.958440 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.958457 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.958479 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.958494 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:17Z","lastTransitionTime":"2026-03-20T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:17 crc kubenswrapper[4971]: E0320 06:52:17.979509 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:17Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.985563 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.985620 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.985634 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.985654 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:17 crc kubenswrapper[4971]: I0320 06:52:17.985667 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:17Z","lastTransitionTime":"2026-03-20T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:18 crc kubenswrapper[4971]: E0320 06:52:18.007200 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: E0320 06:52:18.007376 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.732320 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:18 crc kubenswrapper[4971]: E0320 06:52:18.733281 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.752708 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.777075 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.796629 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.816927 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.834059 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: E0320 06:52:18.847423 4971 kubelet.go:2916] 
"Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.850164 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc 
kubenswrapper[4971]: I0320 06:52:18.872416 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.895067 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.914428 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.940041 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.957199 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f22405759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:18 crc kubenswrapper[4971]: I0320 06:52:18.992753 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:18Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.012178 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.035634 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.050216 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e18e6cb3-bf21-4f44-b97a-8e0a523ef45f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f039ce86df8b26f12e0b570ebc679159993f54933090b6f74961acad0bc6ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.063652 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.078029 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:06Z\\\",\\\"message\\\":\\\"2026-03-20T06:51:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9\\\\n2026-03-20T06:51:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9 to /host/opt/cni/bin/\\\\n2026-03-20T06:51:21Z [verbose] multus-daemon started\\\\n2026-03-20T06:51:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T06:52:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.107185 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:14Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:52:14.724829 7493 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:52:14.724889 7493 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:52:14.724954 7493 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 06:52:14.724987 7493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:52:14.725001 7493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:52:14.725009 7493 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:52:14.725028 7493 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:52:14.725034 7493 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 06:52:14.725038 7493 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:52:14.725045 7493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:52:14.725081 7493 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:52:14.725123 7493 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:52:14.725131 7493 factory.go:656] Stopping watch factory\\\\nI0320 06:52:14.725167 7493 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:52:14.725404 7493 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:52:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.122497 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:19Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.732132 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.732195 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:19 crc kubenswrapper[4971]: I0320 06:52:19.732258 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:19 crc kubenswrapper[4971]: E0320 06:52:19.732289 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:19 crc kubenswrapper[4971]: E0320 06:52:19.732516 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:19 crc kubenswrapper[4971]: E0320 06:52:19.732783 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:20 crc kubenswrapper[4971]: I0320 06:52:20.731945 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:20 crc kubenswrapper[4971]: E0320 06:52:20.732209 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:21 crc kubenswrapper[4971]: I0320 06:52:21.540192 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.540434 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 06:53:25.540403009 +0000 UTC m=+227.520277177 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:52:21 crc kubenswrapper[4971]: I0320 06:52:21.540565 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:21 crc kubenswrapper[4971]: I0320 06:52:21.540669 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:21 crc kubenswrapper[4971]: I0320 06:52:21.540701 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:21 crc kubenswrapper[4971]: I0320 06:52:21.540739 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.540779 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.540867 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.540845982 +0000 UTC m=+227.520720150 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.540875 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.540898 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.540941 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" 
failed. No retries permitted until 2026-03-20 06:53:25.540923914 +0000 UTC m=+227.520798082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.540941 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.540975 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.541013 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.541074 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.541096 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:52:21 crc 
kubenswrapper[4971]: E0320 06:52:21.541043 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.541024597 +0000 UTC m=+227.520898765 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.541195 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.541160741 +0000 UTC m=+227.521034919 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 06:52:21 crc kubenswrapper[4971]: I0320 06:52:21.731450 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:21 crc kubenswrapper[4971]: I0320 06:52:21.731455 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:21 crc kubenswrapper[4971]: I0320 06:52:21.731822 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.731665 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.731962 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:21 crc kubenswrapper[4971]: E0320 06:52:21.732108 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:22 crc kubenswrapper[4971]: I0320 06:52:22.732146 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:22 crc kubenswrapper[4971]: E0320 06:52:22.732321 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:23 crc kubenswrapper[4971]: I0320 06:52:23.731303 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:23 crc kubenswrapper[4971]: I0320 06:52:23.731395 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:23 crc kubenswrapper[4971]: E0320 06:52:23.731461 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:23 crc kubenswrapper[4971]: I0320 06:52:23.731498 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:23 crc kubenswrapper[4971]: E0320 06:52:23.731687 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:23 crc kubenswrapper[4971]: E0320 06:52:23.731801 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:23 crc kubenswrapper[4971]: E0320 06:52:23.849459 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:24 crc kubenswrapper[4971]: I0320 06:52:24.731964 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:24 crc kubenswrapper[4971]: E0320 06:52:24.732872 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:25 crc kubenswrapper[4971]: I0320 06:52:25.732374 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:25 crc kubenswrapper[4971]: I0320 06:52:25.732397 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:25 crc kubenswrapper[4971]: I0320 06:52:25.732466 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:25 crc kubenswrapper[4971]: E0320 06:52:25.732883 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:25 crc kubenswrapper[4971]: E0320 06:52:25.732999 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:25 crc kubenswrapper[4971]: E0320 06:52:25.733126 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:26 crc kubenswrapper[4971]: I0320 06:52:26.731865 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:26 crc kubenswrapper[4971]: E0320 06:52:26.732138 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:27 crc kubenswrapper[4971]: I0320 06:52:27.766922 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:27 crc kubenswrapper[4971]: I0320 06:52:27.766929 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:27 crc kubenswrapper[4971]: E0320 06:52:27.767179 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:27 crc kubenswrapper[4971]: I0320 06:52:27.767273 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:27 crc kubenswrapper[4971]: E0320 06:52:27.767439 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:27 crc kubenswrapper[4971]: E0320 06:52:27.767698 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.246554 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.246693 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.246756 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.246796 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.246819 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:28Z","lastTransitionTime":"2026-03-20T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:28 crc kubenswrapper[4971]: E0320 06:52:28.265808 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.272302 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.272360 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.272379 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.272408 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.272433 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:28Z","lastTransitionTime":"2026-03-20T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:28 crc kubenswrapper[4971]: E0320 06:52:28.293050 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.298719 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.298813 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.298830 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.298857 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.298873 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:28Z","lastTransitionTime":"2026-03-20T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:28 crc kubenswrapper[4971]: E0320 06:52:28.317258 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.325985 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.326033 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.326047 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.326070 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.326083 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:28Z","lastTransitionTime":"2026-03-20T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:28 crc kubenswrapper[4971]: E0320 06:52:28.345561 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.351006 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.351044 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.351054 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.351110 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.351122 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:28Z","lastTransitionTime":"2026-03-20T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 06:52:28 crc kubenswrapper[4971]: E0320 06:52:28.369538 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24b0987a-bded-4b95-97d3-cecbc47baa01\\\",\\\"systemUUID\\\":\\\"e08276c0-e3e4-4e35-ad9f-5ef530b85d12\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: E0320 06:52:28.369731 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.732223 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:28 crc kubenswrapper[4971]: E0320 06:52:28.732470 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.775456 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6eccb52-4f65-4e1f-b9c9-4bece549a815\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a82f33c71d1353dd841a49c673ff55387468152e3c7aa9a935173f9de78dcd1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c624e715179e2699ff4d282ae4bd733ca1bc71ac21247b97f8585f0c6313460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://becef23cbdd3d98ab9e3aa793aee4fe00b06c86505d5717db0739eb21e344b99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16665116a21b8abffb23402c6362e9cd5110741ce22065c5e50f200b179a327c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://105120bce25dc17ac09a04c3b0703e3f07e3f60495ddc67ca4d77004384a6fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bbf083c6865e2cf783db52166d86a2450f28e787ba0c8fac04d8e9b9ce52c0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1f3eabbb1d6cb5e6381fc6429ee1548458658bc7c629ff228ce34190f63e5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f3103f5713f3c864387633b57b6161626e2a7810d46c127ce203740056ba465\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:42Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.834096 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"769cbbc2-99d8-4602-af5e-d67be4ea42c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f520d26409a32e66fab0a111d40b27ef3e315b1c69687d842535e28f5be7cba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37dc61a04694f84965d4677bc1de4f78175f69c5d491dd6e75c3e5bbcffefc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a8e9788928dff4b3f4984ced0d9fd89424ff1b3932eab286bef120536b58fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5d625bf5d3c5e189cc09b066dcf7488a12513f542970ad31f968c3b97b40946f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: E0320 06:52:28.850069 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.860847 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae3df870db76866893b6eb3f04127ac3feec63a351cfd97e96960dddd119c79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.884752 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be6e520-cae7-413a-bd91-de5f2de7f0b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6fda90e6a1a5bb49163607f9d9547737db1eaaaea047b43e386e5a24cc2eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e91db1ca80cf9ac3f855eff1e992d29898d0b97f66446eecc63ecf5ac81a515\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3c1276318d823fbe11273d0b7a3dbebc13465681ba0c24a4b59a8fbf355fc64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2819732daac7f2e2c79c630ee0a96c5e8d01a762cdf65a83e2baf5bad72628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ec6ebffc7048694a1a324f1569343353e81d494bd19d04dd579073299d82eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c24b027e991a36a6d80660e3aaf9d436d5f07b486acd239cadd2506e854fb4f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://295001469b4485d0ef107707d2affb5c6a653b55aaa04545172607c9c9915dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpwlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qmgrn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.897792 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kfnsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d21aee3-3c87-4b06-9a9b-f88d3acb6d3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ac2a0f2240
5759bc85f5dcfbecdd34e7568f53c6d67b5c0fe9d132e2a359b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7d788\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kfnsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.914725 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e18e6cb3-bf21-4f44-b97a-8e0a523ef45f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f039ce86df8b26f12e0b570ebc679159993f54933090b6f74961acad0bc6ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4ee4a5d4b945414c51474c35ab69b28ed5f455e26d29cd5b16ad428ccbbf22a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.933944 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3f656dd-4e82-4469-9bd4-f02922f2649c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:49Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 06:50:49.359098 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 06:50:49.359207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 06:50:49.359766 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3550867493/tls.crt::/tmp/serving-cert-3550867493/tls.key\\\\\\\"\\\\nI0320 06:50:49.512575 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 06:50:49.515179 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 06:50:49.515195 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 06:50:49.515218 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 06:50:49.515223 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 06:50:49.521761 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 06:50:49.521810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521822 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 06:50:49.521832 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 06:50:49.521839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 06:50:49.521858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 06:50:49.521866 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 06:50:49.521769 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0320 06:50:49.523950 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:50:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55125c618002bc3423f7516449a01577
9fa48330512df552546282377b058dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:49:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.949424 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b2683d138d302cd71c3b08e7fa14d434a288810b0bcb2dd30a6ed778e9288f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55dbb5b36c90e374f7b5583a7981e39d07f6154e268299d9bc22781e5be7250c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.966224 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9cccf9478c15d0ef6ac9a69d23a13e83c06968969d355cfe2fde1db3b40804f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:28 crc kubenswrapper[4971]: I0320 06:52:28.985182 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:28Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.005148 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lwlhn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11eaf57-f83a-4974-adb5-7a59b11555b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:06Z\\\",\\\"message\\\":\\\"2026-03-20T06:51:21+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9\\\\n2026-03-20T06:51:21+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4d3cf008-dd50-43b5-9c13-e6eaedf330d9 to /host/opt/cni/bin/\\\\n2026-03-20T06:51:21Z [verbose] multus-daemon started\\\\n2026-03-20T06:51:21Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T06:52:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2jcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lwlhn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.025268 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f624a08-810b-4915-b3bf-1e19d3e6cace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T06:52:14Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 06:52:14.724829 7493 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 06:52:14.724889 7493 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 06:52:14.724954 7493 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 06:52:14.724987 7493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 06:52:14.725001 7493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 06:52:14.725009 7493 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 06:52:14.725028 7493 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 06:52:14.725034 7493 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 06:52:14.725038 7493 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 06:52:14.725045 7493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 06:52:14.725081 7493 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 06:52:14.725123 7493 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 06:52:14.725131 7493 factory.go:656] Stopping watch factory\\\\nI0320 06:52:14.725167 7493 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 06:52:14.725404 7493 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:52:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d1acf1903daa267c3
666101da094f18b60f477c6c665a463073ed19b243f4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T06:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7vn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fhlwk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.037204 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f219b30c-9597-4a2c-a6d4-33cd07c2ff11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab11f01560755e3638bfa9faf7d3a92738e066ab765f7ca1245515244805ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa886347017baa18f89bad2475ac0d5b6127
fd3c292e34becc72e9a1afba2ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zb2ss\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j57qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.048292 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6vl68" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac992f1-bd05-471f-a101-cf14466e15e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cql2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6vl68\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc 
kubenswrapper[4971]: I0320 06:52:29.060118 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66d79e-43e8-4160-ad35-5364bb664ab3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9585aeb7e25f5126d0ef599cd585ee78a0b547cd3bb388257855c4482621229f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7e2094376c5a4e7ab64d5f5d5cf8f2345b48b8cc9c0debb63d2cce6cf9524\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T06:50:05Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 06:49:40.859108 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 06:49:40.862084 1 observer_polling.go:159] Starting file observer\\\\nI0320 06:49:40.896904 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 06:49:40.901430 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 06:50:05.286315 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 06:50:05.286429 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:50:05Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:50:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465f395d4a53da4db1294d226cb9df79fa577b5e791cdadfe55758e760fa3b1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f2b249d64a0733988fd3aceda10266127b4496b00e4240553c37eaf2f0ddc5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:49:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.077777 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.089632 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.104296 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jl74k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3068ae0f-0404-452c-b0e0-c422d898e6b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad3a055d918bddfc385b5ad7adfb0e47347ef1f86fe991b0bb93c58d39e4436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmftj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jl74k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.120519 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c96bfbc-3a6a-44df-9bf6-9f78c587657c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1a4307986ec73b6299d1dbba6bfd0efb557578fd13619a0b6a81e5e6a31f7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwzk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T06:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m26zl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T06:52:29Z is after 2025-08-24T17:21:41Z" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.731639 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:29 crc kubenswrapper[4971]: E0320 06:52:29.731822 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.731945 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.732035 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:29 crc kubenswrapper[4971]: E0320 06:52:29.732454 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:29 crc kubenswrapper[4971]: E0320 06:52:29.732598 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:29 crc kubenswrapper[4971]: I0320 06:52:29.733275 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 06:52:29 crc kubenswrapper[4971]: E0320 06:52:29.733594 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:52:30 crc kubenswrapper[4971]: I0320 06:52:30.731822 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:30 crc kubenswrapper[4971]: E0320 06:52:30.732122 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:31 crc kubenswrapper[4971]: I0320 06:52:31.731435 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:31 crc kubenswrapper[4971]: I0320 06:52:31.731455 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:31 crc kubenswrapper[4971]: I0320 06:52:31.731853 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:31 crc kubenswrapper[4971]: E0320 06:52:31.732092 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:31 crc kubenswrapper[4971]: E0320 06:52:31.732216 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:31 crc kubenswrapper[4971]: E0320 06:52:31.732337 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:32 crc kubenswrapper[4971]: I0320 06:52:32.731414 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:32 crc kubenswrapper[4971]: E0320 06:52:32.731725 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:33 crc kubenswrapper[4971]: I0320 06:52:33.732140 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:33 crc kubenswrapper[4971]: I0320 06:52:33.732159 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:33 crc kubenswrapper[4971]: E0320 06:52:33.732412 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:33 crc kubenswrapper[4971]: E0320 06:52:33.732527 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:33 crc kubenswrapper[4971]: I0320 06:52:33.733476 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:33 crc kubenswrapper[4971]: E0320 06:52:33.733895 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:33 crc kubenswrapper[4971]: E0320 06:52:33.854971 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:34 crc kubenswrapper[4971]: I0320 06:52:34.731669 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:34 crc kubenswrapper[4971]: E0320 06:52:34.731991 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:35 crc kubenswrapper[4971]: I0320 06:52:35.731786 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:35 crc kubenswrapper[4971]: I0320 06:52:35.731867 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:35 crc kubenswrapper[4971]: I0320 06:52:35.731943 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:35 crc kubenswrapper[4971]: E0320 06:52:35.732061 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:35 crc kubenswrapper[4971]: E0320 06:52:35.732214 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:35 crc kubenswrapper[4971]: E0320 06:52:35.732553 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:36 crc kubenswrapper[4971]: I0320 06:52:36.731947 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:36 crc kubenswrapper[4971]: E0320 06:52:36.732288 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:37 crc kubenswrapper[4971]: I0320 06:52:37.218856 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:37 crc kubenswrapper[4971]: E0320 06:52:37.219208 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:52:37 crc kubenswrapper[4971]: E0320 06:52:37.219376 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs podName:6ac992f1-bd05-471f-a101-cf14466e15e8 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:41.219333557 +0000 UTC m=+243.199207865 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs") pod "network-metrics-daemon-6vl68" (UID: "6ac992f1-bd05-471f-a101-cf14466e15e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 06:52:37 crc kubenswrapper[4971]: I0320 06:52:37.732218 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:37 crc kubenswrapper[4971]: I0320 06:52:37.732361 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:37 crc kubenswrapper[4971]: I0320 06:52:37.732218 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:37 crc kubenswrapper[4971]: E0320 06:52:37.732722 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:37 crc kubenswrapper[4971]: E0320 06:52:37.732978 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:37 crc kubenswrapper[4971]: E0320 06:52:37.733077 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.598458 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.598537 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.598556 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.598585 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.598682 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T06:52:38Z","lastTransitionTime":"2026-03-20T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.678837 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894"] Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.679434 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.682468 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.682966 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.683875 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.685342 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.731662 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:38 crc kubenswrapper[4971]: E0320 06:52:38.732151 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.738243 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.738215493 podStartE2EDuration="1m16.738215493s" podCreationTimestamp="2026-03-20 06:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:38.736920716 +0000 UTC m=+180.716794894" watchObservedRunningTime="2026-03-20 06:52:38.738215493 +0000 UTC m=+180.718089671" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.740502 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9842b50a-c4bd-4cdf-91a0-59bec0448f93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.740636 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9842b50a-c4bd-4cdf-91a0-59bec0448f93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.740678 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9842b50a-c4bd-4cdf-91a0-59bec0448f93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.740854 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9842b50a-c4bd-4cdf-91a0-59bec0448f93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.740923 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9842b50a-c4bd-4cdf-91a0-59bec0448f93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.763914 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.763827178 podStartE2EDuration="54.763827178s" podCreationTimestamp="2026-03-20 06:51:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:38.762338737 +0000 UTC m=+180.742212875" watchObservedRunningTime="2026-03-20 06:52:38.763827178 +0000 UTC m=+180.743701346" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.770939 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.782848 4971 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.820002 4971 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qmgrn" podStartSLOduration=127.819913936 podStartE2EDuration="2m7.819913936s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:38.818677532 +0000 UTC m=+180.798551710" watchObservedRunningTime="2026-03-20 06:52:38.819913936 +0000 UTC m=+180.799788124" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.842700 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9842b50a-c4bd-4cdf-91a0-59bec0448f93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.842876 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9842b50a-c4bd-4cdf-91a0-59bec0448f93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.842950 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9842b50a-c4bd-4cdf-91a0-59bec0448f93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.843000 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9842b50a-c4bd-4cdf-91a0-59bec0448f93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.843056 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9842b50a-c4bd-4cdf-91a0-59bec0448f93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.843174 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9842b50a-c4bd-4cdf-91a0-59bec0448f93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.844149 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9842b50a-c4bd-4cdf-91a0-59bec0448f93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.845334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9842b50a-c4bd-4cdf-91a0-59bec0448f93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: 
I0320 06:52:38.857123 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9842b50a-c4bd-4cdf-91a0-59bec0448f93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: E0320 06:52:38.863290 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.865448 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kfnsf" podStartSLOduration=127.865423258 podStartE2EDuration="2m7.865423258s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:38.839855894 +0000 UTC m=+180.819730032" watchObservedRunningTime="2026-03-20 06:52:38.865423258 +0000 UTC m=+180.845297436" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.865739 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=28.865729487 podStartE2EDuration="28.865729487s" podCreationTimestamp="2026-03-20 06:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:38.865183231 +0000 UTC m=+180.845057429" watchObservedRunningTime="2026-03-20 06:52:38.865729487 +0000 UTC m=+180.845603665" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.874951 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9842b50a-c4bd-4cdf-91a0-59bec0448f93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nc894\" (UID: \"9842b50a-c4bd-4cdf-91a0-59bec0448f93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:38 crc kubenswrapper[4971]: I0320 06:52:38.918110 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.91807758 podStartE2EDuration="1m13.91807758s" podCreationTimestamp="2026-03-20 06:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:38.895074147 +0000 UTC m=+180.874948365" watchObservedRunningTime="2026-03-20 06:52:38.91807758 +0000 UTC m=+180.897951748" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.009385 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.023433 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lwlhn" podStartSLOduration=128.023402744 podStartE2EDuration="2m8.023402744s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:38.991285516 +0000 UTC m=+180.971159654" watchObservedRunningTime="2026-03-20 06:52:39.023402744 +0000 UTC m=+181.003276892" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.055815 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=48.055786059 podStartE2EDuration="48.055786059s" podCreationTimestamp="2026-03-20 06:51:51 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:39.054708419 +0000 UTC m=+181.034582567" watchObservedRunningTime="2026-03-20 06:52:39.055786059 +0000 UTC m=+181.035660207" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.129730 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jl74k" podStartSLOduration=128.129699375 podStartE2EDuration="2m8.129699375s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:39.115507899 +0000 UTC m=+181.095382037" watchObservedRunningTime="2026-03-20 06:52:39.129699375 +0000 UTC m=+181.109573563" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.131201 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podStartSLOduration=128.131192517 podStartE2EDuration="2m8.131192517s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:39.130347973 +0000 UTC m=+181.110222111" watchObservedRunningTime="2026-03-20 06:52:39.131192517 +0000 UTC m=+181.111066695" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.147151 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j57qn" podStartSLOduration=127.147121302 podStartE2EDuration="2m7.147121302s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:39.146005471 +0000 UTC m=+181.125879609" watchObservedRunningTime="2026-03-20 06:52:39.147121302 +0000 UTC m=+181.126995470" Mar 
20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.731735 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.731739 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.731735 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:39 crc kubenswrapper[4971]: E0320 06:52:39.731948 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:39 crc kubenswrapper[4971]: E0320 06:52:39.732337 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:39 crc kubenswrapper[4971]: E0320 06:52:39.732546 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.822149 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" event={"ID":"9842b50a-c4bd-4cdf-91a0-59bec0448f93","Type":"ContainerStarted","Data":"378de5fc0d7adf46c4e05d27af3f61308546aefec9c42e8d4af8e2373115cfc9"} Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.822241 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" event={"ID":"9842b50a-c4bd-4cdf-91a0-59bec0448f93","Type":"ContainerStarted","Data":"9e5a57d7b060a5915281eb4c54b71127bb59ac9f602d964f51325d222e0a4081"} Mar 20 06:52:39 crc kubenswrapper[4971]: I0320 06:52:39.847061 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nc894" podStartSLOduration=128.847037816 podStartE2EDuration="2m8.847037816s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:52:39.846077099 +0000 UTC m=+181.825951267" watchObservedRunningTime="2026-03-20 06:52:39.847037816 +0000 UTC m=+181.826911994" Mar 20 06:52:40 crc kubenswrapper[4971]: I0320 06:52:40.732336 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:40 crc kubenswrapper[4971]: E0320 06:52:40.732566 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:40 crc kubenswrapper[4971]: I0320 06:52:40.733679 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 06:52:40 crc kubenswrapper[4971]: E0320 06:52:40.733975 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:52:41 crc kubenswrapper[4971]: I0320 06:52:41.732130 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:41 crc kubenswrapper[4971]: I0320 06:52:41.732235 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:41 crc kubenswrapper[4971]: E0320 06:52:41.732316 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:41 crc kubenswrapper[4971]: I0320 06:52:41.732235 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:41 crc kubenswrapper[4971]: E0320 06:52:41.732445 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:41 crc kubenswrapper[4971]: E0320 06:52:41.732731 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:42 crc kubenswrapper[4971]: I0320 06:52:42.731795 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:42 crc kubenswrapper[4971]: E0320 06:52:42.732069 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:43 crc kubenswrapper[4971]: I0320 06:52:43.731709 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:43 crc kubenswrapper[4971]: I0320 06:52:43.731707 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:43 crc kubenswrapper[4971]: I0320 06:52:43.731952 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:43 crc kubenswrapper[4971]: E0320 06:52:43.732030 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:43 crc kubenswrapper[4971]: E0320 06:52:43.732168 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:43 crc kubenswrapper[4971]: E0320 06:52:43.732306 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:43 crc kubenswrapper[4971]: E0320 06:52:43.864867 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:44 crc kubenswrapper[4971]: I0320 06:52:44.731297 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:44 crc kubenswrapper[4971]: E0320 06:52:44.731497 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:45 crc kubenswrapper[4971]: I0320 06:52:45.732104 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:45 crc kubenswrapper[4971]: I0320 06:52:45.732230 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:45 crc kubenswrapper[4971]: E0320 06:52:45.732308 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:45 crc kubenswrapper[4971]: E0320 06:52:45.732760 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:45 crc kubenswrapper[4971]: I0320 06:52:45.732899 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:45 crc kubenswrapper[4971]: E0320 06:52:45.733039 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:46 crc kubenswrapper[4971]: I0320 06:52:46.731800 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:46 crc kubenswrapper[4971]: E0320 06:52:46.732123 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:47 crc kubenswrapper[4971]: I0320 06:52:47.732133 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:47 crc kubenswrapper[4971]: I0320 06:52:47.732208 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:47 crc kubenswrapper[4971]: I0320 06:52:47.732215 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:47 crc kubenswrapper[4971]: E0320 06:52:47.732382 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:47 crc kubenswrapper[4971]: E0320 06:52:47.732566 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:47 crc kubenswrapper[4971]: E0320 06:52:47.732702 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:48 crc kubenswrapper[4971]: I0320 06:52:48.732833 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:48 crc kubenswrapper[4971]: E0320 06:52:48.733057 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:48 crc kubenswrapper[4971]: E0320 06:52:48.866090 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 06:52:49 crc kubenswrapper[4971]: I0320 06:52:49.731783 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:49 crc kubenswrapper[4971]: I0320 06:52:49.731886 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:49 crc kubenswrapper[4971]: I0320 06:52:49.731896 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:49 crc kubenswrapper[4971]: E0320 06:52:49.732447 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:49 crc kubenswrapper[4971]: E0320 06:52:49.732581 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:49 crc kubenswrapper[4971]: E0320 06:52:49.732968 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:50 crc kubenswrapper[4971]: I0320 06:52:50.731878 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:50 crc kubenswrapper[4971]: E0320 06:52:50.732177 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:51 crc kubenswrapper[4971]: I0320 06:52:51.731855 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:51 crc kubenswrapper[4971]: I0320 06:52:51.731972 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:51 crc kubenswrapper[4971]: E0320 06:52:51.732074 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:51 crc kubenswrapper[4971]: I0320 06:52:51.732112 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:51 crc kubenswrapper[4971]: E0320 06:52:51.732335 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:51 crc kubenswrapper[4971]: E0320 06:52:51.732871 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:52 crc kubenswrapper[4971]: I0320 06:52:52.731362 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:52:52 crc kubenswrapper[4971]: E0320 06:52:52.731672 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8" Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.732179 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:52:53 crc kubenswrapper[4971]: E0320 06:52:53.732440 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.732782 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.732847 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:52:53 crc kubenswrapper[4971]: E0320 06:52:53.733060 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 06:52:53 crc kubenswrapper[4971]: E0320 06:52:53.733189 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.734652 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 06:52:53 crc kubenswrapper[4971]: E0320 06:52:53.734923 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fhlwk_openshift-ovn-kubernetes(7f624a08-810b-4915-b3bf-1e19d3e6cace)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" Mar 20 06:52:53 crc kubenswrapper[4971]: E0320 06:52:53.867671 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.883031 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/1.log"
Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.884094 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/0.log"
Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.884178 4971 generic.go:334] "Generic (PLEG): container finished" podID="f11eaf57-f83a-4974-adb5-7a59b11555b0" containerID="b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec" exitCode=1
Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.884223 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lwlhn" event={"ID":"f11eaf57-f83a-4974-adb5-7a59b11555b0","Type":"ContainerDied","Data":"b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec"}
Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.884279 4971 scope.go:117] "RemoveContainer" containerID="93f25aa49f9048f5f935b94dc408da72227675c36d75b9614bb59743769df123"
Mar 20 06:52:53 crc kubenswrapper[4971]: I0320 06:52:53.885062 4971 scope.go:117] "RemoveContainer" containerID="b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec"
Mar 20 06:52:53 crc kubenswrapper[4971]: E0320 06:52:53.885481 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-lwlhn_openshift-multus(f11eaf57-f83a-4974-adb5-7a59b11555b0)\"" pod="openshift-multus/multus-lwlhn" podUID="f11eaf57-f83a-4974-adb5-7a59b11555b0"
Mar 20 06:52:54 crc kubenswrapper[4971]: I0320 06:52:54.731809 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:52:54 crc kubenswrapper[4971]: E0320 06:52:54.732335 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:52:54 crc kubenswrapper[4971]: I0320 06:52:54.891479 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/1.log"
Mar 20 06:52:55 crc kubenswrapper[4971]: I0320 06:52:55.731635 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:52:55 crc kubenswrapper[4971]: I0320 06:52:55.731743 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:52:55 crc kubenswrapper[4971]: E0320 06:52:55.731801 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:52:55 crc kubenswrapper[4971]: I0320 06:52:55.731827 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:52:55 crc kubenswrapper[4971]: E0320 06:52:55.731919 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:52:55 crc kubenswrapper[4971]: E0320 06:52:55.732035 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:52:56 crc kubenswrapper[4971]: I0320 06:52:56.731559 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:52:56 crc kubenswrapper[4971]: E0320 06:52:56.731905 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:52:57 crc kubenswrapper[4971]: I0320 06:52:57.731755 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:52:57 crc kubenswrapper[4971]: I0320 06:52:57.731754 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:52:57 crc kubenswrapper[4971]: E0320 06:52:57.732108 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:52:57 crc kubenswrapper[4971]: E0320 06:52:57.731952 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:52:57 crc kubenswrapper[4971]: I0320 06:52:57.731783 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:52:57 crc kubenswrapper[4971]: E0320 06:52:57.732245 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:52:58 crc kubenswrapper[4971]: I0320 06:52:58.731959 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:52:58 crc kubenswrapper[4971]: E0320 06:52:58.734411 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:52:58 crc kubenswrapper[4971]: E0320 06:52:58.868790 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 06:52:59 crc kubenswrapper[4971]: I0320 06:52:59.731657 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:52:59 crc kubenswrapper[4971]: I0320 06:52:59.731726 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:52:59 crc kubenswrapper[4971]: E0320 06:52:59.731873 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:52:59 crc kubenswrapper[4971]: I0320 06:52:59.732203 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:52:59 crc kubenswrapper[4971]: E0320 06:52:59.732299 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:52:59 crc kubenswrapper[4971]: E0320 06:52:59.732531 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:53:00 crc kubenswrapper[4971]: I0320 06:53:00.732063 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:00 crc kubenswrapper[4971]: E0320 06:53:00.732268 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:53:01 crc kubenswrapper[4971]: I0320 06:53:01.732133 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:53:01 crc kubenswrapper[4971]: I0320 06:53:01.732222 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:53:01 crc kubenswrapper[4971]: I0320 06:53:01.732265 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:53:01 crc kubenswrapper[4971]: E0320 06:53:01.732400 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:53:01 crc kubenswrapper[4971]: E0320 06:53:01.732583 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:53:01 crc kubenswrapper[4971]: E0320 06:53:01.732753 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:53:02 crc kubenswrapper[4971]: I0320 06:53:02.731913 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:02 crc kubenswrapper[4971]: E0320 06:53:02.732173 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:53:03 crc kubenswrapper[4971]: I0320 06:53:03.731248 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:53:03 crc kubenswrapper[4971]: I0320 06:53:03.731287 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:53:03 crc kubenswrapper[4971]: I0320 06:53:03.731302 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:53:03 crc kubenswrapper[4971]: E0320 06:53:03.731462 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:53:03 crc kubenswrapper[4971]: E0320 06:53:03.731723 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:53:03 crc kubenswrapper[4971]: E0320 06:53:03.731924 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:53:03 crc kubenswrapper[4971]: E0320 06:53:03.869980 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 06:53:04 crc kubenswrapper[4971]: I0320 06:53:04.731698 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:04 crc kubenswrapper[4971]: E0320 06:53:04.731892 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:53:05 crc kubenswrapper[4971]: I0320 06:53:05.732142 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:53:05 crc kubenswrapper[4971]: I0320 06:53:05.732243 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:53:05 crc kubenswrapper[4971]: I0320 06:53:05.732309 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:53:05 crc kubenswrapper[4971]: E0320 06:53:05.732581 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:53:05 crc kubenswrapper[4971]: E0320 06:53:05.733051 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:53:05 crc kubenswrapper[4971]: E0320 06:53:05.733176 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:53:06 crc kubenswrapper[4971]: I0320 06:53:06.731994 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:06 crc kubenswrapper[4971]: E0320 06:53:06.732246 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:53:07 crc kubenswrapper[4971]: I0320 06:53:07.731901 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:53:07 crc kubenswrapper[4971]: I0320 06:53:07.732149 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:53:07 crc kubenswrapper[4971]: I0320 06:53:07.732176 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:53:07 crc kubenswrapper[4971]: E0320 06:53:07.732599 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:53:07 crc kubenswrapper[4971]: E0320 06:53:07.733104 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:53:07 crc kubenswrapper[4971]: E0320 06:53:07.733074 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:53:08 crc kubenswrapper[4971]: I0320 06:53:08.731633 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:08 crc kubenswrapper[4971]: I0320 06:53:08.734744 4971 scope.go:117] "RemoveContainer" containerID="b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec"
Mar 20 06:53:08 crc kubenswrapper[4971]: E0320 06:53:08.735459 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:53:08 crc kubenswrapper[4971]: I0320 06:53:08.735839 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"
Mar 20 06:53:08 crc kubenswrapper[4971]: E0320 06:53:08.870745 4971 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 06:53:08 crc kubenswrapper[4971]: I0320 06:53:08.946919 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/1.log"
Mar 20 06:53:08 crc kubenswrapper[4971]: I0320 06:53:08.947041 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lwlhn" event={"ID":"f11eaf57-f83a-4974-adb5-7a59b11555b0","Type":"ContainerStarted","Data":"acb2940eb72cee89ebb03b5dd72a025b5c78fc535358f2cae2578c588d6380f0"}
Mar 20 06:53:08 crc kubenswrapper[4971]: I0320 06:53:08.952487 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/3.log"
Mar 20 06:53:08 crc kubenswrapper[4971]: I0320 06:53:08.957385 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerStarted","Data":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"}
Mar 20 06:53:08 crc kubenswrapper[4971]: I0320 06:53:08.958544 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk"
Mar 20 06:53:09 crc kubenswrapper[4971]: I0320 06:53:09.025534 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podStartSLOduration=158.025504947 podStartE2EDuration="2m38.025504947s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:09.024965742 +0000 UTC m=+211.004839880" watchObservedRunningTime="2026-03-20 06:53:09.025504947 +0000 UTC m=+211.005379125"
Mar 20 06:53:09 crc kubenswrapper[4971]: I0320 06:53:09.732041 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:53:09 crc kubenswrapper[4971]: I0320 06:53:09.732100 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:53:09 crc kubenswrapper[4971]: E0320 06:53:09.732678 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:53:09 crc kubenswrapper[4971]: I0320 06:53:09.732147 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6vl68"]
Mar 20 06:53:09 crc kubenswrapper[4971]: I0320 06:53:09.732136 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:53:09 crc kubenswrapper[4971]: E0320 06:53:09.732830 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:53:09 crc kubenswrapper[4971]: I0320 06:53:09.732940 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:09 crc kubenswrapper[4971]: E0320 06:53:09.733089 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:53:09 crc kubenswrapper[4971]: E0320 06:53:09.733197 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:53:11 crc kubenswrapper[4971]: I0320 06:53:11.731496 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:53:11 crc kubenswrapper[4971]: I0320 06:53:11.731432 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:11 crc kubenswrapper[4971]: I0320 06:53:11.731564 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:53:11 crc kubenswrapper[4971]: E0320 06:53:11.732307 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:53:11 crc kubenswrapper[4971]: E0320 06:53:11.732059 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:53:11 crc kubenswrapper[4971]: I0320 06:53:11.731601 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:53:11 crc kubenswrapper[4971]: E0320 06:53:11.732403 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:53:11 crc kubenswrapper[4971]: E0320 06:53:11.732468 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:53:13 crc kubenswrapper[4971]: I0320 06:53:13.731659 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:53:13 crc kubenswrapper[4971]: I0320 06:53:13.731659 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:53:13 crc kubenswrapper[4971]: E0320 06:53:13.731899 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 06:53:13 crc kubenswrapper[4971]: I0320 06:53:13.731709 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:13 crc kubenswrapper[4971]: E0320 06:53:13.731945 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 06:53:13 crc kubenswrapper[4971]: I0320 06:53:13.731705 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:53:13 crc kubenswrapper[4971]: E0320 06:53:13.732318 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 06:53:13 crc kubenswrapper[4971]: E0320 06:53:13.732089 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vl68" podUID="6ac992f1-bd05-471f-a101-cf14466e15e8"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.731586 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.731763 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.731794 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.731858 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.736168 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.736296 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.736528 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.736795 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.738082 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 06:53:15 crc kubenswrapper[4971]: I0320 06:53:15.738809 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.530238 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.587337 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fw8fs"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.588138 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tp5rq"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.588510 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.589276 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k5xfj"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.589503 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.590344 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.598940 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.600083 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.602192 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.602483 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.602510 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.602673 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.602966 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.604407 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.604652 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.604870 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.605369 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.605384 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.605664 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.605688 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.605801 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.612962 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.613474 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.614488 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hvlj7"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.615205 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.617550 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.618105 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.621096 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.621627 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.626284 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.626496 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.626685 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.627573 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.628162 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.629031 4971 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.629322 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.629540 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.629913 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.630550 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.637961 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.638299 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.638393 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.637961 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.638517 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.638743 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.638791 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.639041 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.639618 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.639627 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.639772 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.639874 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.640906 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.641027 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.641664 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.641678 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.641825 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.642093 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.643958 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.644354 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.644458 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.645041 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.645367 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.646659 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4l9qf"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.646853 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.647119 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.647156 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.647257 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.653898 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.654189 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.654385 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dwxp5"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.654404 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.655026 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.654699 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.654969 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.681200 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-g2c5c"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.682692 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r8t8l"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.683648 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.683856 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.684172 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.684393 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.684434 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.684505 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.686023 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.698255 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-config\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.698310 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.700669 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.703369 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/168eba70-d30f-4f54-88c8-403a411a16a0-node-pullsecrets\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.703725 4971 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9ndf9"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704370 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-etcd-serving-ca\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704410 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-audit\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704436 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-etcd-client\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704455 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-encryption-config\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704472 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g76s\" (UniqueName: 
\"kubernetes.io/projected/168eba70-d30f-4f54-88c8-403a411a16a0-kube-api-access-9g76s\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704492 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/168eba70-d30f-4f54-88c8-403a411a16a0-audit-dir\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704509 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-image-import-ca\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704525 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-serving-cert\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704538 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.704983 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705114 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705205 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705286 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705374 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705413 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705463 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705668 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705741 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705808 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705934 4971 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.705938 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.713519 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.721232 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.722545 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.723207 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.723764 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.724007 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.724149 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.724264 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.727565 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.727660 4971 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728013 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728240 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728403 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728431 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-t2znt"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728930 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t2znt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.729214 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728437 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.729570 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728757 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728874 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.728915 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.729062 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.729103 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.729142 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.730582 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.732808 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 06:53:19 
crc kubenswrapper[4971]: I0320 06:53:19.733347 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.734865 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.735055 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.735152 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.736964 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.737187 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.737338 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.737444 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.739359 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.739487 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.741824 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.741065 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.745533 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.746267 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-858jn"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.746448 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.746627 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.746991 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.747837 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.747860 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.747972 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.747999 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.748340 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.748833 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.749065 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.749299 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.749936 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.766425 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.771426 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.774992 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.781444 4971 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.795596 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.795706 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.796023 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.797367 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bj9mx"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.798063 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kmdm2"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.798340 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.798928 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.799301 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fw8fs"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.799407 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.800316 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k5xfj"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.800866 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.801705 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.803418 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.803591 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.804081 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.804347 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.804848 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.805355 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.806577 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.807239 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4g87f"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808075 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4g87f" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808423 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-trusted-ca-bundle\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808458 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52969add-873f-41ae-b0a9-e36924daac43-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808485 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-client-ca\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808505 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-audit-policies\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808538 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971318f2-7f27-4651-9fbe-d66d37fb6b12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: \"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808558 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34c91a14-92a3-459a-a798-85b4bda4f3ca-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808576 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-config\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808595 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-config\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808629 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808645 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-config\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808663 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9q48\" (UniqueName: \"kubernetes.io/projected/971318f2-7f27-4651-9fbe-d66d37fb6b12-kube-api-access-g9q48\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: \"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808684 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-oauth-config\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808700 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c1ee23-ae49-48a6-828d-19ecb5573057-service-ca-bundle\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808717 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-serving-cert\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mk4\" (UniqueName: \"kubernetes.io/projected/8b3d1a36-b34d-422b-96e1-0bfd934deac1-kube-api-access-b4mk4\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808753 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-images\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808771 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b3d1a36-b34d-422b-96e1-0bfd934deac1-images\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808792 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g76s\" (UniqueName: \"kubernetes.io/projected/168eba70-d30f-4f54-88c8-403a411a16a0-kube-api-access-9g76s\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808812 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808830 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-oauth-serving-cert\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808824 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q8mn4"] Mar 20 06:53:19 crc 
kubenswrapper[4971]: I0320 06:53:19.808850 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-serving-cert\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808869 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/168eba70-d30f-4f54-88c8-403a411a16a0-audit-dir\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808894 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bf8239a-db87-4f56-9da5-24b1e2248263-trusted-ca\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808921 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-trusted-ca\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808937 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d028f881-7ad8-4805-84b8-341facd59bc2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: 
\"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808957 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.808977 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/469b3cee-92e9-43a1-b8e7-de66a1990a8f-serving-cert\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809060 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/168eba70-d30f-4f54-88c8-403a411a16a0-audit-dir\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809054 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809149 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-service-ca\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809182 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809230 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-config\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809262 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b3d1a36-b34d-422b-96e1-0bfd934deac1-proxy-tls\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809342 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-config\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: 
I0320 06:53:19.809408 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-config\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809470 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/181841ef-d317-458b-bb36-daa09020abdb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nj4b4\" (UID: \"181841ef-d317-458b-bb36-daa09020abdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809647 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/168eba70-d30f-4f54-88c8-403a411a16a0-node-pullsecrets\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809682 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnqvm\" (UniqueName: \"kubernetes.io/projected/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-kube-api-access-gnqvm\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809707 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b3d1a36-b34d-422b-96e1-0bfd934deac1-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809779 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/168eba70-d30f-4f54-88c8-403a411a16a0-node-pullsecrets\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809653 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ad08056-729b-4fb9-8049-9a27c674e520-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809874 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1159c4dd-9dfe-432c-8c65-f18f589c8a66-config\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809900 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809931 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-audit-policies\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.809955 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.810100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-config\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.810147 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: 
I0320 06:53:19.810195 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-ca\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.810391 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.810635 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.810870 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk7n7\" (UniqueName: \"kubernetes.io/projected/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-kube-api-access-sk7n7\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.810903 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntg4h\" (UniqueName: \"kubernetes.io/projected/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-kube-api-access-ntg4h\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811043 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-metrics-certs\") pod \"router-default-5444994796-858jn\" (UID: 
\"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811070 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf8239a-db87-4f56-9da5-24b1e2248263-serving-cert\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811519 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-audit\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811681 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811713 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lbzqj\" (UniqueName: \"kubernetes.io/projected/3bf8239a-db87-4f56-9da5-24b1e2248263-kube-api-access-lbzqj\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811761 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-etcd-client\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811781 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2g4q\" (UniqueName: \"kubernetes.io/projected/09a5c7dc-07b9-4b2a-a8da-0a96222b932a-kube-api-access-l2g4q\") pod \"migrator-59844c95c7-4gfgh\" (UID: \"09a5c7dc-07b9-4b2a-a8da-0a96222b932a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811815 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d028f881-7ad8-4805-84b8-341facd59bc2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811863 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811883 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1159c4dd-9dfe-432c-8c65-f18f589c8a66-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811901 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52969add-873f-41ae-b0a9-e36924daac43-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811926 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qp7s\" (UniqueName: \"kubernetes.io/projected/34c91a14-92a3-459a-a798-85b4bda4f3ca-kube-api-access-8qp7s\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811945 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811962 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7gx\" (UniqueName: \"kubernetes.io/projected/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-kube-api-access-vz7gx\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.811986 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-client-ca\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.812007 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-default-certificate\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.812041 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe99e221-99e5-49c1-9cec-875a54851847-audit-dir\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.812060 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24lb7\" (UniqueName: \"kubernetes.io/projected/8ad08056-729b-4fb9-8049-9a27c674e520-kube-api-access-24lb7\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.812317 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-audit\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.812420 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971318f2-7f27-4651-9fbe-d66d37fb6b12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: \"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.812454 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k96wh\" (UniqueName: \"kubernetes.io/projected/469b3cee-92e9-43a1-b8e7-de66a1990a8f-kube-api-access-k96wh\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813003 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvqq2\" (UniqueName: \"kubernetes.io/projected/d028f881-7ad8-4805-84b8-341facd59bc2-kube-api-access-qvqq2\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813054 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813078 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klhnh\" (UniqueName: \"kubernetes.io/projected/92c1ee23-ae49-48a6-828d-19ecb5573057-kube-api-access-klhnh\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813105 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffnx2\" (UniqueName: \"kubernetes.io/projected/0939bb94-8858-43fc-8443-686e696beaa1-kube-api-access-ffnx2\") pod \"downloads-7954f5f757-t2znt\" (UID: \"0939bb94-8858-43fc-8443-686e696beaa1\") " pod="openshift-console/downloads-7954f5f757-t2znt"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813152 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813241 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzn7\" (UniqueName: \"kubernetes.io/projected/fe99e221-99e5-49c1-9cec-875a54851847-kube-api-access-tnzn7\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813241 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813266 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d028f881-7ad8-4805-84b8-341facd59bc2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813319 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52969add-873f-41ae-b0a9-e36924daac43-config\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813345 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34c91a14-92a3-459a-a798-85b4bda4f3ca-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813401 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813454 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-encryption-config\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813477 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdv2\" (UniqueName: \"kubernetes.io/projected/181841ef-d317-458b-bb36-daa09020abdb-kube-api-access-vjdv2\") pod \"cluster-samples-operator-665b6dd947-nj4b4\" (UID: \"181841ef-d317-458b-bb36-daa09020abdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813564 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813628 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghmgp\" (UniqueName: \"kubernetes.io/projected/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-kube-api-access-ghmgp\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813656 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-image-import-ca\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813706 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/911a50c7-0e81-4087-9bc0-81e5206d5fd0-audit-dir\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.813791 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-config\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.815978 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-service-ca\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816032 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816065 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816121 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1159c4dd-9dfe-432c-8c65-f18f589c8a66-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816145 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-serving-cert\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816168 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816190 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/780881af-a5e9-44fa-9cf9-4d57513449dd-auth-proxy-config\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816210 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816234 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-serving-cert\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816271 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780881af-a5e9-44fa-9cf9-4d57513449dd-config\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816325 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-client\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816346 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816370 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-stats-auth\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-etcd-serving-ca\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816417 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/780881af-a5e9-44fa-9cf9-4d57513449dd-machine-approver-tls\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816439 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzj7w\" (UniqueName: \"kubernetes.io/projected/911a50c7-0e81-4087-9bc0-81e5206d5fd0-kube-api-access-jzj7w\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816459 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816481 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-service-ca-bundle\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816502 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-serving-cert\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816520 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-encryption-config\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816543 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf8239a-db87-4f56-9da5-24b1e2248263-config\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816564 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-etcd-client\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816585 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r7v2\" (UniqueName: \"kubernetes.io/projected/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-kube-api-access-8r7v2\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816674 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc43d2b-e8e2-405a-9870-33c60db55ff2-serving-cert\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816697 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad08056-729b-4fb9-8049-9a27c674e520-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816721 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn4p7\" (UniqueName: \"kubernetes.io/projected/780881af-a5e9-44fa-9cf9-4d57513449dd-kube-api-access-tn4p7\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.816742 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzsm\" (UniqueName: \"kubernetes.io/projected/b7d2e412-0a4f-44b6-a326-269ac4921dae-kube-api-access-gfzsm\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.819445 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-serving-cert\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.821503 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-etcd-serving-ca\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.821904 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-etcd-client\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.822516 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-image-import-ca\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.822771 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/168eba70-d30f-4f54-88c8-403a411a16a0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.823786 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.824349 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-metrics-tls\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.824384 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqpn2\" (UniqueName: \"kubernetes.io/projected/d17d7658-ce33-4a3c-826a-1a0361e73629-kube-api-access-vqpn2\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.824407 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.824425 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dcf\" (UniqueName: \"kubernetes.io/projected/6cc43d2b-e8e2-405a-9870-33c60db55ff2-kube-api-access-c4dcf\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.825110 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/168eba70-d30f-4f54-88c8-403a411a16a0-encryption-config\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.831532 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.833402 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566492-86snb"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.833570 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.834120 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tp5rq"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.834150 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.834294 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566492-86snb"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.834550 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.834686 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.834967 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.834998 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.835014 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.835179 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.835625 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.841456 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dwxp5"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.845249 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t2znt"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.846393 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hvlj7"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.849100 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-g2c5c"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.850854 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xg8rk"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.851834 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xg8rk"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.851989 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.853341 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r8t8l"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.858079 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.858162 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.870670 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.871137 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.874635 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.877647 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.878819 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.880039 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.881133 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.882212 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4l9qf"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.883306 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566492-86snb"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.884473 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.885635 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.886651 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.887808 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.888974 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.890346 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.892121 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.892921 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7bj4h"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.893934 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7bj4h"
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.894186 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9ndf9"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.895632 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xg8rk"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.896503 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-596j6"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.898281 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-596j6"]
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.898414 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-596j6" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.898813 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.900094 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4g87f"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.902059 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bj9mx"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.902082 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.903229 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.904235 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kmdm2"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.905537 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q8mn4"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.906628 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.907742 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s29mb"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.908686 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.908904 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s29mb"] Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.910986 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925470 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk7n7\" (UniqueName: \"kubernetes.io/projected/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-kube-api-access-sk7n7\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925510 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntg4h\" (UniqueName: \"kubernetes.io/projected/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-kube-api-access-ntg4h\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925535 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-metrics-certs\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925562 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b96a8e1e-ebdd-419f-8661-ae6e515429b0-srv-cert\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") 
" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925583 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w6cm\" (UniqueName: \"kubernetes.io/projected/f7f071e5-085b-429b-8956-102ebc333825-kube-api-access-7w6cm\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925628 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925645 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf8239a-db87-4f56-9da5-24b1e2248263-serving-cert\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925667 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925688 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbzqj\" (UniqueName: 
\"kubernetes.io/projected/3bf8239a-db87-4f56-9da5-24b1e2248263-kube-api-access-lbzqj\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925707 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6d4012-f253-411e-9bed-2e204d9a6f67-config-volume\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925792 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2g4q\" (UniqueName: \"kubernetes.io/projected/09a5c7dc-07b9-4b2a-a8da-0a96222b932a-kube-api-access-l2g4q\") pod \"migrator-59844c95c7-4gfgh\" (UID: \"09a5c7dc-07b9-4b2a-a8da-0a96222b932a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925816 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925857 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d028f881-7ad8-4805-84b8-341facd59bc2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925880 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925925 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1159c4dd-9dfe-432c-8c65-f18f589c8a66-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.925980 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0525b9e-caf1-4606-83d1-ffaf7d231b69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926006 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qp7s\" (UniqueName: \"kubernetes.io/projected/34c91a14-92a3-459a-a798-85b4bda4f3ca-kube-api-access-8qp7s\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926049 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926077 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7gx\" (UniqueName: \"kubernetes.io/projected/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-kube-api-access-vz7gx\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926762 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52969add-873f-41ae-b0a9-e36924daac43-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926776 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926792 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-client-ca\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 
06:53:19.926815 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926830 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p95h6\" (UniqueName: \"kubernetes.io/projected/59c19ac3-a273-4527-bb7c-fa5d0ec71cba-kube-api-access-p95h6\") pod \"dns-operator-744455d44c-9ndf9\" (UID: \"59c19ac3-a273-4527-bb7c-fa5d0ec71cba\") " pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926879 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe99e221-99e5-49c1-9cec-875a54851847-audit-dir\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926906 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24lb7\" (UniqueName: \"kubernetes.io/projected/8ad08056-729b-4fb9-8049-9a27c674e520-kube-api-access-24lb7\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926929 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-default-certificate\") pod 
\"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926955 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59c19ac3-a273-4527-bb7c-fa5d0ec71cba-metrics-tls\") pod \"dns-operator-744455d44c-9ndf9\" (UID: \"59c19ac3-a273-4527-bb7c-fa5d0ec71cba\") " pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971318f2-7f27-4651-9fbe-d66d37fb6b12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: \"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e16069-c384-4991-8041-af49bbd32266-serving-cert\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.926999 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe99e221-99e5-49c1-9cec-875a54851847-audit-dir\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927071 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k96wh\" 
(UniqueName: \"kubernetes.io/projected/469b3cee-92e9-43a1-b8e7-de66a1990a8f-kube-api-access-k96wh\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927113 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvqq2\" (UniqueName: \"kubernetes.io/projected/d028f881-7ad8-4805-84b8-341facd59bc2-kube-api-access-qvqq2\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927146 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927339 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927414 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klhnh\" (UniqueName: \"kubernetes.io/projected/92c1ee23-ae49-48a6-828d-19ecb5573057-kube-api-access-klhnh\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 
06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927439 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffnx2\" (UniqueName: \"kubernetes.io/projected/0939bb94-8858-43fc-8443-686e696beaa1-kube-api-access-ffnx2\") pod \"downloads-7954f5f757-t2znt\" (UID: \"0939bb94-8858-43fc-8443-686e696beaa1\") " pod="openshift-console/downloads-7954f5f757-t2znt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927531 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpn9k\" (UniqueName: \"kubernetes.io/projected/b96a8e1e-ebdd-419f-8661-ae6e515429b0-kube-api-access-zpn9k\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927533 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-client-ca\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcqlb\" (UniqueName: \"kubernetes.io/projected/7d6d4012-f253-411e-9bed-2e204d9a6f67-kube-api-access-mcqlb\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927625 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e16069-c384-4991-8041-af49bbd32266-config\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: 
\"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927651 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzn7\" (UniqueName: \"kubernetes.io/projected/fe99e221-99e5-49c1-9cec-875a54851847-kube-api-access-tnzn7\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927673 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d028f881-7ad8-4805-84b8-341facd59bc2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927695 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-certs\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927720 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52969add-873f-41ae-b0a9-e36924daac43-config\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927747 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tzn2m\" (UniqueName: \"kubernetes.io/projected/c9ad26c3-6a8e-41df-9879-b7ff5f77fdea-kube-api-access-tzn2m\") pod \"control-plane-machine-set-operator-78cbb6b69f-fn7gp\" (UID: \"c9ad26c3-6a8e-41df-9879-b7ff5f77fdea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927777 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927800 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a00f6082-ee0c-441f-9aad-d4d12c607e23-cert\") pod \"ingress-canary-xg8rk\" (UID: \"a00f6082-ee0c-441f-9aad-d4d12c607e23\") " pod="openshift-ingress-canary/ingress-canary-xg8rk" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927830 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdv2\" (UniqueName: \"kubernetes.io/projected/181841ef-d317-458b-bb36-daa09020abdb-kube-api-access-vjdv2\") pod \"cluster-samples-operator-665b6dd947-nj4b4\" (UID: \"181841ef-d317-458b-bb36-daa09020abdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927847 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 
06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927855 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34c91a14-92a3-459a-a798-85b4bda4f3ca-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927913 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927941 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7h7\" (UniqueName: \"kubernetes.io/projected/a742bd2a-b1b3-42fc-be4f-98002090b4ed-kube-api-access-rr7h7\") pod \"package-server-manager-789f6589d5-kcw47\" (UID: \"a742bd2a-b1b3-42fc-be4f-98002090b4ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.927971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghmgp\" (UniqueName: \"kubernetes.io/projected/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-kube-api-access-ghmgp\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928012 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/911a50c7-0e81-4087-9bc0-81e5206d5fd0-audit-dir\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928089 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/911a50c7-0e81-4087-9bc0-81e5206d5fd0-audit-dir\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928098 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-config\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928171 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-service-ca\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928207 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-serving-cert\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928267 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928288 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928292 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1159c4dd-9dfe-432c-8c65-f18f589c8a66-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928349 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/780881af-a5e9-44fa-9cf9-4d57513449dd-auth-proxy-config\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928371 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928391 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-serving-cert\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780881af-a5e9-44fa-9cf9-4d57513449dd-config\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928442 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-client\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928461 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/780881af-a5e9-44fa-9cf9-4d57513449dd-machine-approver-tls\") pod 
\"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928481 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928499 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-stats-auth\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzj7w\" (UniqueName: \"kubernetes.io/projected/911a50c7-0e81-4087-9bc0-81e5206d5fd0-kube-api-access-jzj7w\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928543 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928564 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-service-ca-bundle\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928582 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-serving-cert\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928618 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-encryption-config\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928640 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf8239a-db87-4f56-9da5-24b1e2248263-config\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928661 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-etcd-client\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928681 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gfzsm\" (UniqueName: \"kubernetes.io/projected/b7d2e412-0a4f-44b6-a326-269ac4921dae-kube-api-access-gfzsm\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928700 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn4p7\" (UniqueName: \"kubernetes.io/projected/780881af-a5e9-44fa-9cf9-4d57513449dd-kube-api-access-tn4p7\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928721 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r7v2\" (UniqueName: \"kubernetes.io/projected/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-kube-api-access-8r7v2\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928745 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc43d2b-e8e2-405a-9870-33c60db55ff2-serving-cert\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928765 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad08056-729b-4fb9-8049-9a27c674e520-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928787 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-metrics-tls\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928817 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqpn2\" (UniqueName: \"kubernetes.io/projected/d17d7658-ce33-4a3c-826a-1a0361e73629-kube-api-access-vqpn2\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928840 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928858 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4dcf\" (UniqueName: \"kubernetes.io/projected/6cc43d2b-e8e2-405a-9870-33c60db55ff2-kube-api-access-c4dcf\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928880 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b96a8e1e-ebdd-419f-8661-ae6e515429b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928902 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e696339-3718-476d-a99b-1da32e10090b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928925 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-trusted-ca-bundle\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928947 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kmdm2\" (UID: \"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52969add-873f-41ae-b0a9-e36924daac43-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928989 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxcvl\" (UniqueName: \"kubernetes.io/projected/a00f6082-ee0c-441f-9aad-d4d12c607e23-kube-api-access-wxcvl\") pod \"ingress-canary-xg8rk\" (UID: \"a00f6082-ee0c-441f-9aad-d4d12c607e23\") " pod="openshift-ingress-canary/ingress-canary-xg8rk" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.928999 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-config\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929009 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-client-ca\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929115 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-audit-policies\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929137 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971318f2-7f27-4651-9fbe-d66d37fb6b12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: 
\"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929161 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34c91a14-92a3-459a-a798-85b4bda4f3ca-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929180 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-config\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929200 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-config\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929219 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929244 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-config\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929267 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9q48\" (UniqueName: \"kubernetes.io/projected/971318f2-7f27-4651-9fbe-d66d37fb6b12-kube-api-access-g9q48\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: \"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929296 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-oauth-config\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929317 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-serving-cert\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929338 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mk4\" (UniqueName: \"kubernetes.io/projected/8b3d1a36-b34d-422b-96e1-0bfd934deac1-kube-api-access-b4mk4\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: 
I0320 06:53:19.929357 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c1ee23-ae49-48a6-828d-19ecb5573057-service-ca-bundle\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929376 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-images\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929404 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b3d1a36-b34d-422b-96e1-0bfd934deac1-images\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929410 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-service-ca\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929426 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4ckm\" (UniqueName: \"kubernetes.io/projected/a8e16069-c384-4991-8041-af49bbd32266-kube-api-access-g4ckm\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929449 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.929473 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9ad26c3-6a8e-41df-9879-b7ff5f77fdea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fn7gp\" (UID: \"c9ad26c3-6a8e-41df-9879-b7ff5f77fdea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.930322 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-images\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.930640 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/780881af-a5e9-44fa-9cf9-4d57513449dd-auth-proxy-config\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.930813 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-client-ca\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.930991 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-trusted-ca-bundle\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931013 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e696339-3718-476d-a99b-1da32e10090b-proxy-tls\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931080 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931118 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-oauth-serving-cert\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931145 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-trusted-ca\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931167 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d028f881-7ad8-4805-84b8-341facd59bc2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931220 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/469b3cee-92e9-43a1-b8e7-de66a1990a8f-serving-cert\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931319 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bf8239a-db87-4f56-9da5-24b1e2248263-trusted-ca\") pod \"console-operator-58897d9998-r8t8l\" (UID: 
\"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931340 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0525b9e-caf1-4606-83d1-ffaf7d231b69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931365 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931416 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-service-ca\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931439 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931461 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d6d4012-f253-411e-9bed-2e204d9a6f67-metrics-tls\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931481 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf8239a-db87-4f56-9da5-24b1e2248263-config\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931486 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-config\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931536 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-config\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931563 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b3d1a36-b34d-422b-96e1-0bfd934deac1-proxy-tls\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931565 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/34c91a14-92a3-459a-a798-85b4bda4f3ca-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931618 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/911a50c7-0e81-4087-9bc0-81e5206d5fd0-audit-policies\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931642 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-node-bootstrap-token\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931671 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a742bd2a-b1b3-42fc-be4f-98002090b4ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kcw47\" (UID: \"a742bd2a-b1b3-42fc-be4f-98002090b4ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931699 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b3d1a36-b34d-422b-96e1-0bfd934deac1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931731 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/181841ef-d317-458b-bb36-daa09020abdb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nj4b4\" (UID: \"181841ef-d317-458b-bb36-daa09020abdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931754 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnqvm\" (UniqueName: \"kubernetes.io/projected/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-kube-api-access-gnqvm\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931778 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931802 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ad08056-729b-4fb9-8049-9a27c674e520-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931827 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1159c4dd-9dfe-432c-8c65-f18f589c8a66-config\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931850 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-audit-policies\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.931903 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbtr7\" (UniqueName: \"kubernetes.io/projected/cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f-kube-api-access-tbtr7\") pod \"multus-admission-controller-857f4d67dd-kmdm2\" (UID: \"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.932060 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwclq\" (UniqueName: \"kubernetes.io/projected/e766c0e7-ca96-4f32-9e14-81a3d5bbd389-kube-api-access-fwclq\") pod \"auto-csr-approver-29566492-86snb\" (UID: \"e766c0e7-ca96-4f32-9e14-81a3d5bbd389\") " pod="openshift-infra/auto-csr-approver-29566492-86snb" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.932088 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: 
\"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.932112 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.932136 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56szz\" (UniqueName: \"kubernetes.io/projected/6e696339-3718-476d-a99b-1da32e10090b-kube-api-access-56szz\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.932174 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-ca\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.932203 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0525b9e-caf1-4606-83d1-ffaf7d231b69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.932352 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.932464 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.933091 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.933234 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-config\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.933300 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971318f2-7f27-4651-9fbe-d66d37fb6b12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: \"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" 
Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.933446 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-config\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.933967 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8b3d1a36-b34d-422b-96e1-0bfd934deac1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.934323 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-oauth-serving-cert\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.934877 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bf8239a-db87-4f56-9da5-24b1e2248263-trusted-ca\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.934882 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-serving-cert\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.936191 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780881af-a5e9-44fa-9cf9-4d57513449dd-config\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.936367 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-ca\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.936617 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-service-ca\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.937307 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc43d2b-e8e2-405a-9870-33c60db55ff2-service-ca-bundle\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.937925 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.944509 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/d028f881-7ad8-4805-84b8-341facd59bc2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.945032 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-config\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.945344 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-audit-policies\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.945967 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.945971 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.946111 
4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.946147 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.946269 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.946663 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-serving-cert\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.947002 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/181841ef-d317-458b-bb36-daa09020abdb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nj4b4\" (UID: 
\"181841ef-d317-458b-bb36-daa09020abdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.947052 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-etcd-client\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.947336 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-encryption-config\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.947362 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-serving-cert\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.947790 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.947880 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/911a50c7-0e81-4087-9bc0-81e5206d5fd0-etcd-client\") pod 
\"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.947914 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.948223 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-oauth-config\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.948460 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34c91a14-92a3-459a-a798-85b4bda4f3ca-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.948700 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.948835 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3bf8239a-db87-4f56-9da5-24b1e2248263-serving-cert\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.948924 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/780881af-a5e9-44fa-9cf9-4d57513449dd-machine-approver-tls\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.949142 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971318f2-7f27-4651-9fbe-d66d37fb6b12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: \"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.949330 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/469b3cee-92e9-43a1-b8e7-de66a1990a8f-serving-cert\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.951238 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.954790 4971 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.961484 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.962143 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d028f881-7ad8-4805-84b8-341facd59bc2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.971278 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.991090 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.995738 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc43d2b-e8e2-405a-9870-33c60db55ff2-serving-cert\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.996506 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-config\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:19 crc kubenswrapper[4971]: I0320 06:53:19.996683 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-config\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.000760 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-serving-cert\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.011294 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.030819 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.033655 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0525b9e-caf1-4606-83d1-ffaf7d231b69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.033698 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p95h6\" (UniqueName: \"kubernetes.io/projected/59c19ac3-a273-4527-bb7c-fa5d0ec71cba-kube-api-access-p95h6\") pod \"dns-operator-744455d44c-9ndf9\" (UID: \"59c19ac3-a273-4527-bb7c-fa5d0ec71cba\") " pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.033745 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59c19ac3-a273-4527-bb7c-fa5d0ec71cba-metrics-tls\") pod \"dns-operator-744455d44c-9ndf9\" (UID: \"59c19ac3-a273-4527-bb7c-fa5d0ec71cba\") " pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.033778 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e16069-c384-4991-8041-af49bbd32266-serving-cert\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.033861 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpn9k\" (UniqueName: \"kubernetes.io/projected/b96a8e1e-ebdd-419f-8661-ae6e515429b0-kube-api-access-zpn9k\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.033888 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcqlb\" (UniqueName: \"kubernetes.io/projected/7d6d4012-f253-411e-9bed-2e204d9a6f67-kube-api-access-mcqlb\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" Mar 20 
06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.033938 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e16069-c384-4991-8041-af49bbd32266-config\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034000 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-certs\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034026 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzn2m\" (UniqueName: \"kubernetes.io/projected/c9ad26c3-6a8e-41df-9879-b7ff5f77fdea-kube-api-access-tzn2m\") pod \"control-plane-machine-set-operator-78cbb6b69f-fn7gp\" (UID: \"c9ad26c3-6a8e-41df-9879-b7ff5f77fdea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034211 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a00f6082-ee0c-441f-9aad-d4d12c607e23-cert\") pod \"ingress-canary-xg8rk\" (UID: \"a00f6082-ee0c-441f-9aad-d4d12c607e23\") " pod="openshift-ingress-canary/ingress-canary-xg8rk" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034240 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7h7\" (UniqueName: \"kubernetes.io/projected/a742bd2a-b1b3-42fc-be4f-98002090b4ed-kube-api-access-rr7h7\") pod \"package-server-manager-789f6589d5-kcw47\" (UID: \"a742bd2a-b1b3-42fc-be4f-98002090b4ed\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034394 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b96a8e1e-ebdd-419f-8661-ae6e515429b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034425 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e696339-3718-476d-a99b-1da32e10090b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kmdm2\" (UID: \"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034471 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxcvl\" (UniqueName: \"kubernetes.io/projected/a00f6082-ee0c-441f-9aad-d4d12c607e23-kube-api-access-wxcvl\") pod \"ingress-canary-xg8rk\" (UID: \"a00f6082-ee0c-441f-9aad-d4d12c607e23\") " pod="openshift-ingress-canary/ingress-canary-xg8rk" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034540 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4ckm\" (UniqueName: 
\"kubernetes.io/projected/a8e16069-c384-4991-8041-af49bbd32266-kube-api-access-g4ckm\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034622 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9ad26c3-6a8e-41df-9879-b7ff5f77fdea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fn7gp\" (UID: \"c9ad26c3-6a8e-41df-9879-b7ff5f77fdea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034646 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e696339-3718-476d-a99b-1da32e10090b-proxy-tls\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034673 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0525b9e-caf1-4606-83d1-ffaf7d231b69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034702 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d6d4012-f253-411e-9bed-2e204d9a6f67-metrics-tls\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" 
Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034729 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-node-bootstrap-token\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034749 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a742bd2a-b1b3-42fc-be4f-98002090b4ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kcw47\" (UID: \"a742bd2a-b1b3-42fc-be4f-98002090b4ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034791 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbtr7\" (UniqueName: \"kubernetes.io/projected/cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f-kube-api-access-tbtr7\") pod \"multus-admission-controller-857f4d67dd-kmdm2\" (UID: \"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034814 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwclq\" (UniqueName: \"kubernetes.io/projected/e766c0e7-ca96-4f32-9e14-81a3d5bbd389-kube-api-access-fwclq\") pod \"auto-csr-approver-29566492-86snb\" (UID: \"e766c0e7-ca96-4f32-9e14-81a3d5bbd389\") " pod="openshift-infra/auto-csr-approver-29566492-86snb" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034833 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56szz\" (UniqueName: 
\"kubernetes.io/projected/6e696339-3718-476d-a99b-1da32e10090b-kube-api-access-56szz\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034853 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0525b9e-caf1-4606-83d1-ffaf7d231b69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034878 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b96a8e1e-ebdd-419f-8661-ae6e515429b0-srv-cert\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034898 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w6cm\" (UniqueName: \"kubernetes.io/projected/f7f071e5-085b-429b-8956-102ebc333825-kube-api-access-7w6cm\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.034934 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6d4012-f253-411e-9bed-2e204d9a6f67-config-volume\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.036057 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e696339-3718-476d-a99b-1da32e10090b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.037566 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59c19ac3-a273-4527-bb7c-fa5d0ec71cba-metrics-tls\") pod \"dns-operator-744455d44c-9ndf9\" (UID: \"59c19ac3-a273-4527-bb7c-fa5d0ec71cba\") " pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.038812 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52969add-873f-41ae-b0a9-e36924daac43-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.050596 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.062290 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52969add-873f-41ae-b0a9-e36924daac43-config\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.071334 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" 
Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.090675 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.111788 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.118749 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1159c4dd-9dfe-432c-8c65-f18f589c8a66-config\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.131010 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.151193 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.161946 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.162127 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 06:53:20 crc 
kubenswrapper[4971]: I0320 06:53:20.162319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-stats-auth\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.171303 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.192506 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.200463 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1159c4dd-9dfe-432c-8c65-f18f589c8a66-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.211488 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.222085 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-default-certificate\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.231578 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 06:53:20 crc 
kubenswrapper[4971]: I0320 06:53:20.240878 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c1ee23-ae49-48a6-828d-19ecb5573057-service-ca-bundle\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.251173 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.259895 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9ad26c3-6a8e-41df-9879-b7ff5f77fdea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fn7gp\" (UID: \"c9ad26c3-6a8e-41df-9879-b7ff5f77fdea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.271920 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.291738 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.301556 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92c1ee23-ae49-48a6-828d-19ecb5573057-metrics-certs\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.312074 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.330960 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.351984 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.365851 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-metrics-tls\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.382147 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.388278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-trusted-ca\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.391194 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.411860 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.431709 4971 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.451449 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.470813 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.481583 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad08056-729b-4fb9-8049-9a27c674e520-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.492516 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.498026 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ad08056-729b-4fb9-8049-9a27c674e520-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.511666 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.531851 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.551910 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.559856 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0525b9e-caf1-4606-83d1-ffaf7d231b69-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.572864 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.576999 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0525b9e-caf1-4606-83d1-ffaf7d231b69-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.592458 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.611403 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.632188 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 06:53:20 crc 
kubenswrapper[4971]: I0320 06:53:20.642467 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.650578 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.692062 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.708121 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.711825 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.742182 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.750355 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.752298 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 
06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.772210 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.791894 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.801215 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kmdm2\" (UID: \"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.810964 4971 request.go:700] Waited for 1.010334643s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.813368 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.832676 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.841567 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b3d1a36-b34d-422b-96e1-0bfd934deac1-images\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.852193 4971 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.854852 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b3d1a36-b34d-422b-96e1-0bfd934deac1-proxy-tls\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.871475 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.890824 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.911565 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.931169 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 06:53:20 crc kubenswrapper[4971]: E0320 06:53:20.935887 4971 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 06:53:20 crc kubenswrapper[4971]: E0320 06:53:20.936012 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume podName:b7d2e412-0a4f-44b6-a326-269ac4921dae nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.435987229 +0000 UTC m=+223.415861377 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume") pod "collect-profiles-29566485-rnzpk" (UID: "b7d2e412-0a4f-44b6-a326-269ac4921dae") : failed to sync configmap cache: timed out waiting for the condition Mar 20 06:53:20 crc kubenswrapper[4971]: E0320 06:53:20.936319 4971 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:20 crc kubenswrapper[4971]: E0320 06:53:20.936400 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume podName:b7d2e412-0a4f-44b6-a326-269ac4921dae nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.43637949 +0000 UTC m=+223.416253628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume") pod "collect-profiles-29566485-rnzpk" (UID: "b7d2e412-0a4f-44b6-a326-269ac4921dae") : failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.939414 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b96a8e1e-ebdd-419f-8661-ae6e515429b0-srv-cert\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.950579 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.959406 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b96a8e1e-ebdd-419f-8661-ae6e515429b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.971666 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 06:53:20 crc kubenswrapper[4971]: I0320 06:53:20.992368 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.011776 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.031451 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035103 4971 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035192 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a00f6082-ee0c-441f-9aad-d4d12c607e23-cert podName:a00f6082-ee0c-441f-9aad-d4d12c607e23 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.535165099 +0000 UTC m=+223.515039247 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a00f6082-ee0c-441f-9aad-d4d12c607e23-cert") pod "ingress-canary-xg8rk" (UID: "a00f6082-ee0c-441f-9aad-d4d12c607e23") : failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035208 4971 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035351 4971 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035541 4971 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035550 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e696339-3718-476d-a99b-1da32e10090b-proxy-tls podName:6e696339-3718-476d-a99b-1da32e10090b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.535497459 +0000 UTC m=+223.515371607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6e696339-3718-476d-a99b-1da32e10090b-proxy-tls") pod "machine-config-controller-84d6567774-r6vfl" (UID: "6e696339-3718-476d-a99b-1da32e10090b") : failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035253 4971 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035587 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6d4012-f253-411e-9bed-2e204d9a6f67-metrics-tls podName:7d6d4012-f253-411e-9bed-2e204d9a6f67 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.535576481 +0000 UTC m=+223.515450629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7d6d4012-f253-411e-9bed-2e204d9a6f67-metrics-tls") pod "dns-default-s29mb" (UID: "7d6d4012-f253-411e-9bed-2e204d9a6f67") : failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035594 4971 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035286 4971 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035678 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-certs podName:f7f071e5-085b-429b-8956-102ebc333825 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.535667824 +0000 UTC m=+223.515541972 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-certs") pod "machine-config-server-7bj4h" (UID: "f7f071e5-085b-429b-8956-102ebc333825") : failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035727 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-node-bootstrap-token podName:f7f071e5-085b-429b-8956-102ebc333825 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.535693275 +0000 UTC m=+223.515567443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-node-bootstrap-token") pod "machine-config-server-7bj4h" (UID: "f7f071e5-085b-429b-8956-102ebc333825") : failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035221 4971 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035813 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8e16069-c384-4991-8041-af49bbd32266-config podName:a8e16069-c384-4991-8041-af49bbd32266 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.535801128 +0000 UTC m=+223.515675276 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a8e16069-c384-4991-8041-af49bbd32266-config") pod "service-ca-operator-777779d784-sjq2s" (UID: "a8e16069-c384-4991-8041-af49bbd32266") : failed to sync configmap cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.035869 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d6d4012-f253-411e-9bed-2e204d9a6f67-config-volume podName:7d6d4012-f253-411e-9bed-2e204d9a6f67 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.535826488 +0000 UTC m=+223.515700656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/7d6d4012-f253-411e-9bed-2e204d9a6f67-config-volume") pod "dns-default-s29mb" (UID: "7d6d4012-f253-411e-9bed-2e204d9a6f67") : failed to sync configmap cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: E0320 06:53:21.036536 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8e16069-c384-4991-8041-af49bbd32266-serving-cert podName:a8e16069-c384-4991-8041-af49bbd32266 nodeName:}" failed. No retries permitted until 2026-03-20 06:53:21.536518979 +0000 UTC m=+223.516393127 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a8e16069-c384-4991-8041-af49bbd32266-serving-cert") pod "service-ca-operator-777779d784-sjq2s" (UID: "a8e16069-c384-4991-8041-af49bbd32266") : failed to sync secret cache: timed out waiting for the condition Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.041763 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a742bd2a-b1b3-42fc-be4f-98002090b4ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kcw47\" (UID: \"a742bd2a-b1b3-42fc-be4f-98002090b4ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.051171 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.071674 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.090706 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.110525 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.131043 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.171551 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.178163 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g76s\" 
(UniqueName: \"kubernetes.io/projected/168eba70-d30f-4f54-88c8-403a411a16a0-kube-api-access-9g76s\") pod \"apiserver-76f77b778f-tp5rq\" (UID: \"168eba70-d30f-4f54-88c8-403a411a16a0\") " pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.192759 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.212122 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.231151 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.252963 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.271307 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.291462 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.311858 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.332352 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.352429 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.372278 4971 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.392651 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.411086 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.432276 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.438164 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.452356 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.463559 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.463942 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.465957 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.470867 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.471640 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.492488 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.512196 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.533695 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.552121 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.566865 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6d4012-f253-411e-9bed-2e204d9a6f67-config-volume\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " 
pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.567039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e16069-c384-4991-8041-af49bbd32266-serving-cert\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.567176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e16069-c384-4991-8041-af49bbd32266-config\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.567235 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-certs\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.567296 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a00f6082-ee0c-441f-9aad-d4d12c607e23-cert\") pod \"ingress-canary-xg8rk\" (UID: \"a00f6082-ee0c-441f-9aad-d4d12c607e23\") " pod="openshift-ingress-canary/ingress-canary-xg8rk" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.567657 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e696339-3718-476d-a99b-1da32e10090b-proxy-tls\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.567731 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d6d4012-f253-411e-9bed-2e204d9a6f67-metrics-tls\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.567771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-node-bootstrap-token\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.569059 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e16069-c384-4991-8041-af49bbd32266-config\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.574004 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e16069-c384-4991-8041-af49bbd32266-serving-cert\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.574478 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-certs\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " 
pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.575239 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e696339-3718-476d-a99b-1da32e10090b-proxy-tls\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.575345 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a00f6082-ee0c-441f-9aad-d4d12c607e23-cert\") pod \"ingress-canary-xg8rk\" (UID: \"a00f6082-ee0c-441f-9aad-d4d12c607e23\") " pod="openshift-ingress-canary/ingress-canary-xg8rk" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.575547 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f7f071e5-085b-429b-8956-102ebc333825-node-bootstrap-token\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.576143 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.591266 4971 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.613143 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.631741 4971 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.655958 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.659736 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6d4012-f253-411e-9bed-2e204d9a6f67-config-volume\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.671189 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.691725 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.706390 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d6d4012-f253-411e-9bed-2e204d9a6f67-metrics-tls\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.754278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntg4h\" (UniqueName: \"kubernetes.io/projected/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-kube-api-access-ntg4h\") pod \"console-f9d7485db-4l9qf\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.778796 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk7n7\" (UniqueName: \"kubernetes.io/projected/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-kube-api-access-sk7n7\") pod \"route-controller-manager-6576b87f9c-tzg77\" (UID: 
\"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.793711 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbzqj\" (UniqueName: \"kubernetes.io/projected/3bf8239a-db87-4f56-9da5-24b1e2248263-kube-api-access-lbzqj\") pod \"console-operator-58897d9998-r8t8l\" (UID: \"3bf8239a-db87-4f56-9da5-24b1e2248263\") " pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.811412 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2g4q\" (UniqueName: \"kubernetes.io/projected/09a5c7dc-07b9-4b2a-a8da-0a96222b932a-kube-api-access-l2g4q\") pod \"migrator-59844c95c7-4gfgh\" (UID: \"09a5c7dc-07b9-4b2a-a8da-0a96222b932a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.829098 4971 request.go:700] Waited for 1.902762002s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.830002 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.837815 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qp7s\" (UniqueName: \"kubernetes.io/projected/34c91a14-92a3-459a-a798-85b4bda4f3ca-kube-api-access-8qp7s\") pod \"openshift-controller-manager-operator-756b6f6bc6-tv2fz\" (UID: \"34c91a14-92a3-459a-a798-85b4bda4f3ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.858751 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7gx\" (UniqueName: \"kubernetes.io/projected/6f5bffa0-8b50-4aac-a108-edc093dfc2bf-kube-api-access-vz7gx\") pod \"machine-api-operator-5694c8668f-fw8fs\" (UID: \"6f5bffa0-8b50-4aac-a108-edc093dfc2bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.880469 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52969add-873f-41ae-b0a9-e36924daac43-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-l5r26\" (UID: \"52969add-873f-41ae-b0a9-e36924daac43\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.902037 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24lb7\" (UniqueName: \"kubernetes.io/projected/8ad08056-729b-4fb9-8049-9a27c674e520-kube-api-access-24lb7\") pod \"kube-storage-version-migrator-operator-b67b599dd-jw2zn\" (UID: \"8ad08056-729b-4fb9-8049-9a27c674e520\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.902407 4971 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.915320 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k96wh\" (UniqueName: \"kubernetes.io/projected/469b3cee-92e9-43a1-b8e7-de66a1990a8f-kube-api-access-k96wh\") pod \"controller-manager-879f6c89f-k5xfj\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.928142 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tp5rq"] Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.943496 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvqq2\" (UniqueName: \"kubernetes.io/projected/d028f881-7ad8-4805-84b8-341facd59bc2-kube-api-access-qvqq2\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.951188 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klhnh\" (UniqueName: \"kubernetes.io/projected/92c1ee23-ae49-48a6-828d-19ecb5573057-kube-api-access-klhnh\") pod \"router-default-5444994796-858jn\" (UID: \"92c1ee23-ae49-48a6-828d-19ecb5573057\") " pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.951852 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.965823 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.969574 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffnx2\" (UniqueName: \"kubernetes.io/projected/0939bb94-8858-43fc-8443-686e696beaa1-kube-api-access-ffnx2\") pod \"downloads-7954f5f757-t2znt\" (UID: \"0939bb94-8858-43fc-8443-686e696beaa1\") " pod="openshift-console/downloads-7954f5f757-t2znt" Mar 20 06:53:21 crc kubenswrapper[4971]: W0320 06:53:21.974114 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168eba70_d30f_4f54_88c8_403a411a16a0.slice/crio-2c980480f6e36c70f2ae84c92981da3255bd9502170f32113d50678574d091b7 WatchSource:0}: Error finding container 2c980480f6e36c70f2ae84c92981da3255bd9502170f32113d50678574d091b7: Status 404 returned error can't find the container with id 2c980480f6e36c70f2ae84c92981da3255bd9502170f32113d50678574d091b7 Mar 20 06:53:21 crc kubenswrapper[4971]: I0320 06:53:21.979216 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.003758 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzn7\" (UniqueName: \"kubernetes.io/projected/fe99e221-99e5-49c1-9cec-875a54851847-kube-api-access-tnzn7\") pod \"oauth-openshift-558db77b4-g2c5c\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.005136 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d028f881-7ad8-4805-84b8-341facd59bc2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kdv68\" (UID: \"d028f881-7ad8-4805-84b8-341facd59bc2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.005453 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t2znt" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.018158 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" event={"ID":"168eba70-d30f-4f54-88c8-403a411a16a0","Type":"ContainerStarted","Data":"2c980480f6e36c70f2ae84c92981da3255bd9502170f32113d50678574d091b7"} Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.019714 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.033498 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdv2\" (UniqueName: \"kubernetes.io/projected/181841ef-d317-458b-bb36-daa09020abdb-kube-api-access-vjdv2\") pod \"cluster-samples-operator-665b6dd947-nj4b4\" (UID: \"181841ef-d317-458b-bb36-daa09020abdb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.054985 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.057671 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghmgp\" (UniqueName: \"kubernetes.io/projected/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-kube-api-access-ghmgp\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.072162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1159c4dd-9dfe-432c-8c65-f18f589c8a66-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7jzjv\" (UID: \"1159c4dd-9dfe-432c-8c65-f18f589c8a66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.075986 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.083345 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh"] Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.088400 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mk4\" (UniqueName: \"kubernetes.io/projected/8b3d1a36-b34d-422b-96e1-0bfd934deac1-kube-api-access-b4mk4\") pod \"machine-config-operator-74547568cd-n48nl\" (UID: \"8b3d1a36-b34d-422b-96e1-0bfd934deac1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.096866 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.097210 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.097345 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.108348 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dcf\" (UniqueName: \"kubernetes.io/projected/6cc43d2b-e8e2-405a-9870-33c60db55ff2-kube-api-access-c4dcf\") pod \"authentication-operator-69f744f599-hvlj7\" (UID: \"6cc43d2b-e8e2-405a-9870-33c60db55ff2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" Mar 20 06:53:22 crc kubenswrapper[4971]: W0320 06:53:22.115090 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a5c7dc_07b9_4b2a_a8da_0a96222b932a.slice/crio-9d059a9fb790b4a38e351b1cfda11bc16623f3c7506a9593cb38b1dc37d95ef2 WatchSource:0}: Error finding container 9d059a9fb790b4a38e351b1cfda11bc16623f3c7506a9593cb38b1dc37d95ef2: Status 404 returned error can't find the container with id 9d059a9fb790b4a38e351b1cfda11bc16623f3c7506a9593cb38b1dc37d95ef2 Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.120241 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.122208 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz"] Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.139695 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqpn2\" (UniqueName: \"kubernetes.io/projected/d17d7658-ce33-4a3c-826a-1a0361e73629-kube-api-access-vqpn2\") pod \"marketplace-operator-79b997595-bj9mx\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") " pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.146190 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9q48\" (UniqueName: \"kubernetes.io/projected/971318f2-7f27-4651-9fbe-d66d37fb6b12-kube-api-access-g9q48\") pod \"openshift-apiserver-operator-796bbdcf4f-k7vhj\" (UID: \"971318f2-7f27-4651-9fbe-d66d37fb6b12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.153589 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" Mar 20 06:53:22 crc kubenswrapper[4971]: W0320 06:53:22.178742 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34c91a14_92a3_459a_a798_85b4bda4f3ca.slice/crio-cf762e12648b7a88449d9fddc1cf0ac0619b9a201bc75d37c70aa909fefdbf01 WatchSource:0}: Error finding container cf762e12648b7a88449d9fddc1cf0ac0619b9a201bc75d37c70aa909fefdbf01: Status 404 returned error can't find the container with id cf762e12648b7a88449d9fddc1cf0ac0619b9a201bc75d37c70aa909fefdbf01 Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.182254 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r7v2\" (UniqueName: \"kubernetes.io/projected/65ba3fb8-bfc9-4b6a-96ac-e901252e44b7-kube-api-access-8r7v2\") pod \"openshift-config-operator-7777fb866f-g8hs9\" (UID: \"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.190054 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a9bd8ec-b649-4b24-8eca-d0e20cef988a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-k9kzh\" (UID: \"3a9bd8ec-b649-4b24-8eca-d0e20cef988a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.196528 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.207727 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnqvm\" (UniqueName: \"kubernetes.io/projected/e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698-kube-api-access-gnqvm\") pod \"etcd-operator-b45778765-dwxp5\" (UID: \"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.236797 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.244040 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn4p7\" (UniqueName: \"kubernetes.io/projected/780881af-a5e9-44fa-9cf9-4d57513449dd-kube-api-access-tn4p7\") pod \"machine-approver-56656f9798-cwd9m\" (UID: \"780881af-a5e9-44fa-9cf9-4d57513449dd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.248138 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r8t8l"]
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.253869 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzj7w\" (UniqueName: \"kubernetes.io/projected/911a50c7-0e81-4087-9bc0-81e5206d5fd0-kube-api-access-jzj7w\") pod \"apiserver-7bbb656c7d-scq4h\" (UID: \"911a50c7-0e81-4087-9bc0-81e5206d5fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.264135 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzsm\" (UniqueName: \"kubernetes.io/projected/b7d2e412-0a4f-44b6-a326-269ac4921dae-kube-api-access-gfzsm\") pod \"collect-profiles-29566485-rnzpk\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.271130 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.286126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p95h6\" (UniqueName: \"kubernetes.io/projected/59c19ac3-a273-4527-bb7c-fa5d0ec71cba-kube-api-access-p95h6\") pod \"dns-operator-744455d44c-9ndf9\" (UID: \"59c19ac3-a273-4527-bb7c-fa5d0ec71cba\") " pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.286256 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.292593 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.299263 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.338491 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcqlb\" (UniqueName: \"kubernetes.io/projected/7d6d4012-f253-411e-9bed-2e204d9a6f67-kube-api-access-mcqlb\") pod \"dns-default-s29mb\" (UID: \"7d6d4012-f253-411e-9bed-2e204d9a6f67\") " pod="openshift-dns/dns-default-s29mb"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.340098 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpn9k\" (UniqueName: \"kubernetes.io/projected/b96a8e1e-ebdd-419f-8661-ae6e515429b0-kube-api-access-zpn9k\") pod \"catalog-operator-68c6474976-bnct9\" (UID: \"b96a8e1e-ebdd-419f-8661-ae6e515429b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.362788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzn2m\" (UniqueName: \"kubernetes.io/projected/c9ad26c3-6a8e-41df-9879-b7ff5f77fdea-kube-api-access-tzn2m\") pod \"control-plane-machine-set-operator-78cbb6b69f-fn7gp\" (UID: \"c9ad26c3-6a8e-41df-9879-b7ff5f77fdea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.367559 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7h7\" (UniqueName: \"kubernetes.io/projected/a742bd2a-b1b3-42fc-be4f-98002090b4ed-kube-api-access-rr7h7\") pod \"package-server-manager-789f6589d5-kcw47\" (UID: \"a742bd2a-b1b3-42fc-be4f-98002090b4ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.386873 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.393333 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxcvl\" (UniqueName: \"kubernetes.io/projected/a00f6082-ee0c-441f-9aad-d4d12c607e23-kube-api-access-wxcvl\") pod \"ingress-canary-xg8rk\" (UID: \"a00f6082-ee0c-441f-9aad-d4d12c607e23\") " pod="openshift-ingress-canary/ingress-canary-xg8rk"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.406104 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.411282 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.414149 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.418516 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4ckm\" (UniqueName: \"kubernetes.io/projected/a8e16069-c384-4991-8041-af49bbd32266-kube-api-access-g4ckm\") pod \"service-ca-operator-777779d784-sjq2s\" (UID: \"a8e16069-c384-4991-8041-af49bbd32266\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.434250 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0525b9e-caf1-4606-83d1-ffaf7d231b69-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkbqf\" (UID: \"b0525b9e-caf1-4606-83d1-ffaf7d231b69\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.448116 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.448291 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.457674 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56szz\" (UniqueName: \"kubernetes.io/projected/6e696339-3718-476d-a99b-1da32e10090b-kube-api-access-56szz\") pod \"machine-config-controller-84d6567774-r6vfl\" (UID: \"6e696339-3718-476d-a99b-1da32e10090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.457686 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.465041 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.480756 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.482655 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w6cm\" (UniqueName: \"kubernetes.io/projected/f7f071e5-085b-429b-8956-102ebc333825-kube-api-access-7w6cm\") pod \"machine-config-server-7bj4h\" (UID: \"f7f071e5-085b-429b-8956-102ebc333825\") " pod="openshift-machine-config-operator/machine-config-server-7bj4h"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.503981 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.504907 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwclq\" (UniqueName: \"kubernetes.io/projected/e766c0e7-ca96-4f32-9e14-81a3d5bbd389-kube-api-access-fwclq\") pod \"auto-csr-approver-29566492-86snb\" (UID: \"e766c0e7-ca96-4f32-9e14-81a3d5bbd389\") " pod="openshift-infra/auto-csr-approver-29566492-86snb"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.507449 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbtr7\" (UniqueName: \"kubernetes.io/projected/cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f-kube-api-access-tbtr7\") pod \"multus-admission-controller-857f4d67dd-kmdm2\" (UID: \"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.511799 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k5xfj"]
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.526190 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566492-86snb"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.543449 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.551024 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xg8rk"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.561574 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7bj4h"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.564176 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s29mb"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.584927 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-socket-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585016 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twgtb\" (UniqueName: \"kubernetes.io/projected/a39768b3-6060-4f23-8227-8d658942b4ed-kube-api-access-twgtb\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: \"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585054 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cb5cf3c-a623-476d-b32e-0819eb48d38f-signing-key\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585075 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-plugins-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585118 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a39768b3-6060-4f23-8227-8d658942b4ed-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: \"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585238 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a39768b3-6060-4f23-8227-8d658942b4ed-srv-cert\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: \"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585265 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e2cb781-6f5e-493d-a811-b88b641fcda2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585288 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f8e835f-f481-4bf3-8eda-9c005962ccc2-apiservice-cert\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585336 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6cb5cf3c-a623-476d-b32e-0819eb48d38f-signing-cabundle\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585389 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-certificates\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585432 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-csi-data-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585477 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585519 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxshm\" (UniqueName: \"kubernetes.io/projected/6cb5cf3c-a623-476d-b32e-0819eb48d38f-kube-api-access-hxshm\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585557 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f8e835f-f481-4bf3-8eda-9c005962ccc2-webhook-cert\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585888 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e2cb781-6f5e-493d-a811-b88b641fcda2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.585960 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-mountpoint-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: E0320 06:53:22.587581 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.087558781 +0000 UTC m=+225.067432919 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.604320 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2vq\" (UniqueName: \"kubernetes.io/projected/3897dfd7-24d8-4384-b536-24db5edccd01-kube-api-access-2z2vq\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.605506 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-tls\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.605535 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-trusted-ca\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.605555 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-bound-sa-token\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.605573 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfsxv\" (UniqueName: \"kubernetes.io/projected/4f8e835f-f481-4bf3-8eda-9c005962ccc2-kube-api-access-pfsxv\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.605810 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4f8e835f-f481-4bf3-8eda-9c005962ccc2-tmpfs\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.605873 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57dzz\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-kube-api-access-57dzz\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.606326 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-registration-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc 
kubenswrapper[4971]: I0320 06:53:22.618223 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77"]
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.629019 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68"]
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.640530 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4l9qf"]
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.655158 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fw8fs"]
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.659748 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t2znt"]
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.707584 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708077 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-socket-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708186 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twgtb\" (UniqueName: \"kubernetes.io/projected/a39768b3-6060-4f23-8227-8d658942b4ed-kube-api-access-twgtb\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: \"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708229 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cb5cf3c-a623-476d-b32e-0819eb48d38f-signing-key\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f"
Mar 20 06:53:22 crc kubenswrapper[4971]: E0320 06:53:22.708266 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.208230523 +0000 UTC m=+225.188104821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708331 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-plugins-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708384 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a39768b3-6060-4f23-8227-8d658942b4ed-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: \"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708562 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a39768b3-6060-4f23-8227-8d658942b4ed-srv-cert\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: \"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708652 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e2cb781-6f5e-493d-a811-b88b641fcda2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708684 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f8e835f-f481-4bf3-8eda-9c005962ccc2-apiservice-cert\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708799 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6cb5cf3c-a623-476d-b32e-0819eb48d38f-signing-cabundle\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708877 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-certificates\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.708998 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-csi-data-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709044 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709080 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxshm\" (UniqueName: \"kubernetes.io/projected/6cb5cf3c-a623-476d-b32e-0819eb48d38f-kube-api-access-hxshm\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709146 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f8e835f-f481-4bf3-8eda-9c005962ccc2-webhook-cert\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709202 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e2cb781-6f5e-493d-a811-b88b641fcda2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709266 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-mountpoint-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709373 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-plugins-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709490 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2vq\" (UniqueName: \"kubernetes.io/projected/3897dfd7-24d8-4384-b536-24db5edccd01-kube-api-access-2z2vq\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709579 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-tls\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709839 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-trusted-ca\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.709980 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-csi-data-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.710124 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-bound-sa-token\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.710190 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfsxv\" (UniqueName: \"kubernetes.io/projected/4f8e835f-f481-4bf3-8eda-9c005962ccc2-kube-api-access-pfsxv\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.710299 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57dzz\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-kube-api-access-57dzz\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.710339 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4f8e835f-f481-4bf3-8eda-9c005962ccc2-tmpfs\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.710623 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-registration-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.710691 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-mountpoint-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.710773 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-socket-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: E0320 06:53:22.712198 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.212176767 +0000 UTC m=+225.192050905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.712278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4f8e835f-f481-4bf3-8eda-9c005962ccc2-tmpfs\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.712743 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3897dfd7-24d8-4384-b536-24db5edccd01-registration-dir\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.713232 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e2cb781-6f5e-493d-a811-b88b641fcda2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.713322 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6cb5cf3c-a623-476d-b32e-0819eb48d38f-signing-cabundle\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.714106 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-trusted-ca\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.717213 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-certificates\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.717564 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f8e835f-f481-4bf3-8eda-9c005962ccc2-webhook-cert\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.720003 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-tls\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.721304 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a39768b3-6060-4f23-8227-8d658942b4ed-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: 
\"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.722275 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cb5cf3c-a623-476d-b32e-0819eb48d38f-signing-key\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.724421 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a39768b3-6060-4f23-8227-8d658942b4ed-srv-cert\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: \"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.725201 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f8e835f-f481-4bf3-8eda-9c005962ccc2-apiservice-cert\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.725315 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.727326 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e2cb781-6f5e-493d-a811-b88b641fcda2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.745442 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.784414 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twgtb\" (UniqueName: \"kubernetes.io/projected/a39768b3-6060-4f23-8227-8d658942b4ed-kube-api-access-twgtb\") pod \"olm-operator-6b444d44fb-mg9fg\" (UID: \"a39768b3-6060-4f23-8227-8d658942b4ed\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.789688 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxshm\" (UniqueName: \"kubernetes.io/projected/6cb5cf3c-a623-476d-b32e-0819eb48d38f-kube-api-access-hxshm\") pod \"service-ca-9c57cc56f-4g87f\" (UID: \"6cb5cf3c-a623-476d-b32e-0819eb48d38f\") " pod="openshift-service-ca/service-ca-9c57cc56f-4g87f" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.801635 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-bound-sa-token\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:22 crc 
kubenswrapper[4971]: I0320 06:53:22.808750 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfsxv\" (UniqueName: \"kubernetes.io/projected/4f8e835f-f481-4bf3-8eda-9c005962ccc2-kube-api-access-pfsxv\") pod \"packageserver-d55dfcdfc-xfqk2\" (UID: \"4f8e835f-f481-4bf3-8eda-9c005962ccc2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.814134 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:22 crc kubenswrapper[4971]: E0320 06:53:22.814309 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.314269602 +0000 UTC m=+225.294143740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.814437 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:22 crc kubenswrapper[4971]: E0320 06:53:22.814809 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.314794608 +0000 UTC m=+225.294668746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.828949 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2vq\" (UniqueName: \"kubernetes.io/projected/3897dfd7-24d8-4384-b536-24db5edccd01-kube-api-access-2z2vq\") pod \"csi-hostpathplugin-596j6\" (UID: \"3897dfd7-24d8-4384-b536-24db5edccd01\") " pod="hostpath-provisioner/csi-hostpathplugin-596j6" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.833305 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.853666 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57dzz\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-kube-api-access-57dzz\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.923183 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:22 crc kubenswrapper[4971]: E0320 06:53:22.923584 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.423563046 +0000 UTC m=+225.403437184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:22 crc kubenswrapper[4971]: I0320 06:53:22.928898 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-596j6" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.025742 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.026509 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.526484874 +0000 UTC m=+225.506359042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.069476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" event={"ID":"469b3cee-92e9-43a1-b8e7-de66a1990a8f","Type":"ContainerStarted","Data":"ee3257a007fbcbc946277e09316aafb3172bcc3f89f4f0282d4de1a0b3b24cc0"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.071492 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4g87f" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.082678 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" event={"ID":"780881af-a5e9-44fa-9cf9-4d57513449dd","Type":"ContainerStarted","Data":"d3354a4de44844f6ee98c4df00367d4bb0c2c0b052b6658cc8c1bc563c25e9d5"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.089143 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-858jn" event={"ID":"92c1ee23-ae49-48a6-828d-19ecb5573057","Type":"ContainerStarted","Data":"9e0be7b14d1cc18fae7b5808eea4f62c74f13d7432c6b8c166e0d2557f3944ae"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.089192 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-858jn" event={"ID":"92c1ee23-ae49-48a6-828d-19ecb5573057","Type":"ContainerStarted","Data":"df91564c829b86e2b2b8d363e3fd428d63b020b24ae8b006c729b4551bfc1733"} Mar 20 
06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.093929 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.098594 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.105043 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh" event={"ID":"09a5c7dc-07b9-4b2a-a8da-0a96222b932a","Type":"ContainerStarted","Data":"6999fdf1c8403571eab35f0cb51fd47f36a9fa4dd3b5e152a0ff5d528f056ca3"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.105092 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh" event={"ID":"09a5c7dc-07b9-4b2a-a8da-0a96222b932a","Type":"ContainerStarted","Data":"57b8c944ddc1a5eac923aed1fbcbe5942e9c6784f2ba39ca7c2eb7bff7027c0f"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.105103 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh" event={"ID":"09a5c7dc-07b9-4b2a-a8da-0a96222b932a","Type":"ContainerStarted","Data":"9d059a9fb790b4a38e351b1cfda11bc16623f3c7506a9593cb38b1dc37d95ef2"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.107346 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" event={"ID":"6f5bffa0-8b50-4aac-a108-edc093dfc2bf","Type":"ContainerStarted","Data":"03a31b73987b3d8c839ee5265edeb5b454c679d6d152974d3ba55e4980151910"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.111852 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4l9qf" 
event={"ID":"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2","Type":"ContainerStarted","Data":"e3c1562e1fc8e758500a9be02e76daf7fe1fb45d8b51fafbe7b87419e2d229d4"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.114495 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl"] Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.115342 4971 generic.go:334] "Generic (PLEG): container finished" podID="168eba70-d30f-4f54-88c8-403a411a16a0" containerID="44ebfbc0470e00f5f5f861e7ce3fcdf2eaddb160aa53c28c87a8128ec424cb33" exitCode=0 Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.115394 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" event={"ID":"168eba70-d30f-4f54-88c8-403a411a16a0","Type":"ContainerDied","Data":"44ebfbc0470e00f5f5f861e7ce3fcdf2eaddb160aa53c28c87a8128ec424cb33"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.119848 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dwxp5"] Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.120562 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv"] Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.129581 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.131209 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-20 06:53:23.631170584 +0000 UTC m=+225.611044722 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.136703 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7bj4h" event={"ID":"f7f071e5-085b-429b-8956-102ebc333825","Type":"ContainerStarted","Data":"3547a1011ef7e3dccf7c88e5b64a2729f42efd1309faa9bf4086e724fb8f8a0a"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.138045 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" event={"ID":"634ee46d-0ea5-4287-a7d7-55d7b0fb803b","Type":"ContainerStarted","Data":"2313394c6b7792cb804f9f9b96daaac5d95459eb183f0976b2918b3cf2973bef"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.148670 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" event={"ID":"34c91a14-92a3-459a-a798-85b4bda4f3ca","Type":"ContainerStarted","Data":"85e40868910ef5975f3c86f350505842ccc6a190b220c7d5c7fc546aa54a127c"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.148717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" event={"ID":"34c91a14-92a3-459a-a798-85b4bda4f3ca","Type":"ContainerStarted","Data":"cf762e12648b7a88449d9fddc1cf0ac0619b9a201bc75d37c70aa909fefdbf01"} 
Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.155244 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" event={"ID":"d028f881-7ad8-4805-84b8-341facd59bc2","Type":"ContainerStarted","Data":"cecdc23a92d5d45f2c3dadb2b8c270377129956529e18e3b4c7089b6db9a448a"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.160565 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r8t8l" event={"ID":"3bf8239a-db87-4f56-9da5-24b1e2248263","Type":"ContainerStarted","Data":"b269f406a0a425179dad3080a06d394cfc4338c4b93554370c7c7711407be69e"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.160617 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r8t8l" event={"ID":"3bf8239a-db87-4f56-9da5-24b1e2248263","Type":"ContainerStarted","Data":"d23cb06fcdab46e77c5414f29001670e521ebf4537e89d62dcf544a86d88d835"} Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.161719 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.190305 4971 patch_prober.go:28] interesting pod/console-operator-58897d9998-r8t8l container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.190835 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-r8t8l" podUID="3bf8239a-db87-4f56-9da5-24b1e2248263" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 20 06:53:23 crc 
kubenswrapper[4971]: I0320 06:53:23.198971 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26"] Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.201190 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9"] Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.237304 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.239392 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.739372345 +0000 UTC m=+225.719246483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.339044 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.344904 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.844879119 +0000 UTC m=+225.824753257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.361686 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tv2fz" podStartSLOduration=172.361662325 podStartE2EDuration="2m52.361662325s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:23.360566503 +0000 UTC m=+225.340440641" watchObservedRunningTime="2026-03-20 06:53:23.361662325 +0000 UTC m=+225.341536463" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.395863 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:23 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:23 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:23 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.395911 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.435243 4971 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-r8t8l" podStartSLOduration=172.435206993 podStartE2EDuration="2m52.435206993s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:23.422953319 +0000 UTC m=+225.402827487" watchObservedRunningTime="2026-03-20 06:53:23.435206993 +0000 UTC m=+225.415081131" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.441551 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.441954 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:23.941941028 +0000 UTC m=+225.921815156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.570863 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.571168 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.071129577 +0000 UTC m=+226.051003715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.572621 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.573095 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.073073274 +0000 UTC m=+226.052947412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.680534 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.681357 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.181321197 +0000 UTC m=+226.161195335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.787667 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.788058 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.288042096 +0000 UTC m=+226.267916234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.844751 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-858jn" podStartSLOduration=171.844725226 podStartE2EDuration="2m51.844725226s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:23.842963905 +0000 UTC m=+225.822838053" watchObservedRunningTime="2026-03-20 06:53:23.844725226 +0000 UTC m=+225.824599364" Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.890181 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.891187 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.39116344 +0000 UTC m=+226.371037578 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:23 crc kubenswrapper[4971]: I0320 06:53:23.992083 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:23 crc kubenswrapper[4971]: E0320 06:53:23.992623 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.492588116 +0000 UTC m=+226.472462254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.055980 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.062975 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-g2c5c"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.063035 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.093775 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.094106 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.594052992 +0000 UTC m=+226.573927130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.094179 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.094891 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.594869016 +0000 UTC m=+226.574743154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.156446 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:24 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:24 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:24 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.156924 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.169964 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4gfgh" podStartSLOduration=172.169933479 podStartE2EDuration="2m52.169933479s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:24.167192639 +0000 UTC m=+226.147066777" watchObservedRunningTime="2026-03-20 06:53:24.169933479 +0000 UTC m=+226.149807617" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.186487 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" event={"ID":"fe99e221-99e5-49c1-9cec-875a54851847","Type":"ContainerStarted","Data":"6f57b27b633986b947007cf31750d1c580be9e9e6cf4069c71b132de2b00e71c"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.188874 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" event={"ID":"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698","Type":"ContainerStarted","Data":"630e8a4b39f9f1ae36b5a59bc60cfd65524122fbf4f96a4f448549365b8269ef"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.193032 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t2znt" event={"ID":"0939bb94-8858-43fc-8443-686e696beaa1","Type":"ContainerStarted","Data":"70523da7f2ba354e645feaa789263720411e0b177dba4b54323bf06972789aaa"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.193071 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t2znt" event={"ID":"0939bb94-8858-43fc-8443-686e696beaa1","Type":"ContainerStarted","Data":"f5524839fd8b8f4263b05bc9f258530df1b4d05097ff127e0fd4a296d213e5ab"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.194894 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-t2znt" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.195301 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.195648 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.695628332 +0000 UTC m=+226.675502470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.205484 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2znt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.205547 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2znt" podUID="0939bb94-8858-43fc-8443-686e696beaa1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.222598 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" event={"ID":"780881af-a5e9-44fa-9cf9-4d57513449dd","Type":"ContainerStarted","Data":"1d1af9ee288ea9f4c70bc2fb3a07b72fbeab2787e82ddbc1495b79f8fa6aa17d"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.232093 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" 
event={"ID":"8ad08056-729b-4fb9-8049-9a27c674e520","Type":"ContainerStarted","Data":"5bacdbdb1ef89dcef9e824baff78c88d73896a2e3b97198a718a45dd5e27829e"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.248425 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" event={"ID":"634ee46d-0ea5-4287-a7d7-55d7b0fb803b","Type":"ContainerStarted","Data":"75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.249367 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.253096 4971 generic.go:334] "Generic (PLEG): container finished" podID="65ba3fb8-bfc9-4b6a-96ac-e901252e44b7" containerID="99e491153d6205e23b17a009a96d50bde42f3e4b9db22cb5a8cf85d2c69e2432" exitCode=0 Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.253186 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" event={"ID":"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7","Type":"ContainerDied","Data":"99e491153d6205e23b17a009a96d50bde42f3e4b9db22cb5a8cf85d2c69e2432"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.253212 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" event={"ID":"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7","Type":"ContainerStarted","Data":"ba345804da925fc13334abf74125917af975b184f0c9b1d8d0f2e55e2e16903a"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.265935 4971 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-tzg77 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: 
connect: connection refused" start-of-body= Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.266006 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" podUID="634ee46d-0ea5-4287-a7d7-55d7b0fb803b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.284410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4l9qf" event={"ID":"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2","Type":"ContainerStarted","Data":"b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.297991 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.299057 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.799013755 +0000 UTC m=+226.778887893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.335902 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" event={"ID":"469b3cee-92e9-43a1-b8e7-de66a1990a8f","Type":"ContainerStarted","Data":"70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.337002 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.346121 4971 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-k5xfj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.346196 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" podUID="469b3cee-92e9-43a1-b8e7-de66a1990a8f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.355805 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" 
event={"ID":"1159c4dd-9dfe-432c-8c65-f18f589c8a66","Type":"ContainerStarted","Data":"de9faab9184fa69c6581fee04ee3e74b5d22344ca104f41eda883ddc2ea669af"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.372226 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" event={"ID":"d028f881-7ad8-4805-84b8-341facd59bc2","Type":"ContainerStarted","Data":"53bebbf2c1909603f65eeffb7d2e7b3988225bbc842097b4188f533ce2bfdc24"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.396964 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7bj4h" event={"ID":"f7f071e5-085b-429b-8956-102ebc333825","Type":"ContainerStarted","Data":"f092fdb3f98d9601bee0dbaee8d70b55760cb5bf739f5bd2396486e7e31e5ec3"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.398094 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" event={"ID":"6f5bffa0-8b50-4aac-a108-edc093dfc2bf","Type":"ContainerStarted","Data":"10dca0b8f8ec236a40a0b62b8437920f763329a3e0d3d543217aa4a7c06020f7"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.398744 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.401614 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" event={"ID":"8b3d1a36-b34d-422b-96e1-0bfd934deac1","Type":"ContainerStarted","Data":"d0df32409ec2450465b455896c789971c750974139cf10a9dcfc2d5063d967ae"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.401666 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" event={"ID":"8b3d1a36-b34d-422b-96e1-0bfd934deac1","Type":"ContainerStarted","Data":"e1029cc474a44abd5283adfe033ac2488d5d7309552742b891bd3c4416fffe2a"} Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.403905 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" event={"ID":"52969add-873f-41ae-b0a9-e36924daac43","Type":"ContainerStarted","Data":"dea049432badf922d566b8f3148384def8d8ccb9054ac1ef2fab35eaaa9d430a"} Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.404304 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:24.904271101 +0000 UTC m=+226.884145239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.423407 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-r8t8l" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.493470 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.506259 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.508359 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bj9mx"] Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.513901 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.013870393 +0000 UTC m=+226.993744531 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.541713 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.543215 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4l9qf" podStartSLOduration=173.543188942 podStartE2EDuration="2m53.543188942s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:24.506113609 +0000 UTC m=+226.485987767" watchObservedRunningTime="2026-03-20 06:53:24.543188942 +0000 UTC m=+226.523063080" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.579893 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.611109 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.611662 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.111639493 +0000 UTC m=+227.091513631 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.612204 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.618728 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7bj4h" podStartSLOduration=5.618703097 podStartE2EDuration="5.618703097s" podCreationTimestamp="2026-03-20 06:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:24.532890324 +0000 UTC m=+226.512764452" watchObservedRunningTime="2026-03-20 06:53:24.618703097 +0000 UTC m=+226.598577235" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.639181 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-t2znt" podStartSLOduration=173.639154009 podStartE2EDuration="2m53.639154009s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:24.570024629 +0000 
UTC m=+226.549898787" watchObservedRunningTime="2026-03-20 06:53:24.639154009 +0000 UTC m=+226.619028147" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.675910 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.732677 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.733053 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.233039777 +0000 UTC m=+227.212913915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.770139 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.770183 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566492-86snb"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.770193 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.770336 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s"] Mar 20 06:53:24 crc kubenswrapper[4971]: W0320 06:53:24.770592 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96a8e1e_ebdd_419f_8661_ae6e515429b0.slice/crio-ba73498f85ec37e1bad353aa9896337637b0e5acb96f8e2eb22e8baed702f71e WatchSource:0}: Error finding container ba73498f85ec37e1bad353aa9896337637b0e5acb96f8e2eb22e8baed702f71e: Status 404 returned error can't find the container with id ba73498f85ec37e1bad353aa9896337637b0e5acb96f8e2eb22e8baed702f71e Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.772858 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9"] Mar 20 06:53:24 crc kubenswrapper[4971]: 
I0320 06:53:24.778023 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hvlj7"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.782173 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9ndf9"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.784896 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kdv68" podStartSLOduration=173.784857757 podStartE2EDuration="2m53.784857757s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:24.695644644 +0000 UTC m=+226.675518802" watchObservedRunningTime="2026-03-20 06:53:24.784857757 +0000 UTC m=+226.764731895" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.785165 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xg8rk"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.792028 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.797426 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4g87f"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.799310 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s29mb"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.814993 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" podStartSLOduration=172.814975308 podStartE2EDuration="2m52.814975308s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:24.732317646 +0000 UTC m=+226.712191784" watchObservedRunningTime="2026-03-20 06:53:24.814975308 +0000 UTC m=+226.794849436" Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.820895 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-596j6"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.823489 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.833485 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.835121 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.335098171 +0000 UTC m=+227.314972459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.835275 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.837111 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.337102009 +0000 UTC m=+227.316976147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: W0320 06:53:24.838954 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9bd8ec_b649_4b24_8eca_d0e20cef988a.slice/crio-d51f193f0868c0fed274a6670046db5697b5a441554fbc695768bb00b3b45799 WatchSource:0}: Error finding container d51f193f0868c0fed274a6670046db5697b5a441554fbc695768bb00b3b45799: Status 404 returned error can't find the container with id d51f193f0868c0fed274a6670046db5697b5a441554fbc695768bb00b3b45799 Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.860419 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kmdm2"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.863093 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.864662 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2"] Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.865104 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" podStartSLOduration=173.865090589 podStartE2EDuration="2m53.865090589s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:24.78808973 +0000 UTC m=+226.767963888" watchObservedRunningTime="2026-03-20 06:53:24.865090589 +0000 UTC m=+226.844964717" Mar 20 06:53:24 crc kubenswrapper[4971]: W0320 06:53:24.868555 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda00f6082_ee0c_441f_9aad_d4d12c607e23.slice/crio-6e419c3c7c1920a980cf4523dc5c8c8e2029fe45d4d64c901bddbeab4372a2b4 WatchSource:0}: Error finding container 6e419c3c7c1920a980cf4523dc5c8c8e2029fe45d4d64c901bddbeab4372a2b4: Status 404 returned error can't find the container with id 6e419c3c7c1920a980cf4523dc5c8c8e2029fe45d4d64c901bddbeab4372a2b4 Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.939076 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:24 crc kubenswrapper[4971]: E0320 06:53:24.939508 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.439489422 +0000 UTC m=+227.419363560 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:24 crc kubenswrapper[4971]: I0320 06:53:24.994334 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k5xfj"] Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.040191 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.040927 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.540910578 +0000 UTC m=+227.520784716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.056575 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77"] Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.110017 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:25 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:25 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:25 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.110534 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.142521 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.142935 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.64291755 +0000 UTC m=+227.622791688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.245133 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.245501 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.745486709 +0000 UTC m=+227.725360847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.346966 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.348740 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.848598573 +0000 UTC m=+227.828472711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.449239 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.449702 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:25.949685629 +0000 UTC m=+227.929559767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.451264 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" event={"ID":"a742bd2a-b1b3-42fc-be4f-98002090b4ed","Type":"ContainerStarted","Data":"b0e016394f98a4393d83829a409ae584dbc13be5a3f43b5f51537a10d17609de"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.456282 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" event={"ID":"b7d2e412-0a4f-44b6-a326-269ac4921dae","Type":"ContainerStarted","Data":"dcb53514c699db07f3cb16566e3180e0df4a7247d933959a1c7ef34c8d794f9f"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.457805 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" event={"ID":"b96a8e1e-ebdd-419f-8661-ae6e515429b0","Type":"ContainerStarted","Data":"ba73498f85ec37e1bad353aa9896337637b0e5acb96f8e2eb22e8baed702f71e"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.459108 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" event={"ID":"181841ef-d317-458b-bb36-daa09020abdb","Type":"ContainerStarted","Data":"ef9b0540dce8d50bc927775f75ea141ba0bcf5873192478a159dde305a603e08"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.459136 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" event={"ID":"181841ef-d317-458b-bb36-daa09020abdb","Type":"ContainerStarted","Data":"a58f4fc37ef2b57913f6b281487b2a5844bbc9af177500c3af8df61e41062d5e"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.467994 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" event={"ID":"6f5bffa0-8b50-4aac-a108-edc093dfc2bf","Type":"ContainerStarted","Data":"71f6f597f9c9d96cebcf1d1dc25113a0b35331c8a54b8e291415d674d0a722e3"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.477441 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" event={"ID":"a8e16069-c384-4991-8041-af49bbd32266","Type":"ContainerStarted","Data":"dfcdd53b7ba1fca1feca4bff4da3374e73f51b0b3bf7692525deb776e6972e97"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.500803 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" event={"ID":"52969add-873f-41ae-b0a9-e36924daac43","Type":"ContainerStarted","Data":"eb871a242cf5368b92a934ef7fe55d3bb89349eb7edb5a22c2574965bae31262"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.502578 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw8fs" podStartSLOduration=173.502542089 podStartE2EDuration="2m53.502542089s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:25.500200651 +0000 UTC m=+227.480074789" watchObservedRunningTime="2026-03-20 06:53:25.502542089 +0000 UTC m=+227.482416227" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.560647 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.565324 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" event={"ID":"6cc43d2b-e8e2-405a-9870-33c60db55ff2","Type":"ContainerStarted","Data":"5b73dd649fa06adeb2565c46dd726bdb3f12a0d21bf98d37b34b8595ddd2a1ba"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.565687 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.565724 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.565773 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.565801 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.567013 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.066979694 +0000 UTC m=+228.046853832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.567490 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l5r26" podStartSLOduration=173.567474198 podStartE2EDuration="2m53.567474198s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:25.551142585 +0000 UTC m=+227.531016723" watchObservedRunningTime="2026-03-20 06:53:25.567474198 +0000 UTC m=+227.547348336" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.575674 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.595402 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.597165 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.597649 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.634983 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" event={"ID":"6e696339-3718-476d-a99b-1da32e10090b","Type":"ContainerStarted","Data":"9f1f6e056d0667abbcd44fbd4101fa8008027ae3d96dd14256987d6e6c040bbd"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.656586 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" event={"ID":"fe99e221-99e5-49c1-9cec-875a54851847","Type":"ContainerStarted","Data":"891259158e431359136fb2ac4966aa8329dbf23f371f0fa70390e636d2448b32"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.660078 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.661651 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.662132 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566492-86snb" event={"ID":"e766c0e7-ca96-4f32-9e14-81a3d5bbd389","Type":"ContainerStarted","Data":"e71ffb773c4ef6545828c6a07f36769162aa30216f11625a44b300e7e0609f3b"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.672390 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" event={"ID":"8ad08056-729b-4fb9-8049-9a27c674e520","Type":"ContainerStarted","Data":"774055128f20d133e3358998bf68a845b1df0ae72770235b074d9c7893f2fc2e"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.674790 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.677066 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.680509 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.180488619 +0000 UTC m=+228.160362757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.684950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" event={"ID":"3a9bd8ec-b649-4b24-8eca-d0e20cef988a","Type":"ContainerStarted","Data":"d51f193f0868c0fed274a6670046db5697b5a441554fbc695768bb00b3b45799"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.694386 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.696382 4971 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-g2c5c container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.696424 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" podUID="fe99e221-99e5-49c1-9cec-875a54851847" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.720820 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" podStartSLOduration=174.720783575 podStartE2EDuration="2m54.720783575s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:25.719690804 +0000 UTC m=+227.699564952" watchObservedRunningTime="2026-03-20 06:53:25.720783575 +0000 UTC m=+227.700657713" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.751132 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" event={"ID":"8b3d1a36-b34d-422b-96e1-0bfd934deac1","Type":"ContainerStarted","Data":"ac61f375dc3c2adb0936048d96524c50e34ffdaff343f7463e8c675e1df7d3cf"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.753123 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jw2zn" podStartSLOduration=173.753110931 podStartE2EDuration="2m53.753110931s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:25.752206245 +0000 UTC m=+227.732080383" watchObservedRunningTime="2026-03-20 06:53:25.753110931 +0000 UTC m=+227.732985069" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.776458 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.778114 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.278091864 +0000 UTC m=+228.257966002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.780679 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" event={"ID":"d17d7658-ce33-4a3c-826a-1a0361e73629","Type":"ContainerStarted","Data":"856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.780734 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" event={"ID":"d17d7658-ce33-4a3c-826a-1a0361e73629","Type":"ContainerStarted","Data":"964360ce05a4ce669a32572f50f65373f705c414575fbfeb10241253155b829e"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.781305 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.795914 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n48nl" podStartSLOduration=173.795882709 podStartE2EDuration="2m53.795882709s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:25.791386779 +0000 UTC m=+227.771260917" watchObservedRunningTime="2026-03-20 06:53:25.795882709 +0000 UTC m=+227.775756847" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.803480 4971 
patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bj9mx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.803543 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" podUID="d17d7658-ce33-4a3c-826a-1a0361e73629" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.819794 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" event={"ID":"1159c4dd-9dfe-432c-8c65-f18f589c8a66","Type":"ContainerStarted","Data":"3f165a7d67ec793a3d02b2e7f8789968d1a23ac989e8ea38e3a818db9dc0951a"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.834095 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" podStartSLOduration=173.834073714 podStartE2EDuration="2m53.834073714s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:25.83254579 +0000 UTC m=+227.812419948" watchObservedRunningTime="2026-03-20 06:53:25.834073714 +0000 UTC m=+227.813947852" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.845530 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" event={"ID":"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f","Type":"ContainerStarted","Data":"8ff48ce94cba86700a7052378e5df22f5720da1082a5efaf5b08615cb6d0a1b8"} Mar 20 06:53:25 
crc kubenswrapper[4971]: I0320 06:53:25.868560 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4g87f" event={"ID":"6cb5cf3c-a623-476d-b32e-0819eb48d38f","Type":"ContainerStarted","Data":"6e55f281df97fd7670e78414f1351c55d6cd6558a5a905a658809939647ffde2"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.870235 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" event={"ID":"c9ad26c3-6a8e-41df-9879-b7ff5f77fdea","Type":"ContainerStarted","Data":"88d85df2ce7aab693bd3f44695389ae08134cdcec72d1d849e00010afe66da14"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.874791 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" event={"ID":"780881af-a5e9-44fa-9cf9-4d57513449dd","Type":"ContainerStarted","Data":"5386aeeb8b063b2038b5ccd84f8aff2dcb6f9f3e634a89a5c6135f9bc287de30"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.880984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.883566 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.383540896 +0000 UTC m=+228.363415034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.884265 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7jzjv" podStartSLOduration=173.884248847 podStartE2EDuration="2m53.884248847s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:25.883403242 +0000 UTC m=+227.863277390" watchObservedRunningTime="2026-03-20 06:53:25.884248847 +0000 UTC m=+227.864122985" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.900274 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" event={"ID":"59c19ac3-a273-4527-bb7c-fa5d0ec71cba","Type":"ContainerStarted","Data":"2ef4634415527de96fe0d4cd9f071a4fb7b8473550291f2a20d2a453dbbe5997"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.929330 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwd9m" podStartSLOduration=174.929305541 podStartE2EDuration="2m54.929305541s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:25.924980626 +0000 UTC m=+227.904854754" watchObservedRunningTime="2026-03-20 06:53:25.929305541 +0000 UTC 
m=+227.909179679" Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.939684 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" event={"ID":"b0525b9e-caf1-4606-83d1-ffaf7d231b69","Type":"ContainerStarted","Data":"770e2281c0dc50ede45bcee38e7e70e53dd0ce9ce5318ca6319e7d0f2c7e4376"} Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.991160 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:25 crc kubenswrapper[4971]: E0320 06:53:25.991659 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.491643085 +0000 UTC m=+228.471517223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:25 crc kubenswrapper[4971]: I0320 06:53:25.991849 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" event={"ID":"971318f2-7f27-4651-9fbe-d66d37fb6b12","Type":"ContainerStarted","Data":"3c7e64772d5b296cfc8748adade593ef41f2aeaf69e5eedc1db58bb91e6963dc"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.008314 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" event={"ID":"4f8e835f-f481-4bf3-8eda-9c005962ccc2","Type":"ContainerStarted","Data":"2217be0e6860ef4242b060080592089a002a4a02d8619ee487c811e06664e486"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.018679 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-596j6" event={"ID":"3897dfd7-24d8-4384-b536-24db5edccd01","Type":"ContainerStarted","Data":"20e9d8526206680d3268cd1565715fb8c123d13e29b2cc8d94f74fe06b745ef2"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.041917 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" event={"ID":"911a50c7-0e81-4087-9bc0-81e5206d5fd0","Type":"ContainerStarted","Data":"cb0457cb2ce3eaceaca68de7ef50d36f4cb9d39e82b1d4f83ecec51be4951d8d"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.059283 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" 
event={"ID":"e1a9fe49-1dd3-48c9-a4c7-33ce3cf9f698","Type":"ContainerStarted","Data":"f30db1c37a2327b26d3c50fc28a027e7a6fa2c4c9c3b3fd8f05d0d920e6fea8c"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.079909 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s29mb" event={"ID":"7d6d4012-f253-411e-9bed-2e204d9a6f67","Type":"ContainerStarted","Data":"81709259726d14006303ac0b0582fbd9d34681e3315c88ffe612e3f889b6fc66"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.093538 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.094087 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.594064299 +0000 UTC m=+228.573938437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.109382 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dwxp5" podStartSLOduration=175.109362122 podStartE2EDuration="2m55.109362122s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:26.109362322 +0000 UTC m=+228.089236460" watchObservedRunningTime="2026-03-20 06:53:26.109362122 +0000 UTC m=+228.089236260" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.117087 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:26 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:26 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:26 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.117169 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.117805 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-xg8rk" event={"ID":"a00f6082-ee0c-441f-9aad-d4d12c607e23","Type":"ContainerStarted","Data":"6e419c3c7c1920a980cf4523dc5c8c8e2029fe45d4d64c901bddbeab4372a2b4"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.153195 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" event={"ID":"a39768b3-6060-4f23-8227-8d658942b4ed","Type":"ContainerStarted","Data":"3329b7c5110480461b17e28d5a71ace18866909815d29c2c4b7e059ac4b21b3a"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.171522 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" event={"ID":"65ba3fb8-bfc9-4b6a-96ac-e901252e44b7","Type":"ContainerStarted","Data":"8588bfa3a359e6d5d788a2366fba1619f4c3c08c2e4f2daf8730835c0438bdef"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.172682 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.184952 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" event={"ID":"168eba70-d30f-4f54-88c8-403a411a16a0","Type":"ContainerStarted","Data":"8c6fafead9b45e61891ec4f2ae5a003f096ed62647b1f0b687a483b81a83bb3d"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.185027 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" event={"ID":"168eba70-d30f-4f54-88c8-403a411a16a0","Type":"ContainerStarted","Data":"6bcf38e4c7a3252331d8524c883c865d2500df79e0d32014a229d8c25903261c"} Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.189799 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2znt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.189864 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2znt" podUID="0939bb94-8858-43fc-8443-686e696beaa1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.196550 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.197350 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.697332528 +0000 UTC m=+228.677206656 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.208549 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.211250 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.229868 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.230423 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.730404046 +0000 UTC m=+228.710278184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.284432 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36254: no serving certificate available for the kubelet" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.315665 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" podStartSLOduration=175.315600501 podStartE2EDuration="2m55.315600501s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:26.24782561 +0000 UTC m=+228.227699768" watchObservedRunningTime="2026-03-20 06:53:26.315600501 +0000 UTC m=+228.295474629" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.331783 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.332547 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 06:53:26.832529311 +0000 UTC m=+228.812403439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.438811 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.438984 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.439238 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:26.939222269 +0000 UTC m=+228.919096407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.439395 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.485669 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36258: no serving certificate available for the kubelet" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.542121 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.542550 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.0425296 +0000 UTC m=+229.022403738 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.649998 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.650500 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.150475174 +0000 UTC m=+229.130349322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.726732 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36272: no serving certificate available for the kubelet" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.756597 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.756755 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.256726679 +0000 UTC m=+229.236600817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.756867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.757275 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.257257234 +0000 UTC m=+229.237131372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.883542 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.884045 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.384027563 +0000 UTC m=+229.363901701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.888008 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36274: no serving certificate available for the kubelet" Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.985896 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:26 crc kubenswrapper[4971]: E0320 06:53:26.986372 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.486353124 +0000 UTC m=+229.466227262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:26 crc kubenswrapper[4971]: I0320 06:53:26.987983 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36280: no serving certificate available for the kubelet" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.089339 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.089866 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.58984732 +0000 UTC m=+229.569721458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.121072 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:27 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:27 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:27 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.121166 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.121917 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36282: no serving certificate available for the kubelet" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.187257 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" podStartSLOduration=176.187236848 podStartE2EDuration="2m56.187236848s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:26.635250033 +0000 UTC m=+228.615124171" watchObservedRunningTime="2026-03-20 
06:53:27.187236848 +0000 UTC m=+229.167110986" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.191456 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.191896 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.691879813 +0000 UTC m=+229.671753951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.234746 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36284: no serving certificate available for the kubelet" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.295489 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.295646 4971 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.795625386 +0000 UTC m=+229.775499524 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.295988 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.296507 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.79648039 +0000 UTC m=+229.776354528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.305629 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" event={"ID":"c9ad26c3-6a8e-41df-9879-b7ff5f77fdea","Type":"ContainerStarted","Data":"241224fa6c2ddf9b97f2f722c8fef1349f66d8ff1024d95c4e7cc60bebb995f4"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.308030 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" event={"ID":"971318f2-7f27-4651-9fbe-d66d37fb6b12","Type":"ContainerStarted","Data":"06fc20b6768899b00d3810dc19278965a8bb84fafa0d8ee8588665acbed36695"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.338405 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" event={"ID":"b96a8e1e-ebdd-419f-8661-ae6e515429b0","Type":"ContainerStarted","Data":"51315ed82ccdc71b7825834cfb5bec90a821b1be7850d3f9b0e3ec6d37a6817b"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.339597 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.369080 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" 
event={"ID":"b7d2e412-0a4f-44b6-a326-269ac4921dae","Type":"ContainerStarted","Data":"034740c7fd2522f3622e7a92b4698c566929791c244c307efc89fc747e087833"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.389821 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" event={"ID":"3a9bd8ec-b649-4b24-8eca-d0e20cef988a","Type":"ContainerStarted","Data":"1500f2f88397a62b448c6c29a10d7aef0db2386150c59ed185d124eb9cfcc654"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.389953 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.395375 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" event={"ID":"6cc43d2b-e8e2-405a-9870-33c60db55ff2","Type":"ContainerStarted","Data":"581e88e602b6a92f08a5033c33d9630348b173d7105fa72c19e6e4f9a889c29a"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.396954 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.398212 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:27.898195064 +0000 UTC m=+229.878069202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.423479 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xg8rk" event={"ID":"a00f6082-ee0c-441f-9aad-d4d12c607e23","Type":"ContainerStarted","Data":"c176b10710670c96a8074f40dae95382e51af58b035cd8ba9cb499d578c5b567"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.426836 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fn7gp" podStartSLOduration=175.426811993 podStartE2EDuration="2m55.426811993s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.371708988 +0000 UTC m=+229.351583126" watchObservedRunningTime="2026-03-20 06:53:27.426811993 +0000 UTC m=+229.406686131" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.447541 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" event={"ID":"a39768b3-6060-4f23-8227-8d658942b4ed","Type":"ContainerStarted","Data":"d1ab57841877ca2c8f183d3eb89e96dc86c2de74e344d5b01651cef0332de442"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.447813 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.453963 4971 
???:1] "http: TLS handshake error from 192.168.126.11:36290: no serving certificate available for the kubelet" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.476582 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bnct9" podStartSLOduration=175.476554892 podStartE2EDuration="2m55.476554892s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.427121121 +0000 UTC m=+229.406995259" watchObservedRunningTime="2026-03-20 06:53:27.476554892 +0000 UTC m=+229.456429020" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.477072 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7vhj" podStartSLOduration=176.477065967 podStartE2EDuration="2m56.477065967s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.465701268 +0000 UTC m=+229.445575406" watchObservedRunningTime="2026-03-20 06:53:27.477065967 +0000 UTC m=+229.456940105" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.500439 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.502105 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.002082771 +0000 UTC m=+229.981956909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.502466 4971 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mg9fg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.502534 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" podUID="a39768b3-6060-4f23-8227-8d658942b4ed" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.513268 4971 generic.go:334] "Generic (PLEG): container finished" podID="911a50c7-0e81-4087-9bc0-81e5206d5fd0" containerID="29a41eb5d63fba23ad85f59de5fb20b404079042ed400269e8e0683791f1febc" exitCode=0 Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.513386 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" event={"ID":"911a50c7-0e81-4087-9bc0-81e5206d5fd0","Type":"ContainerDied","Data":"29a41eb5d63fba23ad85f59de5fb20b404079042ed400269e8e0683791f1febc"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.525524 
4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" event={"ID":"b0525b9e-caf1-4606-83d1-ffaf7d231b69","Type":"ContainerStarted","Data":"1110e0aacae68dd425a151e9f533ea536b016fc9cf5fac6b7047a9462e7c6848"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.536723 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" podStartSLOduration=176.536680863 podStartE2EDuration="2m56.536680863s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.502933776 +0000 UTC m=+229.482807934" watchObservedRunningTime="2026-03-20 06:53:27.536680863 +0000 UTC m=+229.516555001" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.564224 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" event={"ID":"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f","Type":"ContainerStarted","Data":"62734b60f8de8a62f00b1bd459f404ef714c2cac56a3c35f250dc9609d0a1cde"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.590183 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" event={"ID":"181841ef-d317-458b-bb36-daa09020abdb","Type":"ContainerStarted","Data":"d7424adaa24d0a697100dc4119f40053d12b8a86a79315b2fd8251aca7b09811"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.591422 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" podStartSLOduration=175.591395926 podStartE2EDuration="2m55.591395926s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.588864053 +0000 UTC m=+229.568738191" watchObservedRunningTime="2026-03-20 06:53:27.591395926 +0000 UTC m=+229.571270064" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.603082 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.603935 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.103909918 +0000 UTC m=+230.083784056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.604487 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" event={"ID":"4f8e835f-f481-4bf3-8eda-9c005962ccc2","Type":"ContainerStarted","Data":"04eb9aa1deb1b572274d0b1f21b8cf59ba57468b193fa4ee317dc27bdbc69495"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.605693 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.634006 4971 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xfqk2 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.634083 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" podUID="4f8e835f-f481-4bf3-8eda-9c005962ccc2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.652816 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hvlj7" 
podStartSLOduration=176.652797373 podStartE2EDuration="2m56.652797373s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.651118505 +0000 UTC m=+229.630992643" watchObservedRunningTime="2026-03-20 06:53:27.652797373 +0000 UTC m=+229.632671511" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.694900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4g87f" event={"ID":"6cb5cf3c-a623-476d-b32e-0819eb48d38f","Type":"ContainerStarted","Data":"b918226b82ca4874d1eb7a1bb1c287da0252193e83cd45cfea0a2d1122800927"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.710226 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xg8rk" podStartSLOduration=8.710190225 podStartE2EDuration="8.710190225s" podCreationTimestamp="2026-03-20 06:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.69931415 +0000 UTC m=+229.679188298" watchObservedRunningTime="2026-03-20 06:53:27.710190225 +0000 UTC m=+229.690064373" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.751072 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" event={"ID":"a8e16069-c384-4991-8041-af49bbd32266","Type":"ContainerStarted","Data":"293d1f032b9d88e9bef9cf7f82df5d25de77be5f49ecb20c8921e1d5cc1451f7"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.761782 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: 
\"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.761785 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" event={"ID":"59c19ac3-a273-4527-bb7c-fa5d0ec71cba","Type":"ContainerStarted","Data":"e982eedc882e708a822c0211e090ce2566ac7305eecde68722d681c219bc1deb"} Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.762565 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.26254061 +0000 UTC m=+230.242414748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.822353 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" podStartSLOduration=175.82233514 podStartE2EDuration="2m55.82233514s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.781380455 +0000 UTC m=+229.761254583" watchObservedRunningTime="2026-03-20 06:53:27.82233514 +0000 UTC m=+229.802209278" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.855896 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" event={"ID":"6e696339-3718-476d-a99b-1da32e10090b","Type":"ContainerStarted","Data":"fe997853e34f96aedf1721a1105c4c6697e5e1d4fde8f6e583ae1105d0265a6c"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.864320 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.866170 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.366147618 +0000 UTC m=+230.346021756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.870248 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" event={"ID":"a742bd2a-b1b3-42fc-be4f-98002090b4ed","Type":"ContainerStarted","Data":"fca6074eaf6c5d2f04230c5a513e1f2a3b0048b5c737d219678ad8902c2f71cd"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.870302 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" event={"ID":"a742bd2a-b1b3-42fc-be4f-98002090b4ed","Type":"ContainerStarted","Data":"c0dd9f0529387aaa5ab852c834fa793bdc2a35f7cd46d4eff8891e07ea69ab1f"} Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.874077 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" podUID="634ee46d-0ea5-4287-a7d7-55d7b0fb803b" containerName="route-controller-manager" containerID="cri-o://75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51" gracePeriod=30 Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.874428 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2znt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.874502 4971 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2znt" podUID="0939bb94-8858-43fc-8443-686e696beaa1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.874559 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" podUID="469b3cee-92e9-43a1-b8e7-de66a1990a8f" containerName="controller-manager" containerID="cri-o://70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd" gracePeriod=30 Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.874597 4971 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bj9mx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.874636 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" podUID="d17d7658-ce33-4a3c-826a-1a0361e73629" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.905501 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.930828 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-g8hs9" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.931094 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkbqf" podStartSLOduration=175.931073608 podStartE2EDuration="2m55.931073608s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.82197517 +0000 UTC m=+229.801849308" watchObservedRunningTime="2026-03-20 06:53:27.931073608 +0000 UTC m=+229.910947746" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.975096 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4g87f" podStartSLOduration=175.975079551 podStartE2EDuration="2m55.975079551s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.97467876 +0000 UTC m=+229.954552898" watchObservedRunningTime="2026-03-20 06:53:27.975079551 +0000 UTC m=+229.954953689" Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.975371 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:27 crc kubenswrapper[4971]: E0320 06:53:27.975995 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.475976377 +0000 UTC m=+230.455850515 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:27 crc kubenswrapper[4971]: I0320 06:53:27.976862 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nj4b4" podStartSLOduration=176.976853483 podStartE2EDuration="2m56.976853483s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:27.930708477 +0000 UTC m=+229.910582625" watchObservedRunningTime="2026-03-20 06:53:27.976853483 +0000 UTC m=+229.956727621" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.087146 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.089787 4971 patch_prober.go:28] interesting pod/apiserver-76f77b778f-tp5rq container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]log ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]etcd ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 06:53:28 crc kubenswrapper[4971]: 
[+]poststarthook/generic-apiserver-start-informers ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/max-in-flight-filter ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 06:53:28 crc kubenswrapper[4971]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 06:53:28 crc kubenswrapper[4971]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/openshift.io-startinformers ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 06:53:28 crc kubenswrapper[4971]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 06:53:28 crc kubenswrapper[4971]: livez check failed Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.089850 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" podUID="168eba70-d30f-4f54-88c8-403a411a16a0" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.097594 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.597554226 +0000 UTC m=+230.577428364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.110987 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:28 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:28 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:28 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.111067 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.151283 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" podStartSLOduration=176.15125971 podStartE2EDuration="2m56.15125971s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:28.103515799 +0000 UTC m=+230.083389937" watchObservedRunningTime="2026-03-20 06:53:28.15125971 +0000 UTC m=+230.131133848" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.152573 4971 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sjq2s" podStartSLOduration=176.152567448 podStartE2EDuration="2m56.152567448s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:28.150242311 +0000 UTC m=+230.130116449" watchObservedRunningTime="2026-03-20 06:53:28.152567448 +0000 UTC m=+230.132441586" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.184035 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36304: no serving certificate available for the kubelet" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.201472 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.201897 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.701879026 +0000 UTC m=+230.681753164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.304990 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.305753 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.805734912 +0000 UTC m=+230.785609050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.408394 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.408788 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:28.908774374 +0000 UTC m=+230.888648512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.509770 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.509948 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.009912671 +0000 UTC m=+230.989786809 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.510085 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.510442 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.010426986 +0000 UTC m=+230.990301114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.611762 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.612518 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.112467239 +0000 UTC m=+231.092341387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.685190 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.717016 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-serving-cert\") pod \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.717353 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-client-ca\") pod \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.717380 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-config\") pod \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.717416 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk7n7\" (UniqueName: \"kubernetes.io/projected/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-kube-api-access-sk7n7\") pod \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\" (UID: \"634ee46d-0ea5-4287-a7d7-55d7b0fb803b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.717583 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:28 crc kubenswrapper[4971]: 
E0320 06:53:28.717992 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.217976513 +0000 UTC m=+231.197850651 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.718661 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-client-ca" (OuterVolumeSpecName: "client-ca") pod "634ee46d-0ea5-4287-a7d7-55d7b0fb803b" (UID: "634ee46d-0ea5-4287-a7d7-55d7b0fb803b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.719084 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-config" (OuterVolumeSpecName: "config") pod "634ee46d-0ea5-4287-a7d7-55d7b0fb803b" (UID: "634ee46d-0ea5-4287-a7d7-55d7b0fb803b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.737420 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "634ee46d-0ea5-4287-a7d7-55d7b0fb803b" (UID: "634ee46d-0ea5-4287-a7d7-55d7b0fb803b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.744158 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.745768 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-kube-api-access-sk7n7" (OuterVolumeSpecName: "kube-api-access-sk7n7") pod "634ee46d-0ea5-4287-a7d7-55d7b0fb803b" (UID: "634ee46d-0ea5-4287-a7d7-55d7b0fb803b"). InnerVolumeSpecName "kube-api-access-sk7n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.751901 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp"] Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.752257 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469b3cee-92e9-43a1-b8e7-de66a1990a8f" containerName="controller-manager" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.752276 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="469b3cee-92e9-43a1-b8e7-de66a1990a8f" containerName="controller-manager" Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.752286 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634ee46d-0ea5-4287-a7d7-55d7b0fb803b" containerName="route-controller-manager" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.752294 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="634ee46d-0ea5-4287-a7d7-55d7b0fb803b" containerName="route-controller-manager" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.752434 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="634ee46d-0ea5-4287-a7d7-55d7b0fb803b" containerName="route-controller-manager" Mar 20 06:53:28 crc 
kubenswrapper[4971]: I0320 06:53:28.752450 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="469b3cee-92e9-43a1-b8e7-de66a1990a8f" containerName="controller-manager" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.753012 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.774735 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp"] Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.826945 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k96wh\" (UniqueName: \"kubernetes.io/projected/469b3cee-92e9-43a1-b8e7-de66a1990a8f-kube-api-access-k96wh\") pod \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.827007 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/469b3cee-92e9-43a1-b8e7-de66a1990a8f-serving-cert\") pod \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.827068 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-proxy-ca-bundles\") pod \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.827103 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-config\") pod \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\" (UID: 
\"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.827234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.827352 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-client-ca\") pod \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\" (UID: \"469b3cee-92e9-43a1-b8e7-de66a1990a8f\") " Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.829485 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "469b3cee-92e9-43a1-b8e7-de66a1990a8f" (UID: "469b3cee-92e9-43a1-b8e7-de66a1990a8f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.841085 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-config\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.841167 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7xmg\" (UniqueName: \"kubernetes.io/projected/c57e69da-fb86-40f0-bbb9-e7560725c9c0-kube-api-access-j7xmg\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.841375 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-client-ca\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.841398 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c57e69da-fb86-40f0-bbb9-e7560725c9c0-serving-cert\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.841707 4971 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.841727 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.841739 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk7n7\" (UniqueName: \"kubernetes.io/projected/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-kube-api-access-sk7n7\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.841755 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/634ee46d-0ea5-4287-a7d7-55d7b0fb803b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.841929 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.34190651 +0000 UTC m=+231.321780648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.844779 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2ljbf"] Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.846997 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.858351 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-config" (OuterVolumeSpecName: "config") pod "469b3cee-92e9-43a1-b8e7-de66a1990a8f" (UID: "469b3cee-92e9-43a1-b8e7-de66a1990a8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.859954 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469b3cee-92e9-43a1-b8e7-de66a1990a8f-kube-api-access-k96wh" (OuterVolumeSpecName: "kube-api-access-k96wh") pod "469b3cee-92e9-43a1-b8e7-de66a1990a8f" (UID: "469b3cee-92e9-43a1-b8e7-de66a1990a8f"). InnerVolumeSpecName "kube-api-access-k96wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.864950 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-client-ca" (OuterVolumeSpecName: "client-ca") pod "469b3cee-92e9-43a1-b8e7-de66a1990a8f" (UID: "469b3cee-92e9-43a1-b8e7-de66a1990a8f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.874723 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.883725 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/469b3cee-92e9-43a1-b8e7-de66a1990a8f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "469b3cee-92e9-43a1-b8e7-de66a1990a8f" (UID: "469b3cee-92e9-43a1-b8e7-de66a1990a8f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.910893 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ljbf"] Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.943292 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" event={"ID":"cdb9a3df-ae5a-4ce0-a7af-7986ac68c57f","Type":"ContainerStarted","Data":"ce871b60df7f7f4a731912cf8a3c4e4c4e8f59651369efe9f4fbb00dc5c7541b"} Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945248 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-client-ca\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945276 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c57e69da-fb86-40f0-bbb9-e7560725c9c0-serving-cert\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945325 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945358 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc876\" (UniqueName: \"kubernetes.io/projected/264fdafe-b375-48d7-95d7-62d72c752b5d-kube-api-access-rc876\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945386 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-catalog-content\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945408 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-utilities\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945432 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-config\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7xmg\" (UniqueName: \"kubernetes.io/projected/c57e69da-fb86-40f0-bbb9-e7560725c9c0-kube-api-access-j7xmg\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " 
pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945514 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945525 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k96wh\" (UniqueName: \"kubernetes.io/projected/469b3cee-92e9-43a1-b8e7-de66a1990a8f-kube-api-access-k96wh\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945534 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/469b3cee-92e9-43a1-b8e7-de66a1990a8f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945545 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.945555 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469b3cee-92e9-43a1-b8e7-de66a1990a8f-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:28 crc kubenswrapper[4971]: E0320 06:53:28.956349 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.456325252 +0000 UTC m=+231.436199390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.957659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-config\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.957855 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-client-ca\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.976250 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c57e69da-fb86-40f0-bbb9-e7560725c9c0-serving-cert\") pod \"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.986737 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7xmg\" (UniqueName: \"kubernetes.io/projected/c57e69da-fb86-40f0-bbb9-e7560725c9c0-kube-api-access-j7xmg\") pod 
\"route-controller-manager-547f97b769-ftxsp\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.988808 4971 generic.go:334] "Generic (PLEG): container finished" podID="634ee46d-0ea5-4287-a7d7-55d7b0fb803b" containerID="75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51" exitCode=0 Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.988887 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" event={"ID":"634ee46d-0ea5-4287-a7d7-55d7b0fb803b","Type":"ContainerDied","Data":"75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51"} Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.988918 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" event={"ID":"634ee46d-0ea5-4287-a7d7-55d7b0fb803b","Type":"ContainerDied","Data":"2313394c6b7792cb804f9f9b96daaac5d95459eb183f0976b2918b3cf2973bef"} Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.988949 4971 scope.go:117] "RemoveContainer" containerID="75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51" Mar 20 06:53:28 crc kubenswrapper[4971]: I0320 06:53:28.989121 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.043227 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2382d3aef04d609bc8c26c32f5eae78b7345ee100fa104059594105a659ffeab"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.043286 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"070e4d46454413f542c9325a91aba13dda55f7344080bf3061457310048b13af"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.043994 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.048021 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.048448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc876\" (UniqueName: \"kubernetes.io/projected/264fdafe-b375-48d7-95d7-62d72c752b5d-kube-api-access-rc876\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.048488 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-catalog-content\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.048516 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-utilities\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.049516 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.549498648 +0000 UTC m=+231.529372786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.050383 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-catalog-content\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.050592 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-utilities\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.057671 4971 scope.go:117] "RemoveContainer" containerID="75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.059199 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v9l2k"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.060514 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.069173 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51\": container with ID starting with 75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51 not found: ID does not exist" containerID="75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.069284 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51"} err="failed to get container status \"75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51\": rpc error: code = NotFound desc = could not find container \"75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51\": container with ID starting with 75f805e80346e267bb7cea91a2a01f8a489c9b11441e48c0066e17d1ddbf2a51 not found: ID does not exist" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.069984 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.077340 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4ea4d55c4809e268c3a715e1980e55fe3b0d08ae8f49c3af05e0b4363805723c"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.077473 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8c02cdcb2c3f6d91aec0bc3b2a478020cc52a5c74a8fa1e062f5126b44f685c7"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.084283 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc876\" (UniqueName: \"kubernetes.io/projected/264fdafe-b375-48d7-95d7-62d72c752b5d-kube-api-access-rc876\") pod \"certified-operators-2ljbf\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") " pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.095217 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v9l2k"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.106785 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-596j6" event={"ID":"3897dfd7-24d8-4384-b536-24db5edccd01","Type":"ContainerStarted","Data":"b81e38f1161c4b3deee959382c232fb0df4770e8f141a79d5e99d29325a6b5fd"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.124952 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.125379 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:29 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:29 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:29 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.125418 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.147886 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" event={"ID":"59c19ac3-a273-4527-bb7c-fa5d0ec71cba","Type":"ContainerStarted","Data":"0cd886bc587b2056597b76c73246800d0b8d42d740e05ec04c6756d40064374b"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.149164 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-utilities\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.149197 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.149234 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-catalog-content\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.149258 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbnvj\" (UniqueName: \"kubernetes.io/projected/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-kube-api-access-hbnvj\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.150423 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.650409509 +0000 UTC m=+231.630283647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.210144 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" event={"ID":"911a50c7-0e81-4087-9bc0-81e5206d5fd0","Type":"ContainerStarted","Data":"5423d267e15debcd94014892ada156bc9436c931a12045f31b046056673324d4"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.213936 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.224295 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zlhdf"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.225580 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.230915 4971 generic.go:334] "Generic (PLEG): container finished" podID="469b3cee-92e9-43a1-b8e7-de66a1990a8f" containerID="70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd" exitCode=0 Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.231059 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" event={"ID":"469b3cee-92e9-43a1-b8e7-de66a1990a8f","Type":"ContainerDied","Data":"70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.231097 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" event={"ID":"469b3cee-92e9-43a1-b8e7-de66a1990a8f","Type":"ContainerDied","Data":"ee3257a007fbcbc946277e09316aafb3172bcc3f89f4f0282d4de1a0b3b24cc0"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.231117 4971 scope.go:117] "RemoveContainer" containerID="70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.231336 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k5xfj" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.245372 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zlhdf"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.252453 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.252696 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-utilities\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.252804 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-catalog-content\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.252842 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbnvj\" (UniqueName: \"kubernetes.io/projected/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-kube-api-access-hbnvj\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.252876 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-h55hg\" (UniqueName: \"kubernetes.io/projected/6fbc771d-6009-4ab3-85e4-963d26ee459e-kube-api-access-h55hg\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.252908 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-catalog-content\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.253005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-utilities\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.253084 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.75305421 +0000 UTC m=+231.732928348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.254479 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-utilities\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.256773 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-catalog-content\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.288359 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" event={"ID":"6e696339-3718-476d-a99b-1da32e10090b","Type":"ContainerStarted","Data":"57cf8ba24f8331be2e2c2f41eb2697f96a3d52fb780e41b9e66aba71040dc5f0"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.312115 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbnvj\" (UniqueName: \"kubernetes.io/projected/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-kube-api-access-hbnvj\") pod \"community-operators-v9l2k\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") " pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc 
kubenswrapper[4971]: I0320 06:53:29.321185 4971 scope.go:117] "RemoveContainer" containerID="70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.329879 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd\": container with ID starting with 70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd not found: ID does not exist" containerID="70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.329940 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd"} err="failed to get container status \"70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd\": rpc error: code = NotFound desc = could not find container \"70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd\": container with ID starting with 70606590a09aa41c7a112416bbd78ac53b29e45664879e7a2cc25e66807e73dd not found: ID does not exist" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.331218 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" event={"ID":"3a9bd8ec-b649-4b24-8eca-d0e20cef988a","Type":"ContainerStarted","Data":"0635d4b29b12515a0c1761670e55ef3db330069bb113c0511e5e26fd29541c51"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.350383 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kmdm2" podStartSLOduration=177.350360216 podStartE2EDuration="2m57.350360216s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 06:53:29.311176632 +0000 UTC m=+231.291050770" watchObservedRunningTime="2026-03-20 06:53:29.350360216 +0000 UTC m=+231.330234354" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.360490 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h55hg\" (UniqueName: \"kubernetes.io/projected/6fbc771d-6009-4ab3-85e4-963d26ee459e-kube-api-access-h55hg\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.360551 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-catalog-content\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.360591 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-utilities\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.360714 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.360891 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s29mb" 
event={"ID":"7d6d4012-f253-411e-9bed-2e204d9a6f67","Type":"ContainerStarted","Data":"a35d69c6296525d7d41ca44d501836bf56bd9f380806f6f626ecca10a515b44f"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.360935 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s29mb" event={"ID":"7d6d4012-f253-411e-9bed-2e204d9a6f67","Type":"ContainerStarted","Data":"458dbfdb19b785bbdf5718136dbccceda79173207b3acff6a2a8037647fef292"} Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.361060 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:29.861044716 +0000 UTC m=+231.840918854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.361958 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.363892 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-catalog-content\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.364319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-utilities\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.366812 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ec32cb95bbf2cedbc34551240e0479cf40766e447457decc6507ec688d356cf9"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.366857 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e1f88d51185a19d44d4154f20de4ec8ec860c62753ebf1c6fea773c09f4db89c"} Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.379394 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.383161 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.387687 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xfqk2" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.399398 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tzg77"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.402560 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.411328 
4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.415858 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mg9fg" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.433361 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-25c82"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.434753 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.435817 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h55hg\" (UniqueName: \"kubernetes.io/projected/6fbc771d-6009-4ab3-85e4-963d26ee459e-kube-api-access-h55hg\") pod \"certified-operators-zlhdf\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.462668 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.494962 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-utilities\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.495152 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-catalog-content\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.500923 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.000893913 +0000 UTC m=+231.980768051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.501301 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25c82"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.515802 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.515883 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtdcz\" (UniqueName: 
\"kubernetes.io/projected/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-kube-api-access-qtdcz\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.526935 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.026885946 +0000 UTC m=+232.006760084 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.538526 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36306: no serving certificate available for the kubelet" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.597206 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.617631 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.617815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-catalog-content\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.617904 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtdcz\" (UniqueName: \"kubernetes.io/projected/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-kube-api-access-qtdcz\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.617955 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-utilities\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.620894 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-utilities\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " 
pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.620933 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.120892367 +0000 UTC m=+232.100766505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.633100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-catalog-content\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.635850 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9ndf9" podStartSLOduration=178.635831609 podStartE2EDuration="2m58.635831609s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:29.632464471 +0000 UTC m=+231.612338619" watchObservedRunningTime="2026-03-20 06:53:29.635831609 +0000 UTC m=+231.615705747" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.665391 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qtdcz\" (UniqueName: \"kubernetes.io/projected/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-kube-api-access-qtdcz\") pod \"community-operators-25c82\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.720774 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.721135 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.221120788 +0000 UTC m=+232.200994926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.759254 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" podStartSLOduration=177.759237181 podStartE2EDuration="2m57.759237181s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:29.757891042 +0000 UTC m=+231.737765180" watchObservedRunningTime="2026-03-20 06:53:29.759237181 +0000 UTC m=+231.739111319" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.815499 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.819813 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s29mb" podStartSLOduration=10.819783403 podStartE2EDuration="10.819783403s" podCreationTimestamp="2026-03-20 06:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:29.808075654 +0000 UTC m=+231.787949792" watchObservedRunningTime="2026-03-20 06:53:29.819783403 +0000 UTC m=+231.799657541" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.821740 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.822181 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.322161952 +0000 UTC m=+232.302036090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.834843 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25c82" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.924799 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r6vfl" podStartSLOduration=177.924776472 podStartE2EDuration="2m57.924776472s" podCreationTimestamp="2026-03-20 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:29.897178483 +0000 UTC m=+231.877052621" watchObservedRunningTime="2026-03-20 06:53:29.924776472 +0000 UTC m=+231.904650610" Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.937488 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:29 crc kubenswrapper[4971]: E0320 06:53:29.938047 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.438031956 +0000 UTC m=+232.417906094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.943892 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k5xfj"] Mar 20 06:53:29 crc kubenswrapper[4971]: I0320 06:53:29.959116 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k5xfj"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.032728 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ljbf"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.035449 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k9kzh" podStartSLOduration=179.035402784 podStartE2EDuration="2m59.035402784s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:30.004778467 +0000 UTC m=+231.984652605" watchObservedRunningTime="2026-03-20 06:53:30.035402784 +0000 UTC m=+232.015276922" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.041377 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.041921 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.541876431 +0000 UTC m=+232.521750569 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.108039 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:30 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:30 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:30 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.108096 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.142877 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.145766 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.643179273 +0000 UTC m=+232.623053411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.248225 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.249084 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.749062358 +0000 UTC m=+232.728936496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.353575 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.353981 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.853966174 +0000 UTC m=+232.833840312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.363425 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zlhdf"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.415457 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbf" event={"ID":"264fdafe-b375-48d7-95d7-62d72c752b5d","Type":"ContainerStarted","Data":"da571780fd9d1ad968bd2335431ed0b4691d9fa09c738f495a8e89b41f81a2ad"} Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.420820 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" event={"ID":"c57e69da-fb86-40f0-bbb9-e7560725c9c0","Type":"ContainerStarted","Data":"ef6967630408e8111a105c6b4398b2d0ed9f6fbe1b61b457a192c516e89b56b8"} Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.420863 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" event={"ID":"c57e69da-fb86-40f0-bbb9-e7560725c9c0","Type":"ContainerStarted","Data":"29a33c62006db79c1434ba07177bd22372ee6d2657a342133b8ac084cb28d9ca"} Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.448880 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.460096 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-v9l2k"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.460236 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.460646 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:30.960597449 +0000 UTC m=+232.940471587 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.460844 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.461239 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 06:53:30.961231268 +0000 UTC m=+232.941105406 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.476004 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" podStartSLOduration=5.475975755 podStartE2EDuration="5.475975755s" podCreationTimestamp="2026-03-20 06:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:30.467701605 +0000 UTC m=+232.447575753" watchObservedRunningTime="2026-03-20 06:53:30.475975755 +0000 UTC m=+232.455849893" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.562934 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.564532 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.064509677 +0000 UTC m=+233.044383815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.658303 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25c82"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.674161 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.674526 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.174511101 +0000 UTC m=+233.154385239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: W0320 06:53:30.675875 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c6b8af5_8d8d_4643_8815_9ed0526f04e1.slice/crio-3dcf09b991cbcb066f4cab64c3ff54e62184c5d5945513c3f0ecc1183e24dad9 WatchSource:0}: Error finding container 3dcf09b991cbcb066f4cab64c3ff54e62184c5d5945513c3f0ecc1183e24dad9: Status 404 returned error can't find the container with id 3dcf09b991cbcb066f4cab64c3ff54e62184c5d5945513c3f0ecc1183e24dad9 Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.755363 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469b3cee-92e9-43a1-b8e7-de66a1990a8f" path="/var/lib/kubelet/pods/469b3cee-92e9-43a1-b8e7-de66a1990a8f/volumes" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.756563 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634ee46d-0ea5-4287-a7d7-55d7b0fb803b" path="/var/lib/kubelet/pods/634ee46d-0ea5-4287-a7d7-55d7b0fb803b/volumes" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.757331 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-654b6f9594-r5jch"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.758160 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.758180 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-654b6f9594-r5jch"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.761583 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.771375 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.771590 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.771870 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.772092 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.772356 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.774190 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.775651 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.776895 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.777234 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.277195323 +0000 UTC m=+233.257069461 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.778070 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.782771 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.785532 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.785537 4971 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.882655 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fec1493f-ac46-4873-a9f5-94217c9cc6ab-serving-cert\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.882700 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-config\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.882742 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e129ce8-683d-4f95-9e78-5a99a6951318-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3e129ce8-683d-4f95-9e78-5a99a6951318\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.882765 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-client-ca\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.882797 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzr6m\" 
(UniqueName: \"kubernetes.io/projected/fec1493f-ac46-4873-a9f5-94217c9cc6ab-kube-api-access-rzr6m\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.883016 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.883203 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e129ce8-683d-4f95-9e78-5a99a6951318-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3e129ce8-683d-4f95-9e78-5a99a6951318\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.883276 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-proxy-ca-bundles\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.883737 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.383720456 +0000 UTC m=+233.363594594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.984434 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.984776 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e129ce8-683d-4f95-9e78-5a99a6951318-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3e129ce8-683d-4f95-9e78-5a99a6951318\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.984822 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-client-ca\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.984857 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzr6m\" (UniqueName: \"kubernetes.io/projected/fec1493f-ac46-4873-a9f5-94217c9cc6ab-kube-api-access-rzr6m\") pod \"controller-manager-654b6f9594-r5jch\" (UID: 
\"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.984895 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e129ce8-683d-4f95-9e78-5a99a6951318-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3e129ce8-683d-4f95-9e78-5a99a6951318\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.984984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-proxy-ca-bundles\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.985021 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fec1493f-ac46-4873-a9f5-94217c9cc6ab-serving-cert\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.985049 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-config\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: E0320 06:53:30.987506 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.487478619 +0000 UTC m=+233.467352757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.987778 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e129ce8-683d-4f95-9e78-5a99a6951318-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3e129ce8-683d-4f95-9e78-5a99a6951318\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.988500 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-proxy-ca-bundles\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.989279 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-config\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:30 crc kubenswrapper[4971]: I0320 06:53:30.989922 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-client-ca\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.006665 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fec1493f-ac46-4873-a9f5-94217c9cc6ab-serving-cert\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.015668 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e129ce8-683d-4f95-9e78-5a99a6951318-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3e129ce8-683d-4f95-9e78-5a99a6951318\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.025695 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzr6m\" (UniqueName: \"kubernetes.io/projected/fec1493f-ac46-4873-a9f5-94217c9cc6ab-kube-api-access-rzr6m\") pod \"controller-manager-654b6f9594-r5jch\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.030435 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9l5fd"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.031604 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.044661 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.049641 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9l5fd"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.086743 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.086883 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.087678 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.587661199 +0000 UTC m=+233.567535337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.102787 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:31 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:31 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:31 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.102863 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.104927 4971 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.188271 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.188772 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-utilities\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.188830 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-catalog-content\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.188880 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45bwf\" (UniqueName: \"kubernetes.io/projected/7cb5ccdd-a645-458a-9961-461bd99c3c79-kube-api-access-45bwf\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.188980 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.68894361 +0000 UTC m=+233.668817748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.189058 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.190021 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.690002421 +0000 UTC m=+233.669876559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.257834 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.263829 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.289990 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.290318 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-catalog-content\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.290383 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45bwf\" (UniqueName: \"kubernetes.io/projected/7cb5ccdd-a645-458a-9961-461bd99c3c79-kube-api-access-45bwf\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.290488 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.790457658 +0000 UTC m=+233.770331796 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.290793 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-utilities\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.290980 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-catalog-content\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.291492 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-utilities\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.315662 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45bwf\" (UniqueName: \"kubernetes.io/projected/7cb5ccdd-a645-458a-9961-461bd99c3c79-kube-api-access-45bwf\") pod \"redhat-marketplace-9l5fd\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") " pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 
06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.341164 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.342513 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.349852 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.350226 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.351323 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.396754 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.397518 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.897494366 +0000 UTC m=+233.877368494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.398025 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.431426 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6b8lj"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.432849 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b8lj"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.432963 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.458510 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.475306 4971 generic.go:334] "Generic (PLEG): container finished" podID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerID="d0fb8f8dc6b00dd48db63775457875faf74b664eb9122a468589f8670723c60a" exitCode=0 Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.475421 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlhdf" event={"ID":"6fbc771d-6009-4ab3-85e4-963d26ee459e","Type":"ContainerDied","Data":"d0fb8f8dc6b00dd48db63775457875faf74b664eb9122a468589f8670723c60a"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.475514 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tp5rq" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.475597 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlhdf" event={"ID":"6fbc771d-6009-4ab3-85e4-963d26ee459e","Type":"ContainerStarted","Data":"70bf1343010d907b841c74313f74ba7482501ec61ca732acae178511e650d02a"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.483967 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-596j6" event={"ID":"3897dfd7-24d8-4384-b536-24db5edccd01","Type":"ContainerStarted","Data":"44cdc9a0f01dee89ca71c7430d140dc30956dd86029811d597daf697886242d8"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.484471 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-596j6" 
event={"ID":"3897dfd7-24d8-4384-b536-24db5edccd01","Type":"ContainerStarted","Data":"b0cb3b6f3f4dbf85817d8371e98e9a49a9c6864520b095cfbc896fee1fb06c3d"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.494826 4971 generic.go:334] "Generic (PLEG): container finished" podID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerID="9e1fad1a4a953dace5c5dda053036316c7aa7e1a995eb1ed059b27498c585fa1" exitCode=0 Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.495112 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9l2k" event={"ID":"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac","Type":"ContainerDied","Data":"9e1fad1a4a953dace5c5dda053036316c7aa7e1a995eb1ed059b27498c585fa1"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.495167 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9l2k" event={"ID":"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac","Type":"ContainerStarted","Data":"b1812ad0fae5c04cf8bcdc3c6ef3f48f0baac2e21956d0426c7f58a3a63bb9e6"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.497844 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.498344 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.498468 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.498715 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:31.998697715 +0000 UTC m=+233.978571853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.566701 4971 generic.go:334] "Generic (PLEG): container finished" podID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerID="f9a27fd69e2b216942baa90f4594ff32169f4f4bb3008b3e41f132bb22efa1c7" exitCode=0 Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.566815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbf" event={"ID":"264fdafe-b375-48d7-95d7-62d72c752b5d","Type":"ContainerDied","Data":"f9a27fd69e2b216942baa90f4594ff32169f4f4bb3008b3e41f132bb22efa1c7"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.593236 4971 generic.go:334] "Generic (PLEG): container finished" podID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerID="49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7" exitCode=0 Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 
06:53:31.594394 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25c82" event={"ID":"4c6b8af5-8d8d-4643-8815-9ed0526f04e1","Type":"ContainerDied","Data":"49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.594428 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25c82" event={"ID":"4c6b8af5-8d8d-4643-8815-9ed0526f04e1","Type":"ContainerStarted","Data":"3dcf09b991cbcb066f4cab64c3ff54e62184c5d5945513c3f0ecc1183e24dad9"} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.599436 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-catalog-content\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.599484 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc78s\" (UniqueName: \"kubernetes.io/projected/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-kube-api-access-xc78s\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.599563 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.599660 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.599733 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-utilities\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.599776 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.609900 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:32.109866433 +0000 UTC m=+234.089740571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.610204 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.632233 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.702383 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.702703 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-catalog-content\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " 
pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.702751 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc78s\" (UniqueName: \"kubernetes.io/projected/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-kube-api-access-xc78s\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.702954 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-utilities\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.703928 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:32.203889664 +0000 UTC m=+234.183763792 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.705019 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-utilities\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.705243 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-catalog-content\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.723822 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.728460 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc78s\" (UniqueName: \"kubernetes.io/projected/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-kube-api-access-xc78s\") pod \"redhat-marketplace-6b8lj\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.729724 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-654b6f9594-r5jch"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.769557 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.778438 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.804737 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.805397 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 06:53:32.305377212 +0000 UTC m=+234.285251350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q8mn4" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.850679 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9l5fd"] Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.907003 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:31 crc kubenswrapper[4971]: E0320 06:53:31.907660 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 06:53:32.407632641 +0000 UTC m=+234.387506779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.967881 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.969225 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.969285 4971 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T06:53:31.104943969Z","Handler":null,"Name":""} Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.976256 4971 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 06:53:31 crc kubenswrapper[4971]: I0320 06:53:31.976287 4971 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.005241 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4w7kq"] Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.007930 4971 patch_prober.go:28] interesting pod/console-f9d7485db-4l9qf container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.007966 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4l9qf" podUID="9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.008249 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.009543 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.010400 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2znt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.016975 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t2znt" podUID="0939bb94-8858-43fc-8443-686e696beaa1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.013974 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" 
Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.014689 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.017216 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.020776 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2znt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.020832 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2znt" podUID="0939bb94-8858-43fc-8443-686e696beaa1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.024590 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4w7kq"] Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.100726 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.116402 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-catalog-content\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.116471 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mz5\" (UniqueName: \"kubernetes.io/projected/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-kube-api-access-l4mz5\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.116498 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-utilities\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.127005 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:32 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:32 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:32 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.127093 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:32 crc 
kubenswrapper[4971]: I0320 06:53:32.134405 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.200145 4971 ???:1] "http: TLS handshake error from 192.168.126.11:36318: no serving certificate available for the kubelet" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.220021 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-catalog-content\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.221768 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mz5\" (UniqueName: \"kubernetes.io/projected/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-kube-api-access-l4mz5\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.221791 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-utilities\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.222182 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-catalog-content\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.223158 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-utilities\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.241058 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b8lj"] Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.243867 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q8mn4\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.258451 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4mz5\" (UniqueName: \"kubernetes.io/projected/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-kube-api-access-l4mz5\") pod \"redhat-operators-4w7kq\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") " pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.325387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.366691 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.386161 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.392172 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.410678 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zj9pn"] Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.423497 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zj9pn"] Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.423794 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.452474 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.453639 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.479802 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.534697 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-utilities\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.534839 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbsc7\" (UniqueName: \"kubernetes.io/projected/e7552f5a-9823-42db-9766-437726d2c7bb-kube-api-access-jbsc7\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.534926 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-catalog-content\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.639847 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-utilities\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.639990 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbsc7\" (UniqueName: \"kubernetes.io/projected/e7552f5a-9823-42db-9766-437726d2c7bb-kube-api-access-jbsc7\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.640144 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-catalog-content\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.641137 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-utilities\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.641406 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-catalog-content\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.641638 4971 generic.go:334] "Generic (PLEG): container finished" podID="b7d2e412-0a4f-44b6-a326-269ac4921dae" 
containerID="034740c7fd2522f3622e7a92b4698c566929791c244c307efc89fc747e087833" exitCode=0 Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.641710 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" event={"ID":"b7d2e412-0a4f-44b6-a326-269ac4921dae","Type":"ContainerDied","Data":"034740c7fd2522f3622e7a92b4698c566929791c244c307efc89fc747e087833"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.658617 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" event={"ID":"fec1493f-ac46-4873-a9f5-94217c9cc6ab","Type":"ContainerStarted","Data":"cabb0c694184d2100094480ec456805583c48d414bccadf7db7c45629d4f4e8c"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.658679 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" event={"ID":"fec1493f-ac46-4873-a9f5-94217c9cc6ab","Type":"ContainerStarted","Data":"30e6e0f45deba8ba55ecb5e8a3ff70b568b729b7a3b94815c376fa92dfdb96ef"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.659341 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.677789 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.711655 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fcf218ee-0f64-48ac-80a7-1815cc6daf6c","Type":"ContainerStarted","Data":"63ccc27c525907a2ef1aefd6697db48f48af629703e2a72f134faadaccae028b"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.715957 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jbsc7\" (UniqueName: \"kubernetes.io/projected/e7552f5a-9823-42db-9766-437726d2c7bb-kube-api-access-jbsc7\") pod \"redhat-operators-zj9pn\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.738701 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" podStartSLOduration=7.738679765 podStartE2EDuration="7.738679765s" podCreationTimestamp="2026-03-20 06:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:32.736301106 +0000 UTC m=+234.716175264" watchObservedRunningTime="2026-03-20 06:53:32.738679765 +0000 UTC m=+234.718553913" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.770866 4971 generic.go:334] "Generic (PLEG): container finished" podID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerID="d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007" exitCode=0 Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.784187 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.7841563110000003 podStartE2EDuration="2.784156311s" podCreationTimestamp="2026-03-20 06:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:32.777073796 +0000 UTC m=+234.756947934" watchObservedRunningTime="2026-03-20 06:53:32.784156311 +0000 UTC m=+234.764030449" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.807336 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 
06:53:32.808133 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3e129ce8-683d-4f95-9e78-5a99a6951318","Type":"ContainerStarted","Data":"66b0f0b28fdbf36ec13eb99bd1d7cd4e62d80a84522d20a814853d5ab4f0deec"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.808188 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3e129ce8-683d-4f95-9e78-5a99a6951318","Type":"ContainerStarted","Data":"ae726e0762eee3ff9a4a188425237d4066ea98edd83a77c91beade0ed8eac003"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.808201 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b8lj" event={"ID":"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75","Type":"ContainerDied","Data":"d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.808214 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b8lj" event={"ID":"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75","Type":"ContainerStarted","Data":"9a761afd163a3cfd18bb0ed2fb9963eb661a290ac2969cb40d461f126f37c2a9"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.808226 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-596j6" event={"ID":"3897dfd7-24d8-4384-b536-24db5edccd01","Type":"ContainerStarted","Data":"cde8d4455cf9950922d6adc900d67bce226475e85ca2799b8e134ae647edeb72"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.810184 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerID="fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9" exitCode=0 Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.810914 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-9l5fd" event={"ID":"7cb5ccdd-a645-458a-9961-461bd99c3c79","Type":"ContainerDied","Data":"fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.811016 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l5fd" event={"ID":"7cb5ccdd-a645-458a-9961-461bd99c3c79","Type":"ContainerStarted","Data":"5924892065e4aaed905b963e626884c0a765023864f54bc0888e0c89bb1d5d33"} Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.822209 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.827862 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-scq4h" Mar 20 06:53:32 crc kubenswrapper[4971]: I0320 06:53:32.901042 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-596j6" podStartSLOduration=13.901016883 podStartE2EDuration="13.901016883s" podCreationTimestamp="2026-03-20 06:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:32.870658075 +0000 UTC m=+234.850532213" watchObservedRunningTime="2026-03-20 06:53:32.901016883 +0000 UTC m=+234.880891021" Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.115770 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:33 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:33 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:33 crc kubenswrapper[4971]: healthz check 
failed Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.116205 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.278037 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4w7kq"] Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.280974 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q8mn4"] Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.626427 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zj9pn"] Mar 20 06:53:33 crc kubenswrapper[4971]: W0320 06:53:33.678645 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7552f5a_9823_42db_9766_437726d2c7bb.slice/crio-b8df97fadf6939dc0b6e27367092b497734a8a287ab307ecdedcdecb59ce6009 WatchSource:0}: Error finding container b8df97fadf6939dc0b6e27367092b497734a8a287ab307ecdedcdecb59ce6009: Status 404 returned error can't find the container with id b8df97fadf6939dc0b6e27367092b497734a8a287ab307ecdedcdecb59ce6009 Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.883118 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9pn" event={"ID":"e7552f5a-9823-42db-9766-437726d2c7bb","Type":"ContainerStarted","Data":"b8df97fadf6939dc0b6e27367092b497734a8a287ab307ecdedcdecb59ce6009"} Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.928367 4971 generic.go:334] "Generic (PLEG): container finished" podID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerID="07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049" exitCode=0 Mar 20 06:53:33 crc kubenswrapper[4971]: 
I0320 06:53:33.928492 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4w7kq" event={"ID":"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4","Type":"ContainerDied","Data":"07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049"} Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.928524 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4w7kq" event={"ID":"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4","Type":"ContainerStarted","Data":"ab601b960ab81dfd31f653f166f776eb6235c78316b509bc75f19e8757f35180"} Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.963457 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" event={"ID":"4e2cb781-6f5e-493d-a811-b88b641fcda2","Type":"ContainerStarted","Data":"32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4"} Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.963520 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" event={"ID":"4e2cb781-6f5e-493d-a811-b88b641fcda2","Type":"ContainerStarted","Data":"26ab5aa38fae1ea488aa974054cab4a21d79c628df54ae22f8c7189dd6ae9304"} Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.963924 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:33 crc kubenswrapper[4971]: I0320 06:53:33.993250 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" podStartSLOduration=182.993230745 podStartE2EDuration="3m2.993230745s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:53:33.990806564 +0000 UTC m=+235.970680702" 
watchObservedRunningTime="2026-03-20 06:53:33.993230745 +0000 UTC m=+235.973104883" Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.043481 4971 generic.go:334] "Generic (PLEG): container finished" podID="fcf218ee-0f64-48ac-80a7-1815cc6daf6c" containerID="6ea8c574116cb9e8360e7a82057837ed7d607090859d2971837a760802288929" exitCode=0 Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.043708 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fcf218ee-0f64-48ac-80a7-1815cc6daf6c","Type":"ContainerDied","Data":"6ea8c574116cb9e8360e7a82057837ed7d607090859d2971837a760802288929"} Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.060333 4971 generic.go:334] "Generic (PLEG): container finished" podID="3e129ce8-683d-4f95-9e78-5a99a6951318" containerID="66b0f0b28fdbf36ec13eb99bd1d7cd4e62d80a84522d20a814853d5ab4f0deec" exitCode=0 Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.060443 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3e129ce8-683d-4f95-9e78-5a99a6951318","Type":"ContainerDied","Data":"66b0f0b28fdbf36ec13eb99bd1d7cd4e62d80a84522d20a814853d5ab4f0deec"} Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.102069 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:34 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:34 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:34 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.102255 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.596670 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.724273 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume\") pod \"b7d2e412-0a4f-44b6-a326-269ac4921dae\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.724362 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume\") pod \"b7d2e412-0a4f-44b6-a326-269ac4921dae\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.724416 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfzsm\" (UniqueName: \"kubernetes.io/projected/b7d2e412-0a4f-44b6-a326-269ac4921dae-kube-api-access-gfzsm\") pod \"b7d2e412-0a4f-44b6-a326-269ac4921dae\" (UID: \"b7d2e412-0a4f-44b6-a326-269ac4921dae\") " Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.729434 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume" (OuterVolumeSpecName: "config-volume") pod "b7d2e412-0a4f-44b6-a326-269ac4921dae" (UID: "b7d2e412-0a4f-44b6-a326-269ac4921dae"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.732590 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d2e412-0a4f-44b6-a326-269ac4921dae-kube-api-access-gfzsm" (OuterVolumeSpecName: "kube-api-access-gfzsm") pod "b7d2e412-0a4f-44b6-a326-269ac4921dae" (UID: "b7d2e412-0a4f-44b6-a326-269ac4921dae"). InnerVolumeSpecName "kube-api-access-gfzsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.735312 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b7d2e412-0a4f-44b6-a326-269ac4921dae" (UID: "b7d2e412-0a4f-44b6-a326-269ac4921dae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.833725 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7d2e412-0a4f-44b6-a326-269ac4921dae-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.833762 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7d2e412-0a4f-44b6-a326-269ac4921dae-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:34 crc kubenswrapper[4971]: I0320 06:53:34.833774 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfzsm\" (UniqueName: \"kubernetes.io/projected/b7d2e412-0a4f-44b6-a326-269ac4921dae-kube-api-access-gfzsm\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.097526 4971 generic.go:334] "Generic (PLEG): container finished" podID="e7552f5a-9823-42db-9766-437726d2c7bb" 
containerID="2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3" exitCode=0 Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.097668 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9pn" event={"ID":"e7552f5a-9823-42db-9766-437726d2c7bb","Type":"ContainerDied","Data":"2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3"} Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.127970 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.128916 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk" event={"ID":"b7d2e412-0a4f-44b6-a326-269ac4921dae","Type":"ContainerDied","Data":"dcb53514c699db07f3cb16566e3180e0df4a7247d933959a1c7ef34c8d794f9f"} Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.128947 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcb53514c699db07f3cb16566e3180e0df4a7247d933959a1c7ef34c8d794f9f" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.132367 4971 patch_prober.go:28] interesting pod/router-default-5444994796-858jn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 06:53:35 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 20 06:53:35 crc kubenswrapper[4971]: [+]process-running ok Mar 20 06:53:35 crc kubenswrapper[4971]: healthz check failed Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.132448 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-858jn" podUID="92c1ee23-ae49-48a6-828d-19ecb5573057" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.529319 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.536538 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.651410 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kubelet-dir\") pod \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\" (UID: \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\") " Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.651520 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kube-api-access\") pod \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\" (UID: \"fcf218ee-0f64-48ac-80a7-1815cc6daf6c\") " Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.651580 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e129ce8-683d-4f95-9e78-5a99a6951318-kube-api-access\") pod \"3e129ce8-683d-4f95-9e78-5a99a6951318\" (UID: \"3e129ce8-683d-4f95-9e78-5a99a6951318\") " Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.651623 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e129ce8-683d-4f95-9e78-5a99a6951318-kubelet-dir\") pod \"3e129ce8-683d-4f95-9e78-5a99a6951318\" (UID: \"3e129ce8-683d-4f95-9e78-5a99a6951318\") " Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.651787 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3e129ce8-683d-4f95-9e78-5a99a6951318-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3e129ce8-683d-4f95-9e78-5a99a6951318" (UID: "3e129ce8-683d-4f95-9e78-5a99a6951318"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.651839 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fcf218ee-0f64-48ac-80a7-1815cc6daf6c" (UID: "fcf218ee-0f64-48ac-80a7-1815cc6daf6c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.653036 4971 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e129ce8-683d-4f95-9e78-5a99a6951318-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.653074 4971 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.656957 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fcf218ee-0f64-48ac-80a7-1815cc6daf6c" (UID: "fcf218ee-0f64-48ac-80a7-1815cc6daf6c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.658094 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e129ce8-683d-4f95-9e78-5a99a6951318-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3e129ce8-683d-4f95-9e78-5a99a6951318" (UID: "3e129ce8-683d-4f95-9e78-5a99a6951318"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.754374 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcf218ee-0f64-48ac-80a7-1815cc6daf6c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:35 crc kubenswrapper[4971]: I0320 06:53:35.755206 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e129ce8-683d-4f95-9e78-5a99a6951318-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:36 crc kubenswrapper[4971]: I0320 06:53:36.100939 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:36 crc kubenswrapper[4971]: I0320 06:53:36.104200 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-858jn" Mar 20 06:53:36 crc kubenswrapper[4971]: I0320 06:53:36.165716 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fcf218ee-0f64-48ac-80a7-1815cc6daf6c","Type":"ContainerDied","Data":"63ccc27c525907a2ef1aefd6697db48f48af629703e2a72f134faadaccae028b"} Mar 20 06:53:36 crc kubenswrapper[4971]: I0320 06:53:36.165797 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63ccc27c525907a2ef1aefd6697db48f48af629703e2a72f134faadaccae028b" Mar 20 06:53:36 crc 
kubenswrapper[4971]: I0320 06:53:36.165917 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 06:53:36 crc kubenswrapper[4971]: I0320 06:53:36.175738 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 06:53:36 crc kubenswrapper[4971]: I0320 06:53:36.179353 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3e129ce8-683d-4f95-9e78-5a99a6951318","Type":"ContainerDied","Data":"ae726e0762eee3ff9a4a188425237d4066ea98edd83a77c91beade0ed8eac003"} Mar 20 06:53:36 crc kubenswrapper[4971]: I0320 06:53:36.179474 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae726e0762eee3ff9a4a188425237d4066ea98edd83a77c91beade0ed8eac003" Mar 20 06:53:36 crc kubenswrapper[4971]: I0320 06:53:36.279172 4971 ???:1] "http: TLS handshake error from 192.168.126.11:57978: no serving certificate available for the kubelet" Mar 20 06:53:37 crc kubenswrapper[4971]: I0320 06:53:37.395835 4971 ???:1] "http: TLS handshake error from 192.168.126.11:57986: no serving certificate available for the kubelet" Mar 20 06:53:37 crc kubenswrapper[4971]: I0320 06:53:37.567628 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s29mb" Mar 20 06:53:41 crc kubenswrapper[4971]: I0320 06:53:41.267255 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:53:41 crc kubenswrapper[4971]: I0320 06:53:41.270230 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-secret" Mar 20 06:53:41 crc kubenswrapper[4971]: I0320 06:53:41.291547 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac992f1-bd05-471f-a101-cf14466e15e8-metrics-certs\") pod \"network-metrics-daemon-6vl68\" (UID: \"6ac992f1-bd05-471f-a101-cf14466e15e8\") " pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:53:41 crc kubenswrapper[4971]: I0320 06:53:41.313105 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 06:53:41 crc kubenswrapper[4971]: I0320 06:53:41.321954 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vl68" Mar 20 06:53:41 crc kubenswrapper[4971]: I0320 06:53:41.984042 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:41 crc kubenswrapper[4971]: I0320 06:53:41.987714 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 06:53:42 crc kubenswrapper[4971]: I0320 06:53:42.030307 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-t2znt" Mar 20 06:53:44 crc kubenswrapper[4971]: I0320 06:53:44.558509 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-654b6f9594-r5jch"] Mar 20 06:53:44 crc kubenswrapper[4971]: I0320 06:53:44.561199 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" podUID="fec1493f-ac46-4873-a9f5-94217c9cc6ab" containerName="controller-manager" containerID="cri-o://cabb0c694184d2100094480ec456805583c48d414bccadf7db7c45629d4f4e8c" gracePeriod=30 Mar 20 06:53:44 crc kubenswrapper[4971]: I0320 
06:53:44.565236 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp"] Mar 20 06:53:44 crc kubenswrapper[4971]: I0320 06:53:44.565908 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" podUID="c57e69da-fb86-40f0-bbb9-e7560725c9c0" containerName="route-controller-manager" containerID="cri-o://ef6967630408e8111a105c6b4398b2d0ed9f6fbe1b61b457a192c516e89b56b8" gracePeriod=30 Mar 20 06:53:45 crc kubenswrapper[4971]: I0320 06:53:45.296379 4971 generic.go:334] "Generic (PLEG): container finished" podID="fec1493f-ac46-4873-a9f5-94217c9cc6ab" containerID="cabb0c694184d2100094480ec456805583c48d414bccadf7db7c45629d4f4e8c" exitCode=0 Mar 20 06:53:45 crc kubenswrapper[4971]: I0320 06:53:45.296476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" event={"ID":"fec1493f-ac46-4873-a9f5-94217c9cc6ab","Type":"ContainerDied","Data":"cabb0c694184d2100094480ec456805583c48d414bccadf7db7c45629d4f4e8c"} Mar 20 06:53:45 crc kubenswrapper[4971]: I0320 06:53:45.300820 4971 generic.go:334] "Generic (PLEG): container finished" podID="c57e69da-fb86-40f0-bbb9-e7560725c9c0" containerID="ef6967630408e8111a105c6b4398b2d0ed9f6fbe1b61b457a192c516e89b56b8" exitCode=0 Mar 20 06:53:45 crc kubenswrapper[4971]: I0320 06:53:45.300913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" event={"ID":"c57e69da-fb86-40f0-bbb9-e7560725c9c0","Type":"ContainerDied","Data":"ef6967630408e8111a105c6b4398b2d0ed9f6fbe1b61b457a192c516e89b56b8"} Mar 20 06:53:49 crc kubenswrapper[4971]: I0320 06:53:49.127057 4971 patch_prober.go:28] interesting pod/route-controller-manager-547f97b769-ftxsp container/route-controller-manager namespace/openshift-route-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 20 06:53:49 crc kubenswrapper[4971]: I0320 06:53:49.127570 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" podUID="c57e69da-fb86-40f0-bbb9-e7560725c9c0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 20 06:53:50 crc kubenswrapper[4971]: I0320 06:53:50.162305 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 06:53:50 crc kubenswrapper[4971]: I0320 06:53:50.162408 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 06:53:52 crc kubenswrapper[4971]: I0320 06:53:52.258386 4971 patch_prober.go:28] interesting pod/controller-manager-654b6f9594-r5jch container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 06:53:52 crc kubenswrapper[4971]: I0320 06:53:52.259006 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" podUID="fec1493f-ac46-4873-a9f5-94217c9cc6ab" containerName="controller-manager" 
probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 06:53:52 crc kubenswrapper[4971]: I0320 06:53:52.400453 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.227440 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.264927 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-db9dc86b-d5947"] Mar 20 06:53:54 crc kubenswrapper[4971]: E0320 06:53:54.265160 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e129ce8-683d-4f95-9e78-5a99a6951318" containerName="pruner" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265173 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e129ce8-683d-4f95-9e78-5a99a6951318" containerName="pruner" Mar 20 06:53:54 crc kubenswrapper[4971]: E0320 06:53:54.265186 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf218ee-0f64-48ac-80a7-1815cc6daf6c" containerName="pruner" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265193 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf218ee-0f64-48ac-80a7-1815cc6daf6c" containerName="pruner" Mar 20 06:53:54 crc kubenswrapper[4971]: E0320 06:53:54.265205 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d2e412-0a4f-44b6-a326-269ac4921dae" containerName="collect-profiles" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265212 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d2e412-0a4f-44b6-a326-269ac4921dae" containerName="collect-profiles" Mar 20 06:53:54 crc kubenswrapper[4971]: E0320 06:53:54.265227 
4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec1493f-ac46-4873-a9f5-94217c9cc6ab" containerName="controller-manager" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265233 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec1493f-ac46-4873-a9f5-94217c9cc6ab" containerName="controller-manager" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265323 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d2e412-0a4f-44b6-a326-269ac4921dae" containerName="collect-profiles" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265336 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf218ee-0f64-48ac-80a7-1815cc6daf6c" containerName="pruner" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265348 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e129ce8-683d-4f95-9e78-5a99a6951318" containerName="pruner" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265364 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fec1493f-ac46-4873-a9f5-94217c9cc6ab" containerName="controller-manager" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.265841 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.299166 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-db9dc86b-d5947"] Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.335709 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fec1493f-ac46-4873-a9f5-94217c9cc6ab-serving-cert\") pod \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.335885 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-client-ca\") pod \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.335929 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-config\") pod \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.335977 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-proxy-ca-bundles\") pod \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\" (UID: \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.335999 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzr6m\" (UniqueName: \"kubernetes.io/projected/fec1493f-ac46-4873-a9f5-94217c9cc6ab-kube-api-access-rzr6m\") pod \"fec1493f-ac46-4873-a9f5-94217c9cc6ab\" (UID: 
\"fec1493f-ac46-4873-a9f5-94217c9cc6ab\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.337643 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-client-ca" (OuterVolumeSpecName: "client-ca") pod "fec1493f-ac46-4873-a9f5-94217c9cc6ab" (UID: "fec1493f-ac46-4873-a9f5-94217c9cc6ab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.338246 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-config" (OuterVolumeSpecName: "config") pod "fec1493f-ac46-4873-a9f5-94217c9cc6ab" (UID: "fec1493f-ac46-4873-a9f5-94217c9cc6ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.338566 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fec1493f-ac46-4873-a9f5-94217c9cc6ab" (UID: "fec1493f-ac46-4873-a9f5-94217c9cc6ab"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.343241 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fec1493f-ac46-4873-a9f5-94217c9cc6ab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fec1493f-ac46-4873-a9f5-94217c9cc6ab" (UID: "fec1493f-ac46-4873-a9f5-94217c9cc6ab"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.343907 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fec1493f-ac46-4873-a9f5-94217c9cc6ab-kube-api-access-rzr6m" (OuterVolumeSpecName: "kube-api-access-rzr6m") pod "fec1493f-ac46-4873-a9f5-94217c9cc6ab" (UID: "fec1493f-ac46-4873-a9f5-94217c9cc6ab"). InnerVolumeSpecName "kube-api-access-rzr6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.381478 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" event={"ID":"fec1493f-ac46-4873-a9f5-94217c9cc6ab","Type":"ContainerDied","Data":"30e6e0f45deba8ba55ecb5e8a3ff70b568b729b7a3b94815c376fa92dfdb96ef"} Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.381534 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-654b6f9594-r5jch" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.381562 4971 scope.go:117] "RemoveContainer" containerID="cabb0c694184d2100094480ec456805583c48d414bccadf7db7c45629d4f4e8c" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.416424 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-654b6f9594-r5jch"] Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.420149 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-654b6f9594-r5jch"] Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437148 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc9622c-7440-4bfb-bc10-f670cdabb856-serving-cert\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " 
pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437199 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-config\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437236 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-client-ca\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437255 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-proxy-ca-bundles\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437281 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lx7j\" (UniqueName: \"kubernetes.io/projected/dfc9622c-7440-4bfb-bc10-f670cdabb856-kube-api-access-6lx7j\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437418 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437434 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437444 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fec1493f-ac46-4873-a9f5-94217c9cc6ab-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437457 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzr6m\" (UniqueName: \"kubernetes.io/projected/fec1493f-ac46-4873-a9f5-94217c9cc6ab-kube-api-access-rzr6m\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.437468 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fec1493f-ac46-4873-a9f5-94217c9cc6ab-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.538471 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc9622c-7440-4bfb-bc10-f670cdabb856-serving-cert\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.538523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-config\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 
06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.538563 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-proxy-ca-bundles\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.538593 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-client-ca\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.538652 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lx7j\" (UniqueName: \"kubernetes.io/projected/dfc9622c-7440-4bfb-bc10-f670cdabb856-kube-api-access-6lx7j\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.540333 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-config\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.540419 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-client-ca\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " 
pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.540515 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-proxy-ca-bundles\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.551836 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc9622c-7440-4bfb-bc10-f670cdabb856-serving-cert\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.556381 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lx7j\" (UniqueName: \"kubernetes.io/projected/dfc9622c-7440-4bfb-bc10-f670cdabb856-kube-api-access-6lx7j\") pod \"controller-manager-db9dc86b-d5947\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.617634 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.667596 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.750241 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fec1493f-ac46-4873-a9f5-94217c9cc6ab" path="/var/lib/kubelet/pods/fec1493f-ac46-4873-a9f5-94217c9cc6ab/volumes" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.848942 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c57e69da-fb86-40f0-bbb9-e7560725c9c0-serving-cert\") pod \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.849104 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-client-ca\") pod \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.849150 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7xmg\" (UniqueName: \"kubernetes.io/projected/c57e69da-fb86-40f0-bbb9-e7560725c9c0-kube-api-access-j7xmg\") pod \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.849219 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-config\") pod \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\" (UID: \"c57e69da-fb86-40f0-bbb9-e7560725c9c0\") " Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.850264 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-client-ca" (OuterVolumeSpecName: 
"client-ca") pod "c57e69da-fb86-40f0-bbb9-e7560725c9c0" (UID: "c57e69da-fb86-40f0-bbb9-e7560725c9c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.850347 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-config" (OuterVolumeSpecName: "config") pod "c57e69da-fb86-40f0-bbb9-e7560725c9c0" (UID: "c57e69da-fb86-40f0-bbb9-e7560725c9c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.855677 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57e69da-fb86-40f0-bbb9-e7560725c9c0-kube-api-access-j7xmg" (OuterVolumeSpecName: "kube-api-access-j7xmg") pod "c57e69da-fb86-40f0-bbb9-e7560725c9c0" (UID: "c57e69da-fb86-40f0-bbb9-e7560725c9c0"). InnerVolumeSpecName "kube-api-access-j7xmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.856784 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c57e69da-fb86-40f0-bbb9-e7560725c9c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c57e69da-fb86-40f0-bbb9-e7560725c9c0" (UID: "c57e69da-fb86-40f0-bbb9-e7560725c9c0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.951832 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c57e69da-fb86-40f0-bbb9-e7560725c9c0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.951889 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.951911 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7xmg\" (UniqueName: \"kubernetes.io/projected/c57e69da-fb86-40f0-bbb9-e7560725c9c0-kube-api-access-j7xmg\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:54 crc kubenswrapper[4971]: I0320 06:53:54.951933 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c57e69da-fb86-40f0-bbb9-e7560725c9c0-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:53:55 crc kubenswrapper[4971]: I0320 06:53:55.389163 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" event={"ID":"c57e69da-fb86-40f0-bbb9-e7560725c9c0","Type":"ContainerDied","Data":"29a33c62006db79c1434ba07177bd22372ee6d2657a342133b8ac084cb28d9ca"} Mar 20 06:53:55 crc kubenswrapper[4971]: I0320 06:53:55.389272 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp" Mar 20 06:53:55 crc kubenswrapper[4971]: I0320 06:53:55.421936 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp"] Mar 20 06:53:55 crc kubenswrapper[4971]: I0320 06:53:55.424309 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-547f97b769-ftxsp"] Mar 20 06:53:55 crc kubenswrapper[4971]: E0320 06:53:55.886794 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 06:53:55 crc kubenswrapper[4971]: E0320 06:53:55.886991 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 06:53:55 crc kubenswrapper[4971]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 06:53:55 crc kubenswrapper[4971]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fwclq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
auto-csr-approver-29566492-86snb_openshift-infra(e766c0e7-ca96-4f32-9e14-81a3d5bbd389): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 06:53:55 crc kubenswrapper[4971]: > logger="UnhandledError" Mar 20 06:53:55 crc kubenswrapper[4971]: E0320 06:53:55.888174 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566492-86snb" podUID="e766c0e7-ca96-4f32-9e14-81a3d5bbd389" Mar 20 06:53:56 crc kubenswrapper[4971]: E0320 06:53:56.397444 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566492-86snb" podUID="e766c0e7-ca96-4f32-9e14-81a3d5bbd389" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.749302 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57e69da-fb86-40f0-bbb9-e7560725c9c0" path="/var/lib/kubelet/pods/c57e69da-fb86-40f0-bbb9-e7560725c9c0/volumes" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.752121 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv"] Mar 20 06:53:56 crc kubenswrapper[4971]: E0320 06:53:56.752622 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57e69da-fb86-40f0-bbb9-e7560725c9c0" containerName="route-controller-manager" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.752644 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57e69da-fb86-40f0-bbb9-e7560725c9c0" containerName="route-controller-manager" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.752868 4971 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c57e69da-fb86-40f0-bbb9-e7560725c9c0" containerName="route-controller-manager" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.753650 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.755957 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.756012 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv"] Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.757990 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.759225 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.759323 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.760322 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.761235 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.775130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjjm\" (UniqueName: \"kubernetes.io/projected/8b2bfdea-6d89-4408-a110-0102e356f9f0-kube-api-access-ktjjm\") pod 
\"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.775213 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-config\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.775239 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b2bfdea-6d89-4408-a110-0102e356f9f0-serving-cert\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.775268 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-client-ca\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.876890 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-config\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 
06:53:56.876947 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b2bfdea-6d89-4408-a110-0102e356f9f0-serving-cert\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.876979 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-client-ca\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.877366 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjjm\" (UniqueName: \"kubernetes.io/projected/8b2bfdea-6d89-4408-a110-0102e356f9f0-kube-api-access-ktjjm\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.878657 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-config\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.878822 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-client-ca\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: 
\"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.895712 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjjm\" (UniqueName: \"kubernetes.io/projected/8b2bfdea-6d89-4408-a110-0102e356f9f0-kube-api-access-ktjjm\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:56 crc kubenswrapper[4971]: I0320 06:53:56.904287 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b2bfdea-6d89-4408-a110-0102e356f9f0-serving-cert\") pod \"route-controller-manager-6cbffbc499-dzzjv\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:57 crc kubenswrapper[4971]: I0320 06:53:57.082923 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:53:57 crc kubenswrapper[4971]: I0320 06:53:57.906996 4971 ???:1] "http: TLS handshake error from 192.168.126.11:37928: no serving certificate available for the kubelet" Mar 20 06:53:58 crc kubenswrapper[4971]: I0320 06:53:58.603011 4971 scope.go:117] "RemoveContainer" containerID="ef6967630408e8111a105c6b4398b2d0ed9f6fbe1b61b457a192c516e89b56b8" Mar 20 06:54:00 crc kubenswrapper[4971]: I0320 06:54:00.141231 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566494-7drpt"] Mar 20 06:54:00 crc kubenswrapper[4971]: I0320 06:54:00.143143 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566494-7drpt" Mar 20 06:54:00 crc kubenswrapper[4971]: I0320 06:54:00.143939 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566494-7drpt"] Mar 20 06:54:00 crc kubenswrapper[4971]: I0320 06:54:00.145652 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 06:54:00 crc kubenswrapper[4971]: I0320 06:54:00.226788 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh5pl\" (UniqueName: \"kubernetes.io/projected/38760c08-459a-4637-92a1-a1c2701532a9-kube-api-access-vh5pl\") pod \"auto-csr-approver-29566494-7drpt\" (UID: \"38760c08-459a-4637-92a1-a1c2701532a9\") " pod="openshift-infra/auto-csr-approver-29566494-7drpt" Mar 20 06:54:00 crc kubenswrapper[4971]: I0320 06:54:00.328398 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh5pl\" (UniqueName: \"kubernetes.io/projected/38760c08-459a-4637-92a1-a1c2701532a9-kube-api-access-vh5pl\") pod \"auto-csr-approver-29566494-7drpt\" (UID: \"38760c08-459a-4637-92a1-a1c2701532a9\") " pod="openshift-infra/auto-csr-approver-29566494-7drpt" Mar 20 06:54:00 crc kubenswrapper[4971]: I0320 06:54:00.348078 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh5pl\" (UniqueName: \"kubernetes.io/projected/38760c08-459a-4637-92a1-a1c2701532a9-kube-api-access-vh5pl\") pod \"auto-csr-approver-29566494-7drpt\" (UID: \"38760c08-459a-4637-92a1-a1c2701532a9\") " pod="openshift-infra/auto-csr-approver-29566494-7drpt" Mar 20 06:54:00 crc kubenswrapper[4971]: I0320 06:54:00.463049 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566494-7drpt" Mar 20 06:54:02 crc kubenswrapper[4971]: I0320 06:54:02.487182 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kcw47" Mar 20 06:54:04 crc kubenswrapper[4971]: E0320 06:54:04.105680 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 06:54:04 crc kubenswrapper[4971]: E0320 06:54:04.105921 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45bwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9l5fd_openshift-marketplace(7cb5ccdd-a645-458a-9961-461bd99c3c79): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 06:54:04 crc kubenswrapper[4971]: E0320 06:54:04.107175 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9l5fd" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" Mar 20 06:54:04 crc kubenswrapper[4971]: E0320 06:54:04.120508 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 06:54:04 crc kubenswrapper[4971]: E0320 06:54:04.120708 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xc78s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6b8lj_openshift-marketplace(ff1d12ba-e1ee-4cbc-8c8e-102269e47f75): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 06:54:04 crc kubenswrapper[4971]: E0320 06:54:04.122169 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6b8lj" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" Mar 20 06:54:04 crc 
kubenswrapper[4971]: I0320 06:54:04.533413 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.536038 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.555504 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.557045 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.560335 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.652952 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv"] Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.657824 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db9dc86b-d5947"] Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.704735 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90d64abe-32fb-448b-9d44-2c799495d9f7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90d64abe-32fb-448b-9d44-2c799495d9f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.704812 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90d64abe-32fb-448b-9d44-2c799495d9f7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"90d64abe-32fb-448b-9d44-2c799495d9f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.806512 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90d64abe-32fb-448b-9d44-2c799495d9f7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90d64abe-32fb-448b-9d44-2c799495d9f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.806595 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90d64abe-32fb-448b-9d44-2c799495d9f7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90d64abe-32fb-448b-9d44-2c799495d9f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.806711 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90d64abe-32fb-448b-9d44-2c799495d9f7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90d64abe-32fb-448b-9d44-2c799495d9f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.831052 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90d64abe-32fb-448b-9d44-2c799495d9f7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90d64abe-32fb-448b-9d44-2c799495d9f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:04 crc kubenswrapper[4971]: I0320 06:54:04.886254 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:05 crc kubenswrapper[4971]: I0320 06:54:05.699080 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 06:54:08 crc kubenswrapper[4971]: E0320 06:54:08.561398 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6b8lj" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" Mar 20 06:54:08 crc kubenswrapper[4971]: E0320 06:54:08.561706 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9l5fd" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" Mar 20 06:54:08 crc kubenswrapper[4971]: E0320 06:54:08.711072 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 06:54:08 crc kubenswrapper[4971]: E0320 06:54:08.711704 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbsc7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zj9pn_openshift-marketplace(e7552f5a-9823-42db-9766-437726d2c7bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 06:54:08 crc kubenswrapper[4971]: E0320 06:54:08.712907 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zj9pn" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" Mar 20 06:54:08 crc 
kubenswrapper[4971]: E0320 06:54:08.744348 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 06:54:08 crc kubenswrapper[4971]: E0320 06:54:08.744813 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4mz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-4w7kq_openshift-marketplace(0a637a4b-c5ec-46a5-8b5c-84a56a112ec4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 06:54:08 crc kubenswrapper[4971]: E0320 06:54:08.746093 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4w7kq" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.031146 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6vl68"] Mar 20 06:54:09 crc kubenswrapper[4971]: W0320 06:54:09.041366 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac992f1_bd05_471f_a101_cf14466e15e8.slice/crio-73b0e2c0c053541b21c090e4fd29ddedde23c86ea5a630ff85bcb3fe44357875 WatchSource:0}: Error finding container 73b0e2c0c053541b21c090e4fd29ddedde23c86ea5a630ff85bcb3fe44357875: Status 404 returned error can't find the container with id 73b0e2c0c053541b21c090e4fd29ddedde23c86ea5a630ff85bcb3fe44357875 Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.126906 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv"] Mar 20 06:54:09 crc kubenswrapper[4971]: W0320 06:54:09.138285 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2bfdea_6d89_4408_a110_0102e356f9f0.slice/crio-f6d3f511cccc87af4187a23eb4a920bfc395fab6bc678210ec06a73378ed75b2 WatchSource:0}: Error finding container f6d3f511cccc87af4187a23eb4a920bfc395fab6bc678210ec06a73378ed75b2: Status 404 returned error can't 
find the container with id f6d3f511cccc87af4187a23eb4a920bfc395fab6bc678210ec06a73378ed75b2 Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.147996 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db9dc86b-d5947"] Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.200115 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.214460 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566494-7drpt"] Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.482191 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"90d64abe-32fb-448b-9d44-2c799495d9f7","Type":"ContainerStarted","Data":"7232460cae0bdc68b6ffda9b6e18a9fd560056c585b3d96849ef6daae4997dfa"} Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.485082 4971 generic.go:334] "Generic (PLEG): container finished" podID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerID="556c498a26b4cd281aca3104fc11b7597318fd39b653dcbd18f5fdf4b79c6f79" exitCode=0 Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.485202 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9l2k" event={"ID":"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac","Type":"ContainerDied","Data":"556c498a26b4cd281aca3104fc11b7597318fd39b653dcbd18f5fdf4b79c6f79"} Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.487415 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566494-7drpt" event={"ID":"38760c08-459a-4637-92a1-a1c2701532a9","Type":"ContainerStarted","Data":"4862b73dac4626105ced262c0196dd6f5f58f95783d85ee2ff1fbdc547206bed"} Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.491141 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerID="ce2d0c3726e49ca28bc5f094197cb05090df4fbb83ab88fd1eb8077a76316a38" exitCode=0 Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.491263 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlhdf" event={"ID":"6fbc771d-6009-4ab3-85e4-963d26ee459e","Type":"ContainerDied","Data":"ce2d0c3726e49ca28bc5f094197cb05090df4fbb83ab88fd1eb8077a76316a38"} Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.492385 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" event={"ID":"dfc9622c-7440-4bfb-bc10-f670cdabb856","Type":"ContainerStarted","Data":"bd8189c8ecd6cef4ca751bb133012b9b46320982616e20dca1079a009f52607f"} Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.496866 4971 generic.go:334] "Generic (PLEG): container finished" podID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerID="a19aa3fbc9b4daa4dde72bfb37611062726d118915be79bcaef422bbb8a3b2c3" exitCode=0 Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.496974 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbf" event={"ID":"264fdafe-b375-48d7-95d7-62d72c752b5d","Type":"ContainerDied","Data":"a19aa3fbc9b4daa4dde72bfb37611062726d118915be79bcaef422bbb8a3b2c3"} Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.502387 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vl68" event={"ID":"6ac992f1-bd05-471f-a101-cf14466e15e8","Type":"ContainerStarted","Data":"73b0e2c0c053541b21c090e4fd29ddedde23c86ea5a630ff85bcb3fe44357875"} Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.508405 4971 generic.go:334] "Generic (PLEG): container finished" podID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerID="24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b" exitCode=0 Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 
06:54:09.508460 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25c82" event={"ID":"4c6b8af5-8d8d-4643-8815-9ed0526f04e1","Type":"ContainerDied","Data":"24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b"} Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.519317 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" event={"ID":"8b2bfdea-6d89-4408-a110-0102e356f9f0","Type":"ContainerStarted","Data":"f6d3f511cccc87af4187a23eb4a920bfc395fab6bc678210ec06a73378ed75b2"} Mar 20 06:54:09 crc kubenswrapper[4971]: E0320 06:54:09.522002 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zj9pn" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" Mar 20 06:54:09 crc kubenswrapper[4971]: E0320 06:54:09.522225 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4w7kq" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.935284 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.937808 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:09 crc kubenswrapper[4971]: I0320 06:54:09.981251 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.011839 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.011977 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-var-lock\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.012015 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.113514 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.113615 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.113694 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-var-lock\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.113699 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.113793 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-var-lock\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.142969 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.283537 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.528561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" event={"ID":"8b2bfdea-6d89-4408-a110-0102e356f9f0","Type":"ContainerStarted","Data":"96430d829b29df8241e431cef0262f5c1d77a114a28267a66c57cdda6254d6d7"} Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.528768 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" podUID="8b2bfdea-6d89-4408-a110-0102e356f9f0" containerName="route-controller-manager" containerID="cri-o://96430d829b29df8241e431cef0262f5c1d77a114a28267a66c57cdda6254d6d7" gracePeriod=30 Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.528979 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.531878 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" event={"ID":"dfc9622c-7440-4bfb-bc10-f670cdabb856","Type":"ContainerStarted","Data":"a778e4441a5a8037e7dbf4102df50b8edf9b491bfefbf2ac32a96bd41a38afc9"} Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.532021 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" podUID="dfc9622c-7440-4bfb-bc10-f670cdabb856" containerName="controller-manager" containerID="cri-o://a778e4441a5a8037e7dbf4102df50b8edf9b491bfefbf2ac32a96bd41a38afc9" gracePeriod=30 Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.532474 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:54:10 crc 
kubenswrapper[4971]: I0320 06:54:10.534148 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.541108 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.543307 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"90d64abe-32fb-448b-9d44-2c799495d9f7","Type":"ContainerStarted","Data":"8814f0b23e41b12b0a71c8874755404d644f91ad9cf873ab64adc265fd5737b9"} Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.545666 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vl68" event={"ID":"6ac992f1-bd05-471f-a101-cf14466e15e8","Type":"ContainerStarted","Data":"5034193b918dfbfcb3a94f3bfaa69ad904ed462a8f594832f95645da22238703"} Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.545706 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vl68" event={"ID":"6ac992f1-bd05-471f-a101-cf14466e15e8","Type":"ContainerStarted","Data":"34e88ff5a18338cfba674e784ed8f04c6843fc2e23ee22773c5e05ca02c0ca21"} Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.555939 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" podStartSLOduration=26.5559168 podStartE2EDuration="26.5559168s" podCreationTimestamp="2026-03-20 06:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:10.553932882 +0000 UTC m=+272.533807040" watchObservedRunningTime="2026-03-20 06:54:10.5559168 +0000 UTC m=+272.535790938" Mar 20 06:54:10 crc 
kubenswrapper[4971]: I0320 06:54:10.618677 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" podStartSLOduration=26.61865137 podStartE2EDuration="26.61865137s" podCreationTimestamp="2026-03-20 06:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:10.615428406 +0000 UTC m=+272.595302564" watchObservedRunningTime="2026-03-20 06:54:10.61865137 +0000 UTC m=+272.598525528" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.636542 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=6.636515958 podStartE2EDuration="6.636515958s" podCreationTimestamp="2026-03-20 06:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:10.63210657 +0000 UTC m=+272.611980698" watchObservedRunningTime="2026-03-20 06:54:10.636515958 +0000 UTC m=+272.616390096" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.694822 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6vl68" podStartSLOduration=219.694804539 podStartE2EDuration="3m39.694804539s" podCreationTimestamp="2026-03-20 06:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:10.691521734 +0000 UTC m=+272.671395872" watchObservedRunningTime="2026-03-20 06:54:10.694804539 +0000 UTC m=+272.674678677" Mar 20 06:54:10 crc kubenswrapper[4971]: I0320 06:54:10.763274 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-g2c5c"] Mar 20 06:54:10 crc kubenswrapper[4971]: E0320 06:54:10.821761 4971 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfc9622c_7440_4bfb_bc10_f670cdabb856.slice/crio-conmon-a778e4441a5a8037e7dbf4102df50b8edf9b491bfefbf2ac32a96bd41a38afc9.scope\": RecentStats: unable to find data in memory cache]" Mar 20 06:54:11 crc kubenswrapper[4971]: I0320 06:54:11.554575 4971 generic.go:334] "Generic (PLEG): container finished" podID="90d64abe-32fb-448b-9d44-2c799495d9f7" containerID="8814f0b23e41b12b0a71c8874755404d644f91ad9cf873ab64adc265fd5737b9" exitCode=0 Mar 20 06:54:11 crc kubenswrapper[4971]: I0320 06:54:11.554738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"90d64abe-32fb-448b-9d44-2c799495d9f7","Type":"ContainerDied","Data":"8814f0b23e41b12b0a71c8874755404d644f91ad9cf873ab64adc265fd5737b9"} Mar 20 06:54:11 crc kubenswrapper[4971]: I0320 06:54:11.557333 4971 generic.go:334] "Generic (PLEG): container finished" podID="8b2bfdea-6d89-4408-a110-0102e356f9f0" containerID="96430d829b29df8241e431cef0262f5c1d77a114a28267a66c57cdda6254d6d7" exitCode=0 Mar 20 06:54:11 crc kubenswrapper[4971]: I0320 06:54:11.557388 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" event={"ID":"8b2bfdea-6d89-4408-a110-0102e356f9f0","Type":"ContainerDied","Data":"96430d829b29df8241e431cef0262f5c1d77a114a28267a66c57cdda6254d6d7"} Mar 20 06:54:11 crc kubenswrapper[4971]: I0320 06:54:11.560271 4971 generic.go:334] "Generic (PLEG): container finished" podID="dfc9622c-7440-4bfb-bc10-f670cdabb856" containerID="a778e4441a5a8037e7dbf4102df50b8edf9b491bfefbf2ac32a96bd41a38afc9" exitCode=0 Mar 20 06:54:11 crc kubenswrapper[4971]: I0320 06:54:11.560341 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" 
event={"ID":"dfc9622c-7440-4bfb-bc10-f670cdabb856","Type":"ContainerDied","Data":"a778e4441a5a8037e7dbf4102df50b8edf9b491bfefbf2ac32a96bd41a38afc9"} Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.408350 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.441440 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg"] Mar 20 06:54:12 crc kubenswrapper[4971]: E0320 06:54:12.441804 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2bfdea-6d89-4408-a110-0102e356f9f0" containerName="route-controller-manager" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.441820 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2bfdea-6d89-4408-a110-0102e356f9f0" containerName="route-controller-manager" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.441973 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2bfdea-6d89-4408-a110-0102e356f9f0" containerName="route-controller-manager" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.442982 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.457824 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-client-ca\") pod \"8b2bfdea-6d89-4408-a110-0102e356f9f0\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.457901 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b2bfdea-6d89-4408-a110-0102e356f9f0-serving-cert\") pod \"8b2bfdea-6d89-4408-a110-0102e356f9f0\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.457939 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjjm\" (UniqueName: \"kubernetes.io/projected/8b2bfdea-6d89-4408-a110-0102e356f9f0-kube-api-access-ktjjm\") pod \"8b2bfdea-6d89-4408-a110-0102e356f9f0\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.458033 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-config\") pod \"8b2bfdea-6d89-4408-a110-0102e356f9f0\" (UID: \"8b2bfdea-6d89-4408-a110-0102e356f9f0\") " Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.458232 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-config\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 
06:54:12.458277 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzc9n\" (UniqueName: \"kubernetes.io/projected/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-kube-api-access-xzc9n\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.458337 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-client-ca\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.458357 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-serving-cert\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.460060 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "8b2bfdea-6d89-4408-a110-0102e356f9f0" (UID: "8b2bfdea-6d89-4408-a110-0102e356f9f0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.460114 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-config" (OuterVolumeSpecName: "config") pod "8b2bfdea-6d89-4408-a110-0102e356f9f0" (UID: "8b2bfdea-6d89-4408-a110-0102e356f9f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.462644 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg"] Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.467098 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2bfdea-6d89-4408-a110-0102e356f9f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8b2bfdea-6d89-4408-a110-0102e356f9f0" (UID: "8b2bfdea-6d89-4408-a110-0102e356f9f0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.474079 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b2bfdea-6d89-4408-a110-0102e356f9f0-kube-api-access-ktjjm" (OuterVolumeSpecName: "kube-api-access-ktjjm") pod "8b2bfdea-6d89-4408-a110-0102e356f9f0" (UID: "8b2bfdea-6d89-4408-a110-0102e356f9f0"). InnerVolumeSpecName "kube-api-access-ktjjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.559894 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-client-ca\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.559982 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-serving-cert\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.560058 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-config\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.560126 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzc9n\" (UniqueName: \"kubernetes.io/projected/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-kube-api-access-xzc9n\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.560220 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.560247 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b2bfdea-6d89-4408-a110-0102e356f9f0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.560267 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b2bfdea-6d89-4408-a110-0102e356f9f0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.560286 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjjm\" (UniqueName: \"kubernetes.io/projected/8b2bfdea-6d89-4408-a110-0102e356f9f0-kube-api-access-ktjjm\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.563037 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-client-ca\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.563288 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-config\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.569428 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-serving-cert\") pod 
\"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.578484 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.585799 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv" event={"ID":"8b2bfdea-6d89-4408-a110-0102e356f9f0","Type":"ContainerDied","Data":"f6d3f511cccc87af4187a23eb4a920bfc395fab6bc678210ec06a73378ed75b2"} Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.585905 4971 scope.go:117] "RemoveContainer" containerID="96430d829b29df8241e431cef0262f5c1d77a114a28267a66c57cdda6254d6d7" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.591834 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzc9n\" (UniqueName: \"kubernetes.io/projected/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-kube-api-access-xzc9n\") pod \"route-controller-manager-79f687cf94-6pkxg\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.634401 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv"] Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.638483 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbffbc499-dzzjv"] Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.743541 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b2bfdea-6d89-4408-a110-0102e356f9f0" 
path="/var/lib/kubelet/pods/8b2bfdea-6d89-4408-a110-0102e356f9f0/volumes" Mar 20 06:54:12 crc kubenswrapper[4971]: I0320 06:54:12.799127 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.194202 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.200899 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.271123 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-proxy-ca-bundles\") pod \"dfc9622c-7440-4bfb-bc10-f670cdabb856\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.271995 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-config\") pod \"dfc9622c-7440-4bfb-bc10-f670cdabb856\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272027 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90d64abe-32fb-448b-9d44-2c799495d9f7-kubelet-dir\") pod \"90d64abe-32fb-448b-9d44-2c799495d9f7\" (UID: \"90d64abe-32fb-448b-9d44-2c799495d9f7\") " Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272084 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lx7j\" (UniqueName: 
\"kubernetes.io/projected/dfc9622c-7440-4bfb-bc10-f670cdabb856-kube-api-access-6lx7j\") pod \"dfc9622c-7440-4bfb-bc10-f670cdabb856\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272156 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-client-ca\") pod \"dfc9622c-7440-4bfb-bc10-f670cdabb856\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272184 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90d64abe-32fb-448b-9d44-2c799495d9f7-kube-api-access\") pod \"90d64abe-32fb-448b-9d44-2c799495d9f7\" (UID: \"90d64abe-32fb-448b-9d44-2c799495d9f7\") " Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272304 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc9622c-7440-4bfb-bc10-f670cdabb856-serving-cert\") pod \"dfc9622c-7440-4bfb-bc10-f670cdabb856\" (UID: \"dfc9622c-7440-4bfb-bc10-f670cdabb856\") " Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272305 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dfc9622c-7440-4bfb-bc10-f670cdabb856" (UID: "dfc9622c-7440-4bfb-bc10-f670cdabb856"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272352 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90d64abe-32fb-448b-9d44-2c799495d9f7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "90d64abe-32fb-448b-9d44-2c799495d9f7" (UID: "90d64abe-32fb-448b-9d44-2c799495d9f7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272876 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-config" (OuterVolumeSpecName: "config") pod "dfc9622c-7440-4bfb-bc10-f670cdabb856" (UID: "dfc9622c-7440-4bfb-bc10-f670cdabb856"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.272948 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-client-ca" (OuterVolumeSpecName: "client-ca") pod "dfc9622c-7440-4bfb-bc10-f670cdabb856" (UID: "dfc9622c-7440-4bfb-bc10-f670cdabb856"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.279173 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d64abe-32fb-448b-9d44-2c799495d9f7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "90d64abe-32fb-448b-9d44-2c799495d9f7" (UID: "90d64abe-32fb-448b-9d44-2c799495d9f7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.280880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc9622c-7440-4bfb-bc10-f670cdabb856-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dfc9622c-7440-4bfb-bc10-f670cdabb856" (UID: "dfc9622c-7440-4bfb-bc10-f670cdabb856"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.282727 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc9622c-7440-4bfb-bc10-f670cdabb856-kube-api-access-6lx7j" (OuterVolumeSpecName: "kube-api-access-6lx7j") pod "dfc9622c-7440-4bfb-bc10-f670cdabb856" (UID: "dfc9622c-7440-4bfb-bc10-f670cdabb856"). InnerVolumeSpecName "kube-api-access-6lx7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.373844 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.373880 4971 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90d64abe-32fb-448b-9d44-2c799495d9f7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.373896 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lx7j\" (UniqueName: \"kubernetes.io/projected/dfc9622c-7440-4bfb-bc10-f670cdabb856-kube-api-access-6lx7j\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.373907 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-client-ca\") on node \"crc\" DevicePath 
\"\"" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.373917 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90d64abe-32fb-448b-9d44-2c799495d9f7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.373926 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc9622c-7440-4bfb-bc10-f670cdabb856-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.373936 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dfc9622c-7440-4bfb-bc10-f670cdabb856-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.407458 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 06:54:13 crc kubenswrapper[4971]: W0320 06:54:13.460798 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1c00f1ed_a0fb_4e01_b022_b9fb5578ce33.slice/crio-c1ad2548bbd0ae81cf6a179c3fa8947c7d9c7015e8c73c7cd58e8e1f1fdbb5b9 WatchSource:0}: Error finding container c1ad2548bbd0ae81cf6a179c3fa8947c7d9c7015e8c73c7cd58e8e1f1fdbb5b9: Status 404 returned error can't find the container with id c1ad2548bbd0ae81cf6a179c3fa8947c7d9c7015e8c73c7cd58e8e1f1fdbb5b9 Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.516761 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg"] Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.588823 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.588854 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"90d64abe-32fb-448b-9d44-2c799495d9f7","Type":"ContainerDied","Data":"7232460cae0bdc68b6ffda9b6e18a9fd560056c585b3d96849ef6daae4997dfa"} Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.588912 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7232460cae0bdc68b6ffda9b6e18a9fd560056c585b3d96849ef6daae4997dfa" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.606265 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9l2k" event={"ID":"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac","Type":"ContainerStarted","Data":"116fedb9b26edd4b4aacd6e32cd947751c32a64fff4e5df1ffe669e760f2ad5e"} Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.609075 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33","Type":"ContainerStarted","Data":"c1ad2548bbd0ae81cf6a179c3fa8947c7d9c7015e8c73c7cd58e8e1f1fdbb5b9"} Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.613143 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25c82" event={"ID":"4c6b8af5-8d8d-4643-8815-9ed0526f04e1","Type":"ContainerStarted","Data":"1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5"} Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.615249 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" event={"ID":"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb","Type":"ContainerStarted","Data":"2e8dcec68cfc5ea085689ae7c10781de0259c9ed49726307811a63f899bada8e"} Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.618231 
4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" event={"ID":"dfc9622c-7440-4bfb-bc10-f670cdabb856","Type":"ContainerDied","Data":"bd8189c8ecd6cef4ca751bb133012b9b46320982616e20dca1079a009f52607f"} Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.618273 4971 scope.go:117] "RemoveContainer" containerID="a778e4441a5a8037e7dbf4102df50b8edf9b491bfefbf2ac32a96bd41a38afc9" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.618366 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db9dc86b-d5947" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.622979 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbf" event={"ID":"264fdafe-b375-48d7-95d7-62d72c752b5d","Type":"ContainerStarted","Data":"f7cf8472aed6728bae861e17601f80f11b0faea13807991178f4f3913b3a2e6c"} Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.631770 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v9l2k" podStartSLOduration=2.906723741 podStartE2EDuration="44.631745756s" podCreationTimestamp="2026-03-20 06:53:29 +0000 UTC" firstStartedPulling="2026-03-20 06:53:31.496970515 +0000 UTC m=+233.476844653" lastFinishedPulling="2026-03-20 06:54:13.22199253 +0000 UTC m=+275.201866668" observedRunningTime="2026-03-20 06:54:13.627280357 +0000 UTC m=+275.607154515" watchObservedRunningTime="2026-03-20 06:54:13.631745756 +0000 UTC m=+275.611619904" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.649988 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2ljbf" podStartSLOduration=4.075058051 podStartE2EDuration="45.649966355s" podCreationTimestamp="2026-03-20 06:53:28 +0000 UTC" firstStartedPulling="2026-03-20 06:53:31.569221727 +0000 UTC 
m=+233.549095865" lastFinishedPulling="2026-03-20 06:54:13.144130031 +0000 UTC m=+275.124004169" observedRunningTime="2026-03-20 06:54:13.647300798 +0000 UTC m=+275.627174936" watchObservedRunningTime="2026-03-20 06:54:13.649966355 +0000 UTC m=+275.629840493" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.688039 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-25c82" podStartSLOduration=3.063637239 podStartE2EDuration="44.688005659s" podCreationTimestamp="2026-03-20 06:53:29 +0000 UTC" firstStartedPulling="2026-03-20 06:53:31.597826505 +0000 UTC m=+233.577700643" lastFinishedPulling="2026-03-20 06:54:13.222194895 +0000 UTC m=+275.202069063" observedRunningTime="2026-03-20 06:54:13.682984843 +0000 UTC m=+275.662858981" watchObservedRunningTime="2026-03-20 06:54:13.688005659 +0000 UTC m=+275.667879797" Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.723228 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db9dc86b-d5947"] Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.726479 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-db9dc86b-d5947"] Mar 20 06:54:13 crc kubenswrapper[4971]: I0320 06:54:13.742192 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zlhdf" podStartSLOduration=2.954092268 podStartE2EDuration="44.742128909s" podCreationTimestamp="2026-03-20 06:53:29 +0000 UTC" firstStartedPulling="2026-03-20 06:53:31.479812499 +0000 UTC m=+233.459686637" lastFinishedPulling="2026-03-20 06:54:13.26784914 +0000 UTC m=+275.247723278" observedRunningTime="2026-03-20 06:54:13.741103009 +0000 UTC m=+275.720977147" watchObservedRunningTime="2026-03-20 06:54:13.742128909 +0000 UTC m=+275.722003047" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.295290 4971 csr.go:261] certificate signing 
request csr-4xgj5 is approved, waiting to be issued Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.303552 4971 csr.go:257] certificate signing request csr-4xgj5 is issued Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.637946 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlhdf" event={"ID":"6fbc771d-6009-4ab3-85e4-963d26ee459e","Type":"ContainerStarted","Data":"f76645e222a80d7d8b0d91650d756744d53a0d743e4852b48a11b700a398b325"} Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.642419 4971 generic.go:334] "Generic (PLEG): container finished" podID="e766c0e7-ca96-4f32-9e14-81a3d5bbd389" containerID="be0670ae7db133b8af1e304dc2821ff335cb40c1560ca24d5cbf0aa96842e67f" exitCode=0 Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.642475 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566492-86snb" event={"ID":"e766c0e7-ca96-4f32-9e14-81a3d5bbd389","Type":"ContainerDied","Data":"be0670ae7db133b8af1e304dc2821ff335cb40c1560ca24d5cbf0aa96842e67f"} Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.644359 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33","Type":"ContainerStarted","Data":"c3cb0ace7084d810aa56ee4d522852fdba1926beacf1de4ce3fee2b832ed6b8b"} Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.646125 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566494-7drpt" event={"ID":"38760c08-459a-4637-92a1-a1c2701532a9","Type":"ContainerStarted","Data":"87b1cbf2c46265829095cd0cec5e8fd3b8fd0923653ce3d06b332ca892c8fcc1"} Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.647931 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" 
event={"ID":"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb","Type":"ContainerStarted","Data":"2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a"} Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.649154 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.655007 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.695110 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.695086337 podStartE2EDuration="5.695086337s" podCreationTimestamp="2026-03-20 06:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:14.687775905 +0000 UTC m=+276.667650053" watchObservedRunningTime="2026-03-20 06:54:14.695086337 +0000 UTC m=+276.674960475" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.731567 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566494-7drpt" podStartSLOduration=10.725479939 podStartE2EDuration="14.731544244s" podCreationTimestamp="2026-03-20 06:54:00 +0000 UTC" firstStartedPulling="2026-03-20 06:54:09.26368302 +0000 UTC m=+271.243557158" lastFinishedPulling="2026-03-20 06:54:13.269747325 +0000 UTC m=+275.249621463" observedRunningTime="2026-03-20 06:54:14.716197489 +0000 UTC m=+276.696071627" watchObservedRunningTime="2026-03-20 06:54:14.731544244 +0000 UTC m=+276.711418383" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.732878 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" podStartSLOduration=10.732870283 podStartE2EDuration="10.732870283s" podCreationTimestamp="2026-03-20 06:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:14.732195163 +0000 UTC m=+276.712069291" watchObservedRunningTime="2026-03-20 06:54:14.732870283 +0000 UTC m=+276.712744421" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.740843 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc9622c-7440-4bfb-bc10-f670cdabb856" path="/var/lib/kubelet/pods/dfc9622c-7440-4bfb-bc10-f670cdabb856/volumes" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.763180 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d77494bff-976qk"] Mar 20 06:54:14 crc kubenswrapper[4971]: E0320 06:54:14.763645 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc9622c-7440-4bfb-bc10-f670cdabb856" containerName="controller-manager" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.763740 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc9622c-7440-4bfb-bc10-f670cdabb856" containerName="controller-manager" Mar 20 06:54:14 crc kubenswrapper[4971]: E0320 06:54:14.763829 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d64abe-32fb-448b-9d44-2c799495d9f7" containerName="pruner" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.763884 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d64abe-32fb-448b-9d44-2c799495d9f7" containerName="pruner" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.764046 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc9622c-7440-4bfb-bc10-f670cdabb856" containerName="controller-manager" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.764136 4971 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="90d64abe-32fb-448b-9d44-2c799495d9f7" containerName="pruner" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.764698 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.780086 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.780700 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.780972 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.781133 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.781335 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.782006 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.785493 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d77494bff-976qk"] Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.788308 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.902425 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8831fb9e-bc6a-4d43-a537-8152456ede10-serving-cert\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.902763 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-config\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.903495 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-client-ca\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.903962 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-proxy-ca-bundles\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:14 crc kubenswrapper[4971]: I0320 06:54:14.904074 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986qn\" (UniqueName: \"kubernetes.io/projected/8831fb9e-bc6a-4d43-a537-8152456ede10-kube-api-access-986qn\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" 
Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.005203 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8831fb9e-bc6a-4d43-a537-8152456ede10-serving-cert\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.005262 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-config\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.005285 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-client-ca\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.005315 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-proxy-ca-bundles\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.005345 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986qn\" (UniqueName: \"kubernetes.io/projected/8831fb9e-bc6a-4d43-a537-8152456ede10-kube-api-access-986qn\") pod \"controller-manager-5d77494bff-976qk\" (UID: 
\"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.007096 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-proxy-ca-bundles\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.007133 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-config\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.008073 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-client-ca\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.013691 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8831fb9e-bc6a-4d43-a537-8152456ede10-serving-cert\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.033376 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-986qn\" (UniqueName: 
\"kubernetes.io/projected/8831fb9e-bc6a-4d43-a537-8152456ede10-kube-api-access-986qn\") pod \"controller-manager-5d77494bff-976qk\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.141744 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.305343 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-24 00:54:53.895003396 +0000 UTC Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.305800 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6690h0m38.589208311s for next certificate rotation Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.420690 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d77494bff-976qk"] Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.657110 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" event={"ID":"8831fb9e-bc6a-4d43-a537-8152456ede10","Type":"ContainerStarted","Data":"b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20"} Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.657639 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" event={"ID":"8831fb9e-bc6a-4d43-a537-8152456ede10","Type":"ContainerStarted","Data":"45367a70b8ef78b0112a04ecfa85f2d3f29b5c0e056801539b66328f4edd6e42"} Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.660359 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:15 crc 
kubenswrapper[4971]: I0320 06:54:15.660475 4971 patch_prober.go:28] interesting pod/controller-manager-5d77494bff-976qk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.660515 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" podUID="8831fb9e-bc6a-4d43-a537-8152456ede10" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.662307 4971 generic.go:334] "Generic (PLEG): container finished" podID="38760c08-459a-4637-92a1-a1c2701532a9" containerID="87b1cbf2c46265829095cd0cec5e8fd3b8fd0923653ce3d06b332ca892c8fcc1" exitCode=0 Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.663243 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566494-7drpt" event={"ID":"38760c08-459a-4637-92a1-a1c2701532a9","Type":"ContainerDied","Data":"87b1cbf2c46265829095cd0cec5e8fd3b8fd0923653ce3d06b332ca892c8fcc1"} Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.684095 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" podStartSLOduration=11.684050939 podStartE2EDuration="11.684050939s" podCreationTimestamp="2026-03-20 06:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:15.682810153 +0000 UTC m=+277.662684291" watchObservedRunningTime="2026-03-20 06:54:15.684050939 +0000 UTC m=+277.663925087" Mar 20 06:54:15 crc kubenswrapper[4971]: I0320 06:54:15.970583 4971 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566492-86snb" Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.019960 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwclq\" (UniqueName: \"kubernetes.io/projected/e766c0e7-ca96-4f32-9e14-81a3d5bbd389-kube-api-access-fwclq\") pod \"e766c0e7-ca96-4f32-9e14-81a3d5bbd389\" (UID: \"e766c0e7-ca96-4f32-9e14-81a3d5bbd389\") " Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.030496 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e766c0e7-ca96-4f32-9e14-81a3d5bbd389-kube-api-access-fwclq" (OuterVolumeSpecName: "kube-api-access-fwclq") pod "e766c0e7-ca96-4f32-9e14-81a3d5bbd389" (UID: "e766c0e7-ca96-4f32-9e14-81a3d5bbd389"). InnerVolumeSpecName "kube-api-access-fwclq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.121283 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwclq\" (UniqueName: \"kubernetes.io/projected/e766c0e7-ca96-4f32-9e14-81a3d5bbd389-kube-api-access-fwclq\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.306721 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-15 08:47:04.133802282 +0000 UTC Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.306769 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6481h52m47.827036256s for next certificate rotation Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.672231 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566492-86snb" event={"ID":"e766c0e7-ca96-4f32-9e14-81a3d5bbd389","Type":"ContainerDied","Data":"e71ffb773c4ef6545828c6a07f36769162aa30216f11625a44b300e7e0609f3b"} Mar 20 06:54:16 
crc kubenswrapper[4971]: I0320 06:54:16.672312 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e71ffb773c4ef6545828c6a07f36769162aa30216f11625a44b300e7e0609f3b" Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.672346 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566492-86snb" Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.683910 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:16 crc kubenswrapper[4971]: I0320 06:54:16.978165 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566494-7drpt" Mar 20 06:54:17 crc kubenswrapper[4971]: I0320 06:54:17.036884 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh5pl\" (UniqueName: \"kubernetes.io/projected/38760c08-459a-4637-92a1-a1c2701532a9-kube-api-access-vh5pl\") pod \"38760c08-459a-4637-92a1-a1c2701532a9\" (UID: \"38760c08-459a-4637-92a1-a1c2701532a9\") " Mar 20 06:54:17 crc kubenswrapper[4971]: I0320 06:54:17.041047 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38760c08-459a-4637-92a1-a1c2701532a9-kube-api-access-vh5pl" (OuterVolumeSpecName: "kube-api-access-vh5pl") pod "38760c08-459a-4637-92a1-a1c2701532a9" (UID: "38760c08-459a-4637-92a1-a1c2701532a9"). InnerVolumeSpecName "kube-api-access-vh5pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:17 crc kubenswrapper[4971]: I0320 06:54:17.138788 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh5pl\" (UniqueName: \"kubernetes.io/projected/38760c08-459a-4637-92a1-a1c2701532a9-kube-api-access-vh5pl\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:17 crc kubenswrapper[4971]: I0320 06:54:17.680854 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566494-7drpt" event={"ID":"38760c08-459a-4637-92a1-a1c2701532a9","Type":"ContainerDied","Data":"4862b73dac4626105ced262c0196dd6f5f58f95783d85ee2ff1fbdc547206bed"} Mar 20 06:54:17 crc kubenswrapper[4971]: I0320 06:54:17.680917 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4862b73dac4626105ced262c0196dd6f5f58f95783d85ee2ff1fbdc547206bed" Mar 20 06:54:17 crc kubenswrapper[4971]: I0320 06:54:17.680918 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566494-7drpt" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.214973 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.215023 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.412687 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.413046 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.481754 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.482081 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.613523 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.613581 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.676586 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.774846 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.775001 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.781713 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.835918 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-25c82" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.836927 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-25c82" Mar 20 06:54:19 crc kubenswrapper[4971]: I0320 06:54:19.887565 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-25c82" Mar 20 06:54:20 crc kubenswrapper[4971]: 
I0320 06:54:20.162173 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 06:54:20 crc kubenswrapper[4971]: I0320 06:54:20.162276 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 06:54:20 crc kubenswrapper[4971]: I0320 06:54:20.162354 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:54:20 crc kubenswrapper[4971]: I0320 06:54:20.163308 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 06:54:20 crc kubenswrapper[4971]: I0320 06:54:20.163400 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4" gracePeriod=600 Mar 20 06:54:20 crc kubenswrapper[4971]: I0320 06:54:20.708569 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4" exitCode=0 Mar 
20 06:54:20 crc kubenswrapper[4971]: I0320 06:54:20.708665 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4"} Mar 20 06:54:20 crc kubenswrapper[4971]: I0320 06:54:20.752873 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-25c82" Mar 20 06:54:22 crc kubenswrapper[4971]: I0320 06:54:22.139292 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zlhdf"] Mar 20 06:54:22 crc kubenswrapper[4971]: I0320 06:54:22.142546 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zlhdf" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerName="registry-server" containerID="cri-o://f76645e222a80d7d8b0d91650d756744d53a0d743e4852b48a11b700a398b325" gracePeriod=2 Mar 20 06:54:23 crc kubenswrapper[4971]: I0320 06:54:23.154842 4971 generic.go:334] "Generic (PLEG): container finished" podID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerID="f76645e222a80d7d8b0d91650d756744d53a0d743e4852b48a11b700a398b325" exitCode=0 Mar 20 06:54:23 crc kubenswrapper[4971]: I0320 06:54:23.154983 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlhdf" event={"ID":"6fbc771d-6009-4ab3-85e4-963d26ee459e","Type":"ContainerDied","Data":"f76645e222a80d7d8b0d91650d756744d53a0d743e4852b48a11b700a398b325"} Mar 20 06:54:23 crc kubenswrapper[4971]: I0320 06:54:23.158333 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"ddd443d54f717c3c4871b4cb928359d37429421fd7c4e0ea4b56ed63d3ac0038"} Mar 20 06:54:24 
crc kubenswrapper[4971]: I0320 06:54:24.053876 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.122570 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25c82"] Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.155992 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-utilities\") pod \"6fbc771d-6009-4ab3-85e4-963d26ee459e\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.156083 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h55hg\" (UniqueName: \"kubernetes.io/projected/6fbc771d-6009-4ab3-85e4-963d26ee459e-kube-api-access-h55hg\") pod \"6fbc771d-6009-4ab3-85e4-963d26ee459e\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.156118 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-catalog-content\") pod \"6fbc771d-6009-4ab3-85e4-963d26ee459e\" (UID: \"6fbc771d-6009-4ab3-85e4-963d26ee459e\") " Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.157589 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-utilities" (OuterVolumeSpecName: "utilities") pod "6fbc771d-6009-4ab3-85e4-963d26ee459e" (UID: "6fbc771d-6009-4ab3-85e4-963d26ee459e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.163241 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fbc771d-6009-4ab3-85e4-963d26ee459e-kube-api-access-h55hg" (OuterVolumeSpecName: "kube-api-access-h55hg") pod "6fbc771d-6009-4ab3-85e4-963d26ee459e" (UID: "6fbc771d-6009-4ab3-85e4-963d26ee459e"). InnerVolumeSpecName "kube-api-access-h55hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.182584 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlhdf" event={"ID":"6fbc771d-6009-4ab3-85e4-963d26ee459e","Type":"ContainerDied","Data":"70bf1343010d907b841c74313f74ba7482501ec61ca732acae178511e650d02a"} Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.182681 4971 scope.go:117] "RemoveContainer" containerID="f76645e222a80d7d8b0d91650d756744d53a0d743e4852b48a11b700a398b325" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.182813 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-25c82" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerName="registry-server" containerID="cri-o://1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5" gracePeriod=2 Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.182931 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zlhdf" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.211208 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fbc771d-6009-4ab3-85e4-963d26ee459e" (UID: "6fbc771d-6009-4ab3-85e4-963d26ee459e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.227047 4971 scope.go:117] "RemoveContainer" containerID="ce2d0c3726e49ca28bc5f094197cb05090df4fbb83ab88fd1eb8077a76316a38" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.257239 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.257452 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h55hg\" (UniqueName: \"kubernetes.io/projected/6fbc771d-6009-4ab3-85e4-963d26ee459e-kube-api-access-h55hg\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.257527 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbc771d-6009-4ab3-85e4-963d26ee459e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.315527 4971 scope.go:117] "RemoveContainer" containerID="d0fb8f8dc6b00dd48db63775457875faf74b664eb9122a468589f8670723c60a" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.515169 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zlhdf"] Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.517857 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zlhdf"] Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.592992 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d77494bff-976qk"] Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.593244 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" 
podUID="8831fb9e-bc6a-4d43-a537-8152456ede10" containerName="controller-manager" containerID="cri-o://b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20" gracePeriod=30 Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.622118 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg"] Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.622422 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" podUID="fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" containerName="route-controller-manager" containerID="cri-o://2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a" gracePeriod=30 Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.746820 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" path="/var/lib/kubelet/pods/6fbc771d-6009-4ab3-85e4-963d26ee459e/volumes" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.802377 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25c82" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.866390 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtdcz\" (UniqueName: \"kubernetes.io/projected/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-kube-api-access-qtdcz\") pod \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.867034 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-utilities\") pod \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.867090 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-catalog-content\") pod \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\" (UID: \"4c6b8af5-8d8d-4643-8815-9ed0526f04e1\") " Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.867978 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-utilities" (OuterVolumeSpecName: "utilities") pod "4c6b8af5-8d8d-4643-8815-9ed0526f04e1" (UID: "4c6b8af5-8d8d-4643-8815-9ed0526f04e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.873544 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-kube-api-access-qtdcz" (OuterVolumeSpecName: "kube-api-access-qtdcz") pod "4c6b8af5-8d8d-4643-8815-9ed0526f04e1" (UID: "4c6b8af5-8d8d-4643-8815-9ed0526f04e1"). InnerVolumeSpecName "kube-api-access-qtdcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.921469 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c6b8af5-8d8d-4643-8815-9ed0526f04e1" (UID: "4c6b8af5-8d8d-4643-8815-9ed0526f04e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.968170 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.968437 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtdcz\" (UniqueName: \"kubernetes.io/projected/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-kube-api-access-qtdcz\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:24 crc kubenswrapper[4971]: I0320 06:54:24.968505 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c6b8af5-8d8d-4643-8815-9ed0526f04e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.185532 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.192058 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerID="df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb" exitCode=0 Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.192098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l5fd" event={"ID":"7cb5ccdd-a645-458a-9961-461bd99c3c79","Type":"ContainerDied","Data":"df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.200151 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.200738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4w7kq" event={"ID":"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4","Type":"ContainerStarted","Data":"67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.205279 4971 generic.go:334] "Generic (PLEG): container finished" podID="8831fb9e-bc6a-4d43-a537-8152456ede10" containerID="b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20" exitCode=0 Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.205333 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.205396 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" event={"ID":"8831fb9e-bc6a-4d43-a537-8152456ede10","Type":"ContainerDied","Data":"b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.205487 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" event={"ID":"8831fb9e-bc6a-4d43-a537-8152456ede10","Type":"ContainerDied","Data":"45367a70b8ef78b0112a04ecfa85f2d3f29b5c0e056801539b66328f4edd6e42"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.205513 4971 scope.go:117] "RemoveContainer" containerID="b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.222444 4971 generic.go:334] "Generic (PLEG): container finished" podID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerID="1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5" exitCode=0 Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.222511 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25c82" event={"ID":"4c6b8af5-8d8d-4643-8815-9ed0526f04e1","Type":"ContainerDied","Data":"1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.222571 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25c82" event={"ID":"4c6b8af5-8d8d-4643-8815-9ed0526f04e1","Type":"ContainerDied","Data":"3dcf09b991cbcb066f4cab64c3ff54e62184c5d5945513c3f0ecc1183e24dad9"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.223353 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25c82" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.225917 4971 generic.go:334] "Generic (PLEG): container finished" podID="fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" containerID="2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a" exitCode=0 Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.226026 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" event={"ID":"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb","Type":"ContainerDied","Data":"2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.226058 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" event={"ID":"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb","Type":"ContainerDied","Data":"2e8dcec68cfc5ea085689ae7c10781de0259c9ed49726307811a63f899bada8e"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.226135 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.228857 4971 generic.go:334] "Generic (PLEG): container finished" podID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerID="5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70" exitCode=0 Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.228910 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b8lj" event={"ID":"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75","Type":"ContainerDied","Data":"5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.241926 4971 scope.go:117] "RemoveContainer" containerID="b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.246101 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20\": container with ID starting with b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20 not found: ID does not exist" containerID="b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.246143 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20"} err="failed to get container status \"b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20\": rpc error: code = NotFound desc = could not find container \"b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20\": container with ID starting with b409f46d1c125512f0ddb04c9ae93636bf7e5f9a42a93cb26fbd2d0bf1f3dc20 not found: ID does not exist" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.246781 4971 
scope.go:117] "RemoveContainer" containerID="1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.254649 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9pn" event={"ID":"e7552f5a-9823-42db-9766-437726d2c7bb","Type":"ContainerStarted","Data":"987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179"} Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.273920 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-client-ca\") pod \"8831fb9e-bc6a-4d43-a537-8152456ede10\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.274026 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-config\") pod \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.274143 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-client-ca\") pod \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.274230 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-serving-cert\") pod \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.274271 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzc9n\" (UniqueName: 
\"kubernetes.io/projected/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-kube-api-access-xzc9n\") pod \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\" (UID: \"fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.274306 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-986qn\" (UniqueName: \"kubernetes.io/projected/8831fb9e-bc6a-4d43-a537-8152456ede10-kube-api-access-986qn\") pod \"8831fb9e-bc6a-4d43-a537-8152456ede10\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.274370 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-proxy-ca-bundles\") pod \"8831fb9e-bc6a-4d43-a537-8152456ede10\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.274437 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8831fb9e-bc6a-4d43-a537-8152456ede10-serving-cert\") pod \"8831fb9e-bc6a-4d43-a537-8152456ede10\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.274469 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-config\") pod \"8831fb9e-bc6a-4d43-a537-8152456ede10\" (UID: \"8831fb9e-bc6a-4d43-a537-8152456ede10\") " Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.276071 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8831fb9e-bc6a-4d43-a537-8152456ede10" (UID: "8831fb9e-bc6a-4d43-a537-8152456ede10"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.278520 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-config" (OuterVolumeSpecName: "config") pod "8831fb9e-bc6a-4d43-a537-8152456ede10" (UID: "8831fb9e-bc6a-4d43-a537-8152456ede10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.279791 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-client-ca" (OuterVolumeSpecName: "client-ca") pod "8831fb9e-bc6a-4d43-a537-8152456ede10" (UID: "8831fb9e-bc6a-4d43-a537-8152456ede10"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.279864 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-config" (OuterVolumeSpecName: "config") pod "fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" (UID: "fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.279928 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8831fb9e-bc6a-4d43-a537-8152456ede10-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8831fb9e-bc6a-4d43-a537-8152456ede10" (UID: "8831fb9e-bc6a-4d43-a537-8152456ede10"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.280063 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" (UID: "fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.280293 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-client-ca" (OuterVolumeSpecName: "client-ca") pod "fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" (UID: "fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.289551 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-kube-api-access-xzc9n" (OuterVolumeSpecName: "kube-api-access-xzc9n") pod "fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" (UID: "fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb"). InnerVolumeSpecName "kube-api-access-xzc9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.291012 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8831fb9e-bc6a-4d43-a537-8152456ede10-kube-api-access-986qn" (OuterVolumeSpecName: "kube-api-access-986qn") pod "8831fb9e-bc6a-4d43-a537-8152456ede10" (UID: "8831fb9e-bc6a-4d43-a537-8152456ede10"). InnerVolumeSpecName "kube-api-access-986qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375771 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8831fb9e-bc6a-4d43-a537-8152456ede10-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375815 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375826 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375837 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375847 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375856 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375866 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzc9n\" (UniqueName: \"kubernetes.io/projected/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb-kube-api-access-xzc9n\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375878 4971 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-986qn\" (UniqueName: \"kubernetes.io/projected/8831fb9e-bc6a-4d43-a537-8152456ede10-kube-api-access-986qn\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.375886 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8831fb9e-bc6a-4d43-a537-8152456ede10-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.394283 4971 scope.go:117] "RemoveContainer" containerID="24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.394559 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25c82"] Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.398201 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-25c82"] Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.431668 4971 scope.go:117] "RemoveContainer" containerID="49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.450476 4971 scope.go:117] "RemoveContainer" containerID="1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.451004 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5\": container with ID starting with 1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5 not found: ID does not exist" containerID="1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.451050 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5"} err="failed to get container status \"1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5\": rpc error: code = NotFound desc = could not find container \"1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5\": container with ID starting with 1e6cc46fb4b8d4c5e8f6f57d69cf682c7f883002126fe95809ffd103cf57ced5 not found: ID does not exist" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.451082 4971 scope.go:117] "RemoveContainer" containerID="24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.451520 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b\": container with ID starting with 24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b not found: ID does not exist" containerID="24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.451553 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b"} err="failed to get container status \"24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b\": rpc error: code = NotFound desc = could not find container \"24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b\": container with ID starting with 24f79678695c6a971b63003e26339799f771b1bf0f7bbdc536a9daa99a55c45b not found: ID does not exist" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.451572 4971 scope.go:117] "RemoveContainer" containerID="49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.451964 4971 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7\": container with ID starting with 49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7 not found: ID does not exist" containerID="49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.451995 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7"} err="failed to get container status \"49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7\": rpc error: code = NotFound desc = could not find container \"49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7\": container with ID starting with 49311374d35b1bdeab75836ffba59e14d88502c7c663e9d9bf4dc817150f93b7 not found: ID does not exist" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.452016 4971 scope.go:117] "RemoveContainer" containerID="2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.471456 4971 scope.go:117] "RemoveContainer" containerID="2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.472042 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a\": container with ID starting with 2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a not found: ID does not exist" containerID="2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.472090 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a"} 
err="failed to get container status \"2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a\": rpc error: code = NotFound desc = could not find container \"2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a\": container with ID starting with 2da85cca8c663014a13c1c1f6bcfef8a2f51e8a05c2a8a9b023017da6774ed0a not found: ID does not exist" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.537354 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d77494bff-976qk"] Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.541293 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d77494bff-976qk"] Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.561534 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg"] Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.563870 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79f687cf94-6pkxg"] Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.940190 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh"] Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941096 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerName="extract-content" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941123 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerName="extract-content" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941137 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerName="registry-server" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 
06:54:25.941148 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerName="registry-server" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941163 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38760c08-459a-4637-92a1-a1c2701532a9" containerName="oc" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941170 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="38760c08-459a-4637-92a1-a1c2701532a9" containerName="oc" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941185 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" containerName="route-controller-manager" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941193 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" containerName="route-controller-manager" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941203 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerName="extract-utilities" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941211 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerName="extract-utilities" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941236 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerName="extract-utilities" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941247 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerName="extract-utilities" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941262 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8831fb9e-bc6a-4d43-a537-8152456ede10" containerName="controller-manager" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 
06:54:25.941271 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8831fb9e-bc6a-4d43-a537-8152456ede10" containerName="controller-manager" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941285 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerName="registry-server" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941295 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerName="registry-server" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941308 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerName="extract-content" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941317 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerName="extract-content" Mar 20 06:54:25 crc kubenswrapper[4971]: E0320 06:54:25.941329 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e766c0e7-ca96-4f32-9e14-81a3d5bbd389" containerName="oc" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941337 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e766c0e7-ca96-4f32-9e14-81a3d5bbd389" containerName="oc" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941468 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8831fb9e-bc6a-4d43-a537-8152456ede10" containerName="controller-manager" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941485 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fbc771d-6009-4ab3-85e4-963d26ee459e" containerName="registry-server" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941497 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e766c0e7-ca96-4f32-9e14-81a3d5bbd389" containerName="oc" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941509 4971 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" containerName="route-controller-manager" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941520 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" containerName="registry-server" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.941534 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="38760c08-459a-4637-92a1-a1c2701532a9" containerName="oc" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.942147 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.947181 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.947468 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.947670 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.948158 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.948513 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.948621 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.961452 4971 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4"] Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.962641 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.967455 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.967718 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.967903 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.969280 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.976062 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.977901 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.980972 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.981417 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh"] Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984007 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-client-ca\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984088 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-client-ca\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-config\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984161 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-config\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984216 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d7ed75-853c-45b8-90a0-9014096485fc-serving-cert\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 
06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984257 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjsld\" (UniqueName: \"kubernetes.io/projected/8d2c2cc8-d215-4e57-87a2-25a54b287683-kube-api-access-rjsld\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984290 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2c2cc8-d215-4e57-87a2-25a54b287683-serving-cert\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984329 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8kv7\" (UniqueName: \"kubernetes.io/projected/f5d7ed75-853c-45b8-90a0-9014096485fc-kube-api-access-g8kv7\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.984358 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-proxy-ca-bundles\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:25 crc kubenswrapper[4971]: I0320 06:54:25.985232 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4"] Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086117 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-client-ca\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086175 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-config\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086210 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-config\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086240 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d7ed75-853c-45b8-90a0-9014096485fc-serving-cert\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086273 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjsld\" (UniqueName: 
\"kubernetes.io/projected/8d2c2cc8-d215-4e57-87a2-25a54b287683-kube-api-access-rjsld\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086296 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2c2cc8-d215-4e57-87a2-25a54b287683-serving-cert\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086321 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8kv7\" (UniqueName: \"kubernetes.io/projected/f5d7ed75-853c-45b8-90a0-9014096485fc-kube-api-access-g8kv7\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086341 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-proxy-ca-bundles\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.086370 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-client-ca\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc 
kubenswrapper[4971]: I0320 06:54:26.087751 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-client-ca\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.087942 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-client-ca\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.087974 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-config\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.087974 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-config\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.088975 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-proxy-ca-bundles\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " 
pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.091705 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d7ed75-853c-45b8-90a0-9014096485fc-serving-cert\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.104524 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8kv7\" (UniqueName: \"kubernetes.io/projected/f5d7ed75-853c-45b8-90a0-9014096485fc-kube-api-access-g8kv7\") pod \"controller-manager-7f8d9d77fc-fmvt4\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.109146 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2c2cc8-d215-4e57-87a2-25a54b287683-serving-cert\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.109751 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjsld\" (UniqueName: \"kubernetes.io/projected/8d2c2cc8-d215-4e57-87a2-25a54b287683-kube-api-access-rjsld\") pod \"route-controller-manager-6f5ccb7977-z72sh\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.143953 4971 patch_prober.go:28] interesting pod/controller-manager-5d77494bff-976qk container/controller-manager namespace/openshift-controller-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.144064 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d77494bff-976qk" podUID="8831fb9e-bc6a-4d43-a537-8152456ede10" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.268308 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.269129 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b8lj" event={"ID":"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75","Type":"ContainerStarted","Data":"a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531"} Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.272506 4971 generic.go:334] "Generic (PLEG): container finished" podID="e7552f5a-9823-42db-9766-437726d2c7bb" containerID="987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179" exitCode=0 Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.272569 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9pn" event={"ID":"e7552f5a-9823-42db-9766-437726d2c7bb","Type":"ContainerDied","Data":"987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179"} Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.272663 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9pn" 
event={"ID":"e7552f5a-9823-42db-9766-437726d2c7bb","Type":"ContainerStarted","Data":"7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2"} Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.276282 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l5fd" event={"ID":"7cb5ccdd-a645-458a-9961-461bd99c3c79","Type":"ContainerStarted","Data":"f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726"} Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.277996 4971 generic.go:334] "Generic (PLEG): container finished" podID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerID="67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0" exitCode=0 Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.278044 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4w7kq" event={"ID":"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4","Type":"ContainerDied","Data":"67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0"} Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.290817 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.301442 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6b8lj" podStartSLOduration=2.215367922 podStartE2EDuration="55.301419297s" podCreationTimestamp="2026-03-20 06:53:31 +0000 UTC" firstStartedPulling="2026-03-20 06:53:32.773603266 +0000 UTC m=+234.753477404" lastFinishedPulling="2026-03-20 06:54:25.859654631 +0000 UTC m=+287.839528779" observedRunningTime="2026-03-20 06:54:26.29942614 +0000 UTC m=+288.279300278" watchObservedRunningTime="2026-03-20 06:54:26.301419297 +0000 UTC m=+288.281293435" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.336795 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9l5fd" podStartSLOduration=3.29912309 podStartE2EDuration="56.336756083s" podCreationTimestamp="2026-03-20 06:53:30 +0000 UTC" firstStartedPulling="2026-03-20 06:53:32.814594932 +0000 UTC m=+234.794469070" lastFinishedPulling="2026-03-20 06:54:25.852227915 +0000 UTC m=+287.832102063" observedRunningTime="2026-03-20 06:54:26.324918099 +0000 UTC m=+288.304792277" watchObservedRunningTime="2026-03-20 06:54:26.336756083 +0000 UTC m=+288.316630251" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.589716 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zj9pn" podStartSLOduration=4.02622876 podStartE2EDuration="54.589689761s" podCreationTimestamp="2026-03-20 06:53:32 +0000 UTC" firstStartedPulling="2026-03-20 06:53:35.106700622 +0000 UTC m=+237.086574760" lastFinishedPulling="2026-03-20 06:54:25.670161583 +0000 UTC m=+287.650035761" observedRunningTime="2026-03-20 06:54:26.385812726 +0000 UTC m=+288.365686864" watchObservedRunningTime="2026-03-20 06:54:26.589689761 +0000 UTC m=+288.569563899" Mar 20 06:54:26 
crc kubenswrapper[4971]: I0320 06:54:26.594251 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh"] Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.667378 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4"] Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.743595 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6b8af5-8d8d-4643-8815-9ed0526f04e1" path="/var/lib/kubelet/pods/4c6b8af5-8d8d-4643-8815-9ed0526f04e1/volumes" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.744589 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8831fb9e-bc6a-4d43-a537-8152456ede10" path="/var/lib/kubelet/pods/8831fb9e-bc6a-4d43-a537-8152456ede10/volumes" Mar 20 06:54:26 crc kubenswrapper[4971]: I0320 06:54:26.745512 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb" path="/var/lib/kubelet/pods/fdfe551c-e022-4cf6-98ae-fb7d8bbcc5eb/volumes" Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.284975 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" event={"ID":"8d2c2cc8-d215-4e57-87a2-25a54b287683","Type":"ContainerStarted","Data":"19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337"} Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.285563 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.285579 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" 
event={"ID":"8d2c2cc8-d215-4e57-87a2-25a54b287683","Type":"ContainerStarted","Data":"ee207f92da3a18d86c255c548ef055b468e21191766420b5474c7f9b013afd79"} Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.286966 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" event={"ID":"f5d7ed75-853c-45b8-90a0-9014096485fc","Type":"ContainerStarted","Data":"d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a"} Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.287412 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" event={"ID":"f5d7ed75-853c-45b8-90a0-9014096485fc","Type":"ContainerStarted","Data":"bf73865123d063008cf05a59d8d5831eeffb0e9d872583b9f0cebee286c2dab0"} Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.287456 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.289859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4w7kq" event={"ID":"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4","Type":"ContainerStarted","Data":"678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339"} Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.294918 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.340225 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" podStartSLOduration=3.340179975 podStartE2EDuration="3.340179975s" podCreationTimestamp="2026-03-20 06:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 06:54:27.338023022 +0000 UTC m=+289.317897150" watchObservedRunningTime="2026-03-20 06:54:27.340179975 +0000 UTC m=+289.320054113" Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.342931 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" podStartSLOduration=3.342923194 podStartE2EDuration="3.342923194s" podCreationTimestamp="2026-03-20 06:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:27.31209842 +0000 UTC m=+289.291972558" watchObservedRunningTime="2026-03-20 06:54:27.342923194 +0000 UTC m=+289.322797332" Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.366088 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4w7kq" podStartSLOduration=3.583829922 podStartE2EDuration="56.366061795s" podCreationTimestamp="2026-03-20 06:53:31 +0000 UTC" firstStartedPulling="2026-03-20 06:53:33.942086044 +0000 UTC m=+235.921960182" lastFinishedPulling="2026-03-20 06:54:26.724317917 +0000 UTC m=+288.704192055" observedRunningTime="2026-03-20 06:54:27.364804879 +0000 UTC m=+289.344679027" watchObservedRunningTime="2026-03-20 06:54:27.366061795 +0000 UTC m=+289.345935933" Mar 20 06:54:27 crc kubenswrapper[4971]: I0320 06:54:27.376478 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:31 crc kubenswrapper[4971]: I0320 06:54:31.399368 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:54:31 crc kubenswrapper[4971]: I0320 06:54:31.400018 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 
20 06:54:31 crc kubenswrapper[4971]: I0320 06:54:31.456257 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:54:31 crc kubenswrapper[4971]: I0320 06:54:31.779932 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:54:31 crc kubenswrapper[4971]: I0320 06:54:31.780012 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:54:31 crc kubenswrapper[4971]: I0320 06:54:31.823884 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:54:32 crc kubenswrapper[4971]: I0320 06:54:32.384480 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:54:32 crc kubenswrapper[4971]: I0320 06:54:32.387543 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:54:32 crc kubenswrapper[4971]: I0320 06:54:32.387772 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:54:32 crc kubenswrapper[4971]: I0320 06:54:32.407866 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9l5fd" Mar 20 06:54:32 crc kubenswrapper[4971]: I0320 06:54:32.824204 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:54:32 crc kubenswrapper[4971]: I0320 06:54:32.824274 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:54:33 crc kubenswrapper[4971]: I0320 06:54:33.441108 4971 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-4w7kq" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="registry-server" probeResult="failure" output=< Mar 20 06:54:33 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 06:54:33 crc kubenswrapper[4971]: > Mar 20 06:54:33 crc kubenswrapper[4971]: I0320 06:54:33.891424 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zj9pn" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="registry-server" probeResult="failure" output=< Mar 20 06:54:33 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 06:54:33 crc kubenswrapper[4971]: > Mar 20 06:54:34 crc kubenswrapper[4971]: I0320 06:54:34.527345 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b8lj"] Mar 20 06:54:34 crc kubenswrapper[4971]: I0320 06:54:34.527863 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6b8lj" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerName="registry-server" containerID="cri-o://a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531" gracePeriod=2 Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.173842 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.241455 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-utilities\") pod \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.241519 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc78s\" (UniqueName: \"kubernetes.io/projected/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-kube-api-access-xc78s\") pod \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.241563 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-catalog-content\") pod \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\" (UID: \"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75\") " Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.242875 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-utilities" (OuterVolumeSpecName: "utilities") pod "ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" (UID: "ff1d12ba-e1ee-4cbc-8c8e-102269e47f75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.252008 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-kube-api-access-xc78s" (OuterVolumeSpecName: "kube-api-access-xc78s") pod "ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" (UID: "ff1d12ba-e1ee-4cbc-8c8e-102269e47f75"). InnerVolumeSpecName "kube-api-access-xc78s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.275258 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" (UID: "ff1d12ba-e1ee-4cbc-8c8e-102269e47f75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.343417 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.343501 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.343523 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc78s\" (UniqueName: \"kubernetes.io/projected/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75-kube-api-access-xc78s\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.366955 4971 generic.go:334] "Generic (PLEG): container finished" podID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerID="a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531" exitCode=0 Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.367051 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b8lj" event={"ID":"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75","Type":"ContainerDied","Data":"a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531"} Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.367102 4971 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6b8lj" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.367508 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b8lj" event={"ID":"ff1d12ba-e1ee-4cbc-8c8e-102269e47f75","Type":"ContainerDied","Data":"9a761afd163a3cfd18bb0ed2fb9963eb661a290ac2969cb40d461f126f37c2a9"} Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.367665 4971 scope.go:117] "RemoveContainer" containerID="a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.395009 4971 scope.go:117] "RemoveContainer" containerID="5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.414796 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b8lj"] Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.425231 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b8lj"] Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.432474 4971 scope.go:117] "RemoveContainer" containerID="d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.452817 4971 scope.go:117] "RemoveContainer" containerID="a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531" Mar 20 06:54:35 crc kubenswrapper[4971]: E0320 06:54:35.453529 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531\": container with ID starting with a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531 not found: ID does not exist" containerID="a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.453687 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531"} err="failed to get container status \"a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531\": rpc error: code = NotFound desc = could not find container \"a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531\": container with ID starting with a581809e6c3475a5ea1a2b4dcd260a6e716c321614bc6e34e2affca5ebe7e531 not found: ID does not exist" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.453743 4971 scope.go:117] "RemoveContainer" containerID="5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70" Mar 20 06:54:35 crc kubenswrapper[4971]: E0320 06:54:35.454261 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70\": container with ID starting with 5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70 not found: ID does not exist" containerID="5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.454333 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70"} err="failed to get container status \"5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70\": rpc error: code = NotFound desc = could not find container \"5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70\": container with ID starting with 5471658d911356884172533fcb8c330b370001f7ac74afd14baae776290baf70 not found: ID does not exist" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.454522 4971 scope.go:117] "RemoveContainer" containerID="d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007" Mar 20 06:54:35 crc kubenswrapper[4971]: E0320 
06:54:35.455064 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007\": container with ID starting with d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007 not found: ID does not exist" containerID="d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.455136 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007"} err="failed to get container status \"d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007\": rpc error: code = NotFound desc = could not find container \"d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007\": container with ID starting with d882d6de253a984953fe71e9bece051759942151aa2ecac6cc70f67eea4fa007 not found: ID does not exist" Mar 20 06:54:35 crc kubenswrapper[4971]: I0320 06:54:35.805763 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" podUID="fe99e221-99e5-49c1-9cec-875a54851847" containerName="oauth-openshift" containerID="cri-o://891259158e431359136fb2ac4966aa8329dbf23f371f0fa70390e636d2448b32" gracePeriod=15 Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.378451 4971 generic.go:334] "Generic (PLEG): container finished" podID="fe99e221-99e5-49c1-9cec-875a54851847" containerID="891259158e431359136fb2ac4966aa8329dbf23f371f0fa70390e636d2448b32" exitCode=0 Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.378592 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" event={"ID":"fe99e221-99e5-49c1-9cec-875a54851847","Type":"ContainerDied","Data":"891259158e431359136fb2ac4966aa8329dbf23f371f0fa70390e636d2448b32"} Mar 20 06:54:36 crc 
kubenswrapper[4971]: I0320 06:54:36.503032 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565003 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-provider-selection\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565142 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-idp-0-file-data\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565196 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-session\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565241 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-router-certs\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565285 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-cliconfig\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565326 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-login\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565368 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-ocp-branding-template\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565411 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe99e221-99e5-49c1-9cec-875a54851847-audit-dir\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565460 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-trusted-ca-bundle\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565505 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-serving-cert\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565561 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-service-ca\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565641 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-error\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565683 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-audit-policies\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.565731 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnzn7\" (UniqueName: \"kubernetes.io/projected/fe99e221-99e5-49c1-9cec-875a54851847-kube-api-access-tnzn7\") pod \"fe99e221-99e5-49c1-9cec-875a54851847\" (UID: \"fe99e221-99e5-49c1-9cec-875a54851847\") " Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.567206 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe99e221-99e5-49c1-9cec-875a54851847-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: 
"fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.571921 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.573588 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe99e221-99e5-49c1-9cec-875a54851847-kube-api-access-tnzn7" (OuterVolumeSpecName: "kube-api-access-tnzn7") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "kube-api-access-tnzn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.573937 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.575005 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.575863 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.576101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.576126 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.576201 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.576907 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.579034 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.581314 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.582323 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.586984 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fe99e221-99e5-49c1-9cec-875a54851847" (UID: "fe99e221-99e5-49c1-9cec-875a54851847"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667370 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667437 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667456 4971 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667480 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnzn7\" (UniqueName: \"kubernetes.io/projected/fe99e221-99e5-49c1-9cec-875a54851847-kube-api-access-tnzn7\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667501 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-provider-selection\") on node \"crc\" 
DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667517 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667530 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667543 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667559 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667572 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667586 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667600 4971 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/fe99e221-99e5-49c1-9cec-875a54851847-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667636 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.667650 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe99e221-99e5-49c1-9cec-875a54851847-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.746293 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" path="/var/lib/kubelet/pods/ff1d12ba-e1ee-4cbc-8c8e-102269e47f75/volumes" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.949873 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg"] Mar 20 06:54:36 crc kubenswrapper[4971]: E0320 06:54:36.950480 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe99e221-99e5-49c1-9cec-875a54851847" containerName="oauth-openshift" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.950507 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe99e221-99e5-49c1-9cec-875a54851847" containerName="oauth-openshift" Mar 20 06:54:36 crc kubenswrapper[4971]: E0320 06:54:36.950533 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerName="extract-utilities" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.950546 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerName="extract-utilities" Mar 20 06:54:36 crc kubenswrapper[4971]: E0320 
06:54:36.950574 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerName="extract-content" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.950593 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerName="extract-content" Mar 20 06:54:36 crc kubenswrapper[4971]: E0320 06:54:36.950667 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerName="registry-server" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.950688 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerName="registry-server" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.950875 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe99e221-99e5-49c1-9cec-875a54851847" containerName="oauth-openshift" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.950910 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1d12ba-e1ee-4cbc-8c8e-102269e47f75" containerName="registry-server" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.951592 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.971173 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg"] Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.973178 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.973286 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.973447 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.973535 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: 
\"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.973696 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.973765 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.973817 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.974171 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc 
kubenswrapper[4971]: I0320 06:54:36.974329 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/056aa70c-b1ee-4e72-a758-3a74e96df388-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.974527 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.975035 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.975149 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44c2n\" (UniqueName: \"kubernetes.io/projected/056aa70c-b1ee-4e72-a758-3a74e96df388-kube-api-access-44c2n\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.975349 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:36 crc kubenswrapper[4971]: I0320 06:54:36.975671 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077392 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077479 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077538 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " 
pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077594 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077679 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077731 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077781 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077822 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077857 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077894 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/056aa70c-b1ee-4e72-a758-3a74e96df388-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077950 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.077996 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " 
pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.078032 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44c2n\" (UniqueName: \"kubernetes.io/projected/056aa70c-b1ee-4e72-a758-3a74e96df388-kube-api-access-44c2n\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.078094 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.078965 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/056aa70c-b1ee-4e72-a758-3a74e96df388-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.080297 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.080472 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.080896 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.082883 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.087007 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.087092 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 
20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.087565 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.088076 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.089257 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.090123 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.090496 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.098426 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/056aa70c-b1ee-4e72-a758-3a74e96df388-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.114241 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44c2n\" (UniqueName: \"kubernetes.io/projected/056aa70c-b1ee-4e72-a758-3a74e96df388-kube-api-access-44c2n\") pod \"oauth-openshift-6f8f59f8d9-x56bg\" (UID: \"056aa70c-b1ee-4e72-a758-3a74e96df388\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.296441 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.394247 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" event={"ID":"fe99e221-99e5-49c1-9cec-875a54851847","Type":"ContainerDied","Data":"6f57b27b633986b947007cf31750d1c580be9e9e6cf4069c71b132de2b00e71c"} Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.394346 4971 scope.go:117] "RemoveContainer" containerID="891259158e431359136fb2ac4966aa8329dbf23f371f0fa70390e636d2448b32" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.394398 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-g2c5c" Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.422260 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-g2c5c"] Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.442717 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-g2c5c"] Mar 20 06:54:37 crc kubenswrapper[4971]: I0320 06:54:37.806172 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg"] Mar 20 06:54:38 crc kubenswrapper[4971]: I0320 06:54:38.415632 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" event={"ID":"056aa70c-b1ee-4e72-a758-3a74e96df388","Type":"ContainerStarted","Data":"e97e4f238c07b437ec919389a38d063e2f246e431ec50c3a8a4a3f95eefa8c5e"} Mar 20 06:54:38 crc kubenswrapper[4971]: I0320 06:54:38.416421 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" event={"ID":"056aa70c-b1ee-4e72-a758-3a74e96df388","Type":"ContainerStarted","Data":"d44972600575d16fad15146bbbf815fa1f7daf6daca43f684fa805baa2f7b18f"} Mar 20 06:54:38 crc kubenswrapper[4971]: I0320 06:54:38.739773 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe99e221-99e5-49c1-9cec-875a54851847" path="/var/lib/kubelet/pods/fe99e221-99e5-49c1-9cec-875a54851847/volumes" Mar 20 06:54:39 crc kubenswrapper[4971]: I0320 06:54:39.426671 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:39 crc kubenswrapper[4971]: I0320 06:54:39.435044 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" Mar 20 06:54:39 crc kubenswrapper[4971]: I0320 
06:54:39.458904 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-x56bg" podStartSLOduration=29.458870095 podStartE2EDuration="29.458870095s" podCreationTimestamp="2026-03-20 06:54:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:39.45663002 +0000 UTC m=+301.436504238" watchObservedRunningTime="2026-03-20 06:54:39.458870095 +0000 UTC m=+301.438744273" Mar 20 06:54:42 crc kubenswrapper[4971]: I0320 06:54:42.457911 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:54:42 crc kubenswrapper[4971]: I0320 06:54:42.509087 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4w7kq" Mar 20 06:54:42 crc kubenswrapper[4971]: I0320 06:54:42.909477 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:54:42 crc kubenswrapper[4971]: I0320 06:54:42.963176 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:54:43 crc kubenswrapper[4971]: I0320 06:54:43.874783 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zj9pn"] Mar 20 06:54:44 crc kubenswrapper[4971]: I0320 06:54:44.458795 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zj9pn" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="registry-server" containerID="cri-o://7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2" gracePeriod=2 Mar 20 06:54:44 crc kubenswrapper[4971]: I0320 06:54:44.619068 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4"] Mar 20 06:54:44 crc kubenswrapper[4971]: I0320 06:54:44.619475 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" podUID="f5d7ed75-853c-45b8-90a0-9014096485fc" containerName="controller-manager" containerID="cri-o://d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a" gracePeriod=30 Mar 20 06:54:44 crc kubenswrapper[4971]: I0320 06:54:44.694586 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh"] Mar 20 06:54:44 crc kubenswrapper[4971]: I0320 06:54:44.694930 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" podUID="8d2c2cc8-d215-4e57-87a2-25a54b287683" containerName="route-controller-manager" containerID="cri-o://19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337" gracePeriod=30 Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.127030 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.166814 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.224992 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.280106 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-client-ca\") pod \"8d2c2cc8-d215-4e57-87a2-25a54b287683\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.280182 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2c2cc8-d215-4e57-87a2-25a54b287683-serving-cert\") pod \"8d2c2cc8-d215-4e57-87a2-25a54b287683\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.280271 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjsld\" (UniqueName: \"kubernetes.io/projected/8d2c2cc8-d215-4e57-87a2-25a54b287683-kube-api-access-rjsld\") pod \"8d2c2cc8-d215-4e57-87a2-25a54b287683\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.280342 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-utilities\") pod \"e7552f5a-9823-42db-9766-437726d2c7bb\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.280372 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbsc7\" (UniqueName: \"kubernetes.io/projected/e7552f5a-9823-42db-9766-437726d2c7bb-kube-api-access-jbsc7\") pod \"e7552f5a-9823-42db-9766-437726d2c7bb\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.280432 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-catalog-content\") pod \"e7552f5a-9823-42db-9766-437726d2c7bb\" (UID: \"e7552f5a-9823-42db-9766-437726d2c7bb\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.280525 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-config\") pod \"8d2c2cc8-d215-4e57-87a2-25a54b287683\" (UID: \"8d2c2cc8-d215-4e57-87a2-25a54b287683\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.281186 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-config" (OuterVolumeSpecName: "config") pod "8d2c2cc8-d215-4e57-87a2-25a54b287683" (UID: "8d2c2cc8-d215-4e57-87a2-25a54b287683"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.281427 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d7ed75-853c-45b8-90a0-9014096485fc-serving-cert\") pod \"f5d7ed75-853c-45b8-90a0-9014096485fc\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.281492 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-config\") pod \"f5d7ed75-853c-45b8-90a0-9014096485fc\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.281691 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-client-ca" (OuterVolumeSpecName: "client-ca") pod "8d2c2cc8-d215-4e57-87a2-25a54b287683" (UID: 
"8d2c2cc8-d215-4e57-87a2-25a54b287683"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.281987 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.282014 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d2c2cc8-d215-4e57-87a2-25a54b287683-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.282083 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-utilities" (OuterVolumeSpecName: "utilities") pod "e7552f5a-9823-42db-9766-437726d2c7bb" (UID: "e7552f5a-9823-42db-9766-437726d2c7bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.282889 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-config" (OuterVolumeSpecName: "config") pod "f5d7ed75-853c-45b8-90a0-9014096485fc" (UID: "f5d7ed75-853c-45b8-90a0-9014096485fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.287454 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d7ed75-853c-45b8-90a0-9014096485fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f5d7ed75-853c-45b8-90a0-9014096485fc" (UID: "f5d7ed75-853c-45b8-90a0-9014096485fc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.287532 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d2c2cc8-d215-4e57-87a2-25a54b287683-kube-api-access-rjsld" (OuterVolumeSpecName: "kube-api-access-rjsld") pod "8d2c2cc8-d215-4e57-87a2-25a54b287683" (UID: "8d2c2cc8-d215-4e57-87a2-25a54b287683"). InnerVolumeSpecName "kube-api-access-rjsld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.287825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d2c2cc8-d215-4e57-87a2-25a54b287683-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8d2c2cc8-d215-4e57-87a2-25a54b287683" (UID: "8d2c2cc8-d215-4e57-87a2-25a54b287683"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.288058 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7552f5a-9823-42db-9766-437726d2c7bb-kube-api-access-jbsc7" (OuterVolumeSpecName: "kube-api-access-jbsc7") pod "e7552f5a-9823-42db-9766-437726d2c7bb" (UID: "e7552f5a-9823-42db-9766-437726d2c7bb"). InnerVolumeSpecName "kube-api-access-jbsc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.383396 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-client-ca\") pod \"f5d7ed75-853c-45b8-90a0-9014096485fc\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.383511 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-proxy-ca-bundles\") pod \"f5d7ed75-853c-45b8-90a0-9014096485fc\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.383567 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8kv7\" (UniqueName: \"kubernetes.io/projected/f5d7ed75-853c-45b8-90a0-9014096485fc-kube-api-access-g8kv7\") pod \"f5d7ed75-853c-45b8-90a0-9014096485fc\" (UID: \"f5d7ed75-853c-45b8-90a0-9014096485fc\") " Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.384023 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2c2cc8-d215-4e57-87a2-25a54b287683-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.384047 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjsld\" (UniqueName: \"kubernetes.io/projected/8d2c2cc8-d215-4e57-87a2-25a54b287683-kube-api-access-rjsld\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.384068 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.384086 4971 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbsc7\" (UniqueName: \"kubernetes.io/projected/e7552f5a-9823-42db-9766-437726d2c7bb-kube-api-access-jbsc7\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.384107 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d7ed75-853c-45b8-90a0-9014096485fc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.384123 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-config\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.384965 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "f5d7ed75-853c-45b8-90a0-9014096485fc" (UID: "f5d7ed75-853c-45b8-90a0-9014096485fc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.385359 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f5d7ed75-853c-45b8-90a0-9014096485fc" (UID: "f5d7ed75-853c-45b8-90a0-9014096485fc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.388432 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d7ed75-853c-45b8-90a0-9014096485fc-kube-api-access-g8kv7" (OuterVolumeSpecName: "kube-api-access-g8kv7") pod "f5d7ed75-853c-45b8-90a0-9014096485fc" (UID: "f5d7ed75-853c-45b8-90a0-9014096485fc"). 
InnerVolumeSpecName "kube-api-access-g8kv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.419018 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7552f5a-9823-42db-9766-437726d2c7bb" (UID: "e7552f5a-9823-42db-9766-437726d2c7bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.468580 4971 generic.go:334] "Generic (PLEG): container finished" podID="8d2c2cc8-d215-4e57-87a2-25a54b287683" containerID="19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337" exitCode=0 Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.468702 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.468768 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" event={"ID":"8d2c2cc8-d215-4e57-87a2-25a54b287683","Type":"ContainerDied","Data":"19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337"} Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.468800 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh" event={"ID":"8d2c2cc8-d215-4e57-87a2-25a54b287683","Type":"ContainerDied","Data":"ee207f92da3a18d86c255c548ef055b468e21191766420b5474c7f9b013afd79"} Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.468821 4971 scope.go:117] "RemoveContainer" containerID="19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.472214 4971 generic.go:334] 
"Generic (PLEG): container finished" podID="f5d7ed75-853c-45b8-90a0-9014096485fc" containerID="d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a" exitCode=0 Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.472283 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.472321 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" event={"ID":"f5d7ed75-853c-45b8-90a0-9014096485fc","Type":"ContainerDied","Data":"d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a"} Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.472358 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4" event={"ID":"f5d7ed75-853c-45b8-90a0-9014096485fc","Type":"ContainerDied","Data":"bf73865123d063008cf05a59d8d5831eeffb0e9d872583b9f0cebee286c2dab0"} Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.477653 4971 generic.go:334] "Generic (PLEG): container finished" podID="e7552f5a-9823-42db-9766-437726d2c7bb" containerID="7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2" exitCode=0 Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.477709 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9pn" event={"ID":"e7552f5a-9823-42db-9766-437726d2c7bb","Type":"ContainerDied","Data":"7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2"} Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.477744 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj9pn" event={"ID":"e7552f5a-9823-42db-9766-437726d2c7bb","Type":"ContainerDied","Data":"b8df97fadf6939dc0b6e27367092b497734a8a287ab307ecdedcdecb59ce6009"} Mar 20 06:54:45 crc kubenswrapper[4971]: 
I0320 06:54:45.477775 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zj9pn" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.484672 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8kv7\" (UniqueName: \"kubernetes.io/projected/f5d7ed75-853c-45b8-90a0-9014096485fc-kube-api-access-g8kv7\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.484692 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7552f5a-9823-42db-9766-437726d2c7bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.484703 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.484713 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d7ed75-853c-45b8-90a0-9014096485fc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.503908 4971 scope.go:117] "RemoveContainer" containerID="19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337" Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.504797 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337\": container with ID starting with 19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337 not found: ID does not exist" containerID="19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.504859 4971 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337"} err="failed to get container status \"19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337\": rpc error: code = NotFound desc = could not find container \"19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337\": container with ID starting with 19c2d1d56464a1ae28eed34fdd13ec0bcc7be61b4ffce3d10d51df1e67478337 not found: ID does not exist" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.504902 4971 scope.go:117] "RemoveContainer" containerID="d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.514744 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4"] Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.520748 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f8d9d77fc-fmvt4"] Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.540256 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh"] Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.544816 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5ccb7977-z72sh"] Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.547241 4971 scope.go:117] "RemoveContainer" containerID="d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.549790 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zj9pn"] Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.550064 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a\": container with ID starting with d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a not found: ID does not exist" containerID="d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.550125 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a"} err="failed to get container status \"d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a\": rpc error: code = NotFound desc = could not find container \"d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a\": container with ID starting with d931fa95c2306be9a7481525d9b45793052c290d506fbc6bd74e876a81472c4a not found: ID does not exist" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.550170 4971 scope.go:117] "RemoveContainer" containerID="7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.553100 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zj9pn"] Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.571477 4971 scope.go:117] "RemoveContainer" containerID="987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.593728 4971 scope.go:117] "RemoveContainer" containerID="2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.612779 4971 scope.go:117] "RemoveContainer" containerID="7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2" Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.613803 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2\": container with ID starting with 7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2 not found: ID does not exist" containerID="7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.613868 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2"} err="failed to get container status \"7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2\": rpc error: code = NotFound desc = could not find container \"7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2\": container with ID starting with 7bcef32e41d44aa792cda70334af0a1134d39c1ab91fba1d6e3746814e4138d2 not found: ID does not exist" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.613916 4971 scope.go:117] "RemoveContainer" containerID="987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179" Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.614594 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179\": container with ID starting with 987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179 not found: ID does not exist" containerID="987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.614677 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179"} err="failed to get container status \"987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179\": rpc error: code = NotFound desc = could not find container \"987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179\": container with ID 
starting with 987ce30fcb06679d0ad90be451f483673e114ac854337cc22fef574472572179 not found: ID does not exist" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.614706 4971 scope.go:117] "RemoveContainer" containerID="2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3" Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.615047 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3\": container with ID starting with 2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3 not found: ID does not exist" containerID="2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.615099 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3"} err="failed to get container status \"2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3\": rpc error: code = NotFound desc = could not find container \"2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3\": container with ID starting with 2399b40800727f8e450d8c07e7d13fdfc80b6fbe3fe5b3842553eb5981d612f3 not found: ID does not exist" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.956453 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp"] Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.956947 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="extract-content" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.956982 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="extract-content" Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 
06:54:45.957025 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="extract-utilities" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.957044 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="extract-utilities" Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.957063 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d2c2cc8-d215-4e57-87a2-25a54b287683" containerName="route-controller-manager" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.957077 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d2c2cc8-d215-4e57-87a2-25a54b287683" containerName="route-controller-manager" Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.957096 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="registry-server" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.957110 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="registry-server" Mar 20 06:54:45 crc kubenswrapper[4971]: E0320 06:54:45.957136 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d7ed75-853c-45b8-90a0-9014096485fc" containerName="controller-manager" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.957149 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d7ed75-853c-45b8-90a0-9014096485fc" containerName="controller-manager" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.959727 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" containerName="registry-server" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.959762 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d7ed75-853c-45b8-90a0-9014096485fc" containerName="controller-manager" Mar 20 06:54:45 crc 
kubenswrapper[4971]: I0320 06:54:45.959792 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d2c2cc8-d215-4e57-87a2-25a54b287683" containerName="route-controller-manager" Mar 20 06:54:45 crc kubenswrapper[4971]: I0320 06:54:45.961593 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.005907 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.008950 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.009166 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.009194 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b96847c47-hl7gr"] Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.009678 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.009974 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.010038 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.011578 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.017039 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.017417 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.018003 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp"] Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.018346 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.018878 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.019090 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.019508 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.024620 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b96847c47-hl7gr"] Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.026082 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.108341 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a9c203b8-cf18-419e-ac66-05763da50b36-config\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.108428 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9c203b8-cf18-419e-ac66-05763da50b36-client-ca\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.108456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5fm\" (UniqueName: \"kubernetes.io/projected/a9c203b8-cf18-419e-ac66-05763da50b36-kube-api-access-qv5fm\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.108599 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c203b8-cf18-419e-ac66-05763da50b36-serving-cert\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.209767 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c203b8-cf18-419e-ac66-05763da50b36-config\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.209838 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-config\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.209871 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwk5n\" (UniqueName: \"kubernetes.io/projected/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-kube-api-access-hwk5n\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.209908 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-serving-cert\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.209942 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9c203b8-cf18-419e-ac66-05763da50b36-client-ca\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.209966 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5fm\" 
(UniqueName: \"kubernetes.io/projected/a9c203b8-cf18-419e-ac66-05763da50b36-kube-api-access-qv5fm\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.209986 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c203b8-cf18-419e-ac66-05763da50b36-serving-cert\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.210014 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-client-ca\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.210048 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-proxy-ca-bundles\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.211518 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c203b8-cf18-419e-ac66-05763da50b36-config\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.211518 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9c203b8-cf18-419e-ac66-05763da50b36-client-ca\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.220855 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c203b8-cf18-419e-ac66-05763da50b36-serving-cert\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.243460 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5fm\" (UniqueName: \"kubernetes.io/projected/a9c203b8-cf18-419e-ac66-05763da50b36-kube-api-access-qv5fm\") pod \"route-controller-manager-7f6dbbfd8d-zs6fp\" (UID: \"a9c203b8-cf18-419e-ac66-05763da50b36\") " pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.311754 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-config\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.311818 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk5n\" (UniqueName: 
\"kubernetes.io/projected/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-kube-api-access-hwk5n\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.311851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-serving-cert\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.311897 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-client-ca\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.311930 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-proxy-ca-bundles\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.313751 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-proxy-ca-bundles\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.314370 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-client-ca\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.314799 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-config\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.321372 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-serving-cert\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.341431 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.346801 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwk5n\" (UniqueName: \"kubernetes.io/projected/8d2f83c4-79fa-499e-ba04-6a89eb94f7d9-kube-api-access-hwk5n\") pod \"controller-manager-7b96847c47-hl7gr\" (UID: \"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9\") " pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.353014 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.698243 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp"] Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.751211 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d2c2cc8-d215-4e57-87a2-25a54b287683" path="/var/lib/kubelet/pods/8d2c2cc8-d215-4e57-87a2-25a54b287683/volumes" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.752206 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7552f5a-9823-42db-9766-437726d2c7bb" path="/var/lib/kubelet/pods/e7552f5a-9823-42db-9766-437726d2c7bb/volumes" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.753764 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d7ed75-853c-45b8-90a0-9014096485fc" path="/var/lib/kubelet/pods/f5d7ed75-853c-45b8-90a0-9014096485fc/volumes" Mar 20 06:54:46 crc kubenswrapper[4971]: I0320 06:54:46.853825 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b96847c47-hl7gr"] Mar 20 06:54:46 crc kubenswrapper[4971]: W0320 06:54:46.857969 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d2f83c4_79fa_499e_ba04_6a89eb94f7d9.slice/crio-9ab64eadd06c360a2711eb5828eb30b48df51c28bf9436b178f4008ed6e7be61 WatchSource:0}: Error finding container 9ab64eadd06c360a2711eb5828eb30b48df51c28bf9436b178f4008ed6e7be61: Status 404 returned error can't find the container with id 9ab64eadd06c360a2711eb5828eb30b48df51c28bf9436b178f4008ed6e7be61 Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.538429 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" 
event={"ID":"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9","Type":"ContainerStarted","Data":"5932c2ee957f98bf10d469489ac7d2b939eca53bc3de519bcc053a5e4aac5b0d"} Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.538844 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" event={"ID":"8d2f83c4-79fa-499e-ba04-6a89eb94f7d9","Type":"ContainerStarted","Data":"9ab64eadd06c360a2711eb5828eb30b48df51c28bf9436b178f4008ed6e7be61"} Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.539651 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.540404 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" event={"ID":"a9c203b8-cf18-419e-ac66-05763da50b36","Type":"ContainerStarted","Data":"a32ce0f1c475532ac4bce28b3847bd1466e5b4d39a625eb30a1f94a30148e596"} Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.540449 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" event={"ID":"a9c203b8-cf18-419e-ac66-05763da50b36","Type":"ContainerStarted","Data":"b66068cb29298d329466e5d4dbdfb43f522e3ab4dfed64c947c7eb932f0c29c3"} Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.540721 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.548813 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.550273 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.563424 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b96847c47-hl7gr" podStartSLOduration=3.563401757 podStartE2EDuration="3.563401757s" podCreationTimestamp="2026-03-20 06:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:47.560591085 +0000 UTC m=+309.540465233" watchObservedRunningTime="2026-03-20 06:54:47.563401757 +0000 UTC m=+309.543275905" Mar 20 06:54:47 crc kubenswrapper[4971]: I0320 06:54:47.597119 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f6dbbfd8d-zs6fp" podStartSLOduration=3.597098525 podStartE2EDuration="3.597098525s" podCreationTimestamp="2026-03-20 06:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:54:47.5914096 +0000 UTC m=+309.571283738" watchObservedRunningTime="2026-03-20 06:54:47.597098525 +0000 UTC m=+309.576972663" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.530988 4971 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.532235 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.532291 4971 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533058 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32" gracePeriod=15 Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533119 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707" gracePeriod=15 Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533172 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574" gracePeriod=15 Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533225 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd" gracePeriod=15 Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533257 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d" gracePeriod=15 Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533804 4971 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.533943 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533959 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.533966 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533972 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.533981 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.533986 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.533998 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534004 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 06:54:51 crc 
kubenswrapper[4971]: E0320 06:54:51.534012 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534018 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.534026 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534032 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.534038 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534044 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.534050 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534055 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534172 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534193 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534201 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534209 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534219 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534227 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534236 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534243 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.534321 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534328 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534422 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.534521 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.534528 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.569553 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.714354 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.714872 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.714911 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.714935 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.714967 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.714989 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.715052 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.715088 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.789119 4971 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod1c00f1ed_a0fb_4e01_b022_b9fb5578ce33.slice/crio-c3cb0ace7084d810aa56ee4d522852fdba1926beacf1de4ce3fee2b832ed6b8b.scope\": RecentStats: unable to find data in memory cache]" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.815813 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.815879 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.815915 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.815923 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.815944 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.815962 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.815981 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.815981 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.816004 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.816008 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.816026 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.816045 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.816050 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.816091 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.817176 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.817193 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: I0320 06:54:51.865241 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:54:51 crc kubenswrapper[4971]: E0320 06:54:51.897059 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e7a39ccac00bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:54:51.896103099 +0000 UTC m=+313.875977237,LastTimestamp:2026-03-20 06:54:51.896103099 +0000 UTC m=+313.875977237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:54:52 crc 
kubenswrapper[4971]: I0320 06:54:52.571053 4971 generic.go:334] "Generic (PLEG): container finished" podID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" containerID="c3cb0ace7084d810aa56ee4d522852fdba1926beacf1de4ce3fee2b832ed6b8b" exitCode=0 Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.571198 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33","Type":"ContainerDied","Data":"c3cb0ace7084d810aa56ee4d522852fdba1926beacf1de4ce3fee2b832ed6b8b"} Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.572071 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.572357 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.572581 4971 status_manager.go:851] "Failed to get status for pod" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.574012 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"81160d8b9b99c40a04803f0ec0b34a5c0f5a96810fdfcb72e98762652ab14461"} Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.574051 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"29042c6a1e3b3bb59c17e0f0a32cd5f84c9c3d6e444a99f42b054446e884806d"} Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.574894 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.575301 4971 status_manager.go:851] "Failed to get status for pod" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.575496 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.577468 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.578915 4971 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.579703 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707" exitCode=0 Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.579754 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd" exitCode=0 Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.579769 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574" exitCode=0 Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.579779 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d" exitCode=2 Mar 20 06:54:52 crc kubenswrapper[4971]: I0320 06:54:52.579844 4971 scope.go:117] "RemoveContainer" containerID="6a748c7f52fe87f489e7b11239aafb2b6f9dc55d79f48fd08e10193349e61706" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.592658 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.934366 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.935623 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.936150 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.936330 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.936495 4971 status_manager.go:851] "Failed to get status for pod" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.943442 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.943539 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.943566 
4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.943583 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.943627 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.943748 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.943789 4971 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:53 crc kubenswrapper[4971]: I0320 06:54:53.943803 4971 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.045140 4971 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.055192 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.055872 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.056400 4971 status_manager.go:851] "Failed to get status for pod" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.057138 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.247815 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kube-api-access\") pod \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.247945 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-var-lock\") pod \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.248001 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kubelet-dir\") pod \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\" (UID: \"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33\") " Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.248136 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-var-lock" (OuterVolumeSpecName: "var-lock") pod "1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" (UID: "1c00f1ed-a0fb-4e01-b022-b9fb5578ce33"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.248226 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" (UID: "1c00f1ed-a0fb-4e01-b022-b9fb5578ce33"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.248397 4971 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.248415 4971 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.254072 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" (UID: "1c00f1ed-a0fb-4e01-b022-b9fb5578ce33"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.349767 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c00f1ed-a0fb-4e01-b022-b9fb5578ce33-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.614942 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.617332 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.617367 4971 scope.go:117] "RemoveContainer" containerID="8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.617463 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32" exitCode=0 Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.621558 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c00f1ed-a0fb-4e01-b022-b9fb5578ce33","Type":"ContainerDied","Data":"c1ad2548bbd0ae81cf6a179c3fa8947c7d9c7015e8c73c7cd58e8e1f1fdbb5b9"} Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.621661 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1ad2548bbd0ae81cf6a179c3fa8947c7d9c7015e8c73c7cd58e8e1f1fdbb5b9" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.621816 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.644556 4971 scope.go:117] "RemoveContainer" containerID="2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.645620 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.645961 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.646817 4971 status_manager.go:851] "Failed to get status for pod" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.660306 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.660838 4971 status_manager.go:851] "Failed to get status for pod" 
podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.661576 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.673136 4971 scope.go:117] "RemoveContainer" containerID="e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.698628 4971 scope.go:117] "RemoveContainer" containerID="c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.721376 4971 scope.go:117] "RemoveContainer" containerID="9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.743650 4971 scope.go:117] "RemoveContainer" containerID="c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.743921 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.772716 4971 scope.go:117] "RemoveContainer" containerID="8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707" Mar 20 06:54:54 crc kubenswrapper[4971]: E0320 06:54:54.773422 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\": container with ID starting with 8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707 not found: ID does not exist" containerID="8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.773502 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707"} err="failed to get container status \"8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\": rpc error: code = NotFound desc = could not find container \"8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707\": container with ID starting with 8d92316a4a93f3b34950803a33ac423d41d3467830591e102e6a6b492c5a5707 not found: ID does not exist" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.773583 4971 scope.go:117] "RemoveContainer" containerID="2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd" Mar 20 06:54:54 crc kubenswrapper[4971]: E0320 06:54:54.774174 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\": container with ID starting with 2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd not found: ID does not exist" containerID="2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.774288 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd"} err="failed to get container status \"2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\": rpc error: code = NotFound desc = could not find container \"2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd\": container with ID 
starting with 2468ac45fc80991cc210720a76afa06655f3fd7e91fab289eed1d20818e7fcbd not found: ID does not exist" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.774381 4971 scope.go:117] "RemoveContainer" containerID="e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574" Mar 20 06:54:54 crc kubenswrapper[4971]: E0320 06:54:54.774843 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\": container with ID starting with e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574 not found: ID does not exist" containerID="e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.774881 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574"} err="failed to get container status \"e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\": rpc error: code = NotFound desc = could not find container \"e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574\": container with ID starting with e45395fcc4ab1005b3b86243b30fecf35311439ef5593f777ba909a0b9359574 not found: ID does not exist" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.774918 4971 scope.go:117] "RemoveContainer" containerID="c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d" Mar 20 06:54:54 crc kubenswrapper[4971]: E0320 06:54:54.775260 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\": container with ID starting with c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d not found: ID does not exist" containerID="c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d" Mar 20 
06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.775316 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d"} err="failed to get container status \"c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\": rpc error: code = NotFound desc = could not find container \"c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d\": container with ID starting with c945250129ad9a7fdd1cc19ac9f4c8d3e73c504448c652d5e02cdc986624089d not found: ID does not exist" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.775355 4971 scope.go:117] "RemoveContainer" containerID="9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32" Mar 20 06:54:54 crc kubenswrapper[4971]: E0320 06:54:54.775738 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\": container with ID starting with 9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32 not found: ID does not exist" containerID="9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.775824 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32"} err="failed to get container status \"9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\": rpc error: code = NotFound desc = could not find container \"9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32\": container with ID starting with 9cfb7907b15336a74d333ce825f8325e80508112fa387f60627e13cc60dc6b32 not found: ID does not exist" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.775901 4971 scope.go:117] "RemoveContainer" 
containerID="c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc" Mar 20 06:54:54 crc kubenswrapper[4971]: E0320 06:54:54.776578 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\": container with ID starting with c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc not found: ID does not exist" containerID="c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc" Mar 20 06:54:54 crc kubenswrapper[4971]: I0320 06:54:54.776640 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc"} err="failed to get container status \"c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\": rpc error: code = NotFound desc = could not find container \"c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc\": container with ID starting with c55125c618002bc3423f7516449a015779fa48330512df552546282377b058dc not found: ID does not exist" Mar 20 06:54:56 crc kubenswrapper[4971]: E0320 06:54:56.413184 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e7a39ccac00bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 06:54:51.896103099 +0000 UTC m=+313.875977237,LastTimestamp:2026-03-20 06:54:51.896103099 +0000 UTC m=+313.875977237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 06:54:56 crc kubenswrapper[4971]: E0320 06:54:56.793554 4971 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" volumeName="registry-storage" Mar 20 06:54:58 crc kubenswrapper[4971]: I0320 06:54:58.739480 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:54:58 crc kubenswrapper[4971]: I0320 06:54:58.741663 4971 status_manager.go:851] "Failed to get status for pod" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:00 crc kubenswrapper[4971]: E0320 06:55:00.265656 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 
06:55:00 crc kubenswrapper[4971]: E0320 06:55:00.267594 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:00 crc kubenswrapper[4971]: E0320 06:55:00.268314 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:00 crc kubenswrapper[4971]: E0320 06:55:00.268815 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:00 crc kubenswrapper[4971]: E0320 06:55:00.269405 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:00 crc kubenswrapper[4971]: I0320 06:55:00.269699 4971 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 06:55:00 crc kubenswrapper[4971]: E0320 06:55:00.270705 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="200ms" Mar 20 06:55:00 crc kubenswrapper[4971]: E0320 06:55:00.471470 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: 
connection refused" interval="400ms" Mar 20 06:55:00 crc kubenswrapper[4971]: E0320 06:55:00.872959 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="800ms" Mar 20 06:55:01 crc kubenswrapper[4971]: E0320 06:55:01.673665 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="1.6s" Mar 20 06:55:01 crc kubenswrapper[4971]: I0320 06:55:01.731833 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:01 crc kubenswrapper[4971]: I0320 06:55:01.732765 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:01 crc kubenswrapper[4971]: I0320 06:55:01.733358 4971 status_manager.go:851] "Failed to get status for pod" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:01 crc kubenswrapper[4971]: I0320 06:55:01.760507 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:01 crc kubenswrapper[4971]: I0320 06:55:01.760566 4971 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:01 crc kubenswrapper[4971]: E0320 06:55:01.761365 4971 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:01 crc kubenswrapper[4971]: I0320 06:55:01.762068 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:01 crc kubenswrapper[4971]: W0320 06:55:01.792151 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-24fbc854a23bf79298a06742431ec598ed1daeb05730a66d40aa40fa8bd656e6 WatchSource:0}: Error finding container 24fbc854a23bf79298a06742431ec598ed1daeb05730a66d40aa40fa8bd656e6: Status 404 returned error can't find the container with id 24fbc854a23bf79298a06742431ec598ed1daeb05730a66d40aa40fa8bd656e6 Mar 20 06:55:02 crc kubenswrapper[4971]: I0320 06:55:02.700775 4971 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="233a31abcce8bdb5ca991c95679334c563f5af3ef9606c6127ee022eae5671f3" exitCode=0 Mar 20 06:55:02 crc kubenswrapper[4971]: I0320 06:55:02.700889 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"233a31abcce8bdb5ca991c95679334c563f5af3ef9606c6127ee022eae5671f3"} Mar 20 06:55:02 crc kubenswrapper[4971]: I0320 06:55:02.701430 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"24fbc854a23bf79298a06742431ec598ed1daeb05730a66d40aa40fa8bd656e6"} Mar 20 06:55:02 crc kubenswrapper[4971]: I0320 06:55:02.701918 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:02 crc kubenswrapper[4971]: I0320 06:55:02.701943 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:02 crc kubenswrapper[4971]: E0320 06:55:02.702642 4971 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:02 crc kubenswrapper[4971]: I0320 06:55:02.702698 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:02 crc kubenswrapper[4971]: I0320 06:55:02.703293 4971 status_manager.go:851] "Failed to get status for pod" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 06:55:03 crc kubenswrapper[4971]: I0320 06:55:03.712204 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3dc885c850d52086b0350df69bcbaaf08ccc43b72da3a02d623c30010ae27bda"} Mar 20 06:55:03 crc kubenswrapper[4971]: I0320 06:55:03.713707 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6f2491291793650277b6756960716fd80c87b0729fabc63987fe2ae6fcf28401"} Mar 20 06:55:03 crc kubenswrapper[4971]: I0320 06:55:03.713813 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77a95803a7bd002e7e5f04f7dd4cb0e6a6df9bc9e284a90c32d7e6a6baed2446"} Mar 20 06:55:03 crc kubenswrapper[4971]: I0320 06:55:03.713899 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"62a138f513596158d3ad506019762746f83c7ef4c10b76fcb1a0c09a23af3ecc"} Mar 20 06:55:04 crc kubenswrapper[4971]: I0320 06:55:04.737508 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:04 crc kubenswrapper[4971]: I0320 06:55:04.738172 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:04 crc kubenswrapper[4971]: I0320 06:55:04.743098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e31e4d35cdc8b6cf7467f2f90b173b632075d19e10ed8595d870c6187fcf3ac"} Mar 20 06:55:04 crc kubenswrapper[4971]: I0320 06:55:04.743255 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:05 crc kubenswrapper[4971]: I0320 06:55:05.746458 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 06:55:05 crc kubenswrapper[4971]: I0320 06:55:05.747156 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 06:55:05 crc kubenswrapper[4971]: I0320 06:55:05.747223 4971 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9" exitCode=1 Mar 20 06:55:05 crc kubenswrapper[4971]: I0320 06:55:05.747259 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9"} Mar 20 06:55:05 crc kubenswrapper[4971]: I0320 06:55:05.747738 4971 scope.go:117] "RemoveContainer" containerID="892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9" Mar 20 06:55:06 crc kubenswrapper[4971]: I0320 06:55:06.000476 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:55:06 crc kubenswrapper[4971]: I0320 06:55:06.293148 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:55:06 crc kubenswrapper[4971]: I0320 06:55:06.758281 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 
06:55:06 crc kubenswrapper[4971]: I0320 06:55:06.760263 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 06:55:06 crc kubenswrapper[4971]: I0320 06:55:06.760359 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"40c2d56981346e68b2ddb440b1abb4e9ca9532880dfc35ea0b1e47068e678919"} Mar 20 06:55:06 crc kubenswrapper[4971]: I0320 06:55:06.763244 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:06 crc kubenswrapper[4971]: I0320 06:55:06.763289 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:06 crc kubenswrapper[4971]: I0320 06:55:06.773119 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:09 crc kubenswrapper[4971]: I0320 06:55:09.776435 4971 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:09 crc kubenswrapper[4971]: I0320 06:55:09.789460 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:55:09 crc kubenswrapper[4971]: I0320 06:55:09.855043 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9cf81c5a-30c1-43e0-8404-1b691f4a909f" Mar 20 06:55:10 crc kubenswrapper[4971]: I0320 06:55:10.787426 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:10 crc kubenswrapper[4971]: I0320 06:55:10.789103 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:10 crc kubenswrapper[4971]: I0320 06:55:10.793776 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9cf81c5a-30c1-43e0-8404-1b691f4a909f" Mar 20 06:55:10 crc kubenswrapper[4971]: I0320 06:55:10.796166 4971 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://62a138f513596158d3ad506019762746f83c7ef4c10b76fcb1a0c09a23af3ecc" Mar 20 06:55:10 crc kubenswrapper[4971]: I0320 06:55:10.796353 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 06:55:11 crc kubenswrapper[4971]: I0320 06:55:11.792719 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:11 crc kubenswrapper[4971]: I0320 06:55:11.793057 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3f656dd-4e82-4469-9bd4-f02922f2649c" Mar 20 06:55:11 crc kubenswrapper[4971]: I0320 06:55:11.795523 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9cf81c5a-30c1-43e0-8404-1b691f4a909f" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.293492 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.293794 4971 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.293996 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.425845 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.434836 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.600653 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.625834 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.647001 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.660284 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 06:55:16 crc kubenswrapper[4971]: I0320 06:55:16.884252 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.113839 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.140758 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.193364 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.352566 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.412930 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.417599 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.626975 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.697577 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.866992 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 06:55:17 crc kubenswrapper[4971]: I0320 06:55:17.984353 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 06:55:17 crc 
kubenswrapper[4971]: I0320 06:55:17.998375 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 06:55:18 crc kubenswrapper[4971]: I0320 06:55:18.147784 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 06:55:18 crc kubenswrapper[4971]: I0320 06:55:18.224891 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 06:55:18 crc kubenswrapper[4971]: I0320 06:55:18.447922 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 06:55:18 crc kubenswrapper[4971]: I0320 06:55:18.553248 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 06:55:18 crc kubenswrapper[4971]: I0320 06:55:18.623795 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 06:55:18 crc kubenswrapper[4971]: I0320 06:55:18.828586 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 06:55:18 crc kubenswrapper[4971]: I0320 06:55:18.913129 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 06:55:19 crc kubenswrapper[4971]: I0320 06:55:19.027853 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 06:55:19 crc kubenswrapper[4971]: I0320 06:55:19.061204 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 06:55:19 crc kubenswrapper[4971]: I0320 06:55:19.203727 4971 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 06:55:19 crc kubenswrapper[4971]: I0320 06:55:19.350506 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 06:55:19 crc kubenswrapper[4971]: I0320 06:55:19.483016 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 06:55:19 crc kubenswrapper[4971]: I0320 06:55:19.542972 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 06:55:19 crc kubenswrapper[4971]: I0320 06:55:19.860546 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 06:55:19 crc kubenswrapper[4971]: I0320 06:55:19.903388 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 06:55:20 crc kubenswrapper[4971]: I0320 06:55:20.218213 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 06:55:20 crc kubenswrapper[4971]: I0320 06:55:20.277768 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 06:55:20 crc kubenswrapper[4971]: I0320 06:55:20.386311 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 06:55:20 crc kubenswrapper[4971]: I0320 06:55:20.392200 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 06:55:20 crc kubenswrapper[4971]: I0320 06:55:20.405261 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 06:55:20 crc kubenswrapper[4971]: I0320 
06:55:20.829166 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.231084 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.429507 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.456549 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.717843 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.723446 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.835554 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.838746 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.886845 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 06:55:21 crc kubenswrapper[4971]: I0320 06:55:21.996452 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.036723 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 06:55:22 crc kubenswrapper[4971]: 
I0320 06:55:22.102759 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.277335 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.313591 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.354291 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.489007 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.503544 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.548620 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.656905 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.779889 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 06:55:22 crc kubenswrapper[4971]: I0320 06:55:22.911107 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 06:55:23 crc kubenswrapper[4971]: I0320 06:55:23.044115 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 06:55:23 crc kubenswrapper[4971]: I0320 
06:55:23.122808 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 06:55:23 crc kubenswrapper[4971]: I0320 06:55:23.714579 4971 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 06:55:23 crc kubenswrapper[4971]: I0320 06:55:23.797381 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 06:55:23 crc kubenswrapper[4971]: I0320 06:55:23.833154 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 06:55:23 crc kubenswrapper[4971]: I0320 06:55:23.860190 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 06:55:23 crc kubenswrapper[4971]: I0320 06:55:23.870122 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.023113 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.220173 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.295702 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.334275 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.387408 4971 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.424775 4971 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.544250 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.662181 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.785839 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.874399 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 06:55:24 crc kubenswrapper[4971]: I0320 06:55:24.894247 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.077819 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.237590 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.309804 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.459404 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.467930 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.710416 4971 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.788790 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.826386 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.837754 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.957338 4971 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.963075 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=34.963031908 podStartE2EDuration="34.963031908s" podCreationTimestamp="2026-03-20 06:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:55:09.783525323 +0000 UTC m=+331.763399461" watchObservedRunningTime="2026-03-20 06:55:25.963031908 +0000 UTC m=+347.942906086" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.968452 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.968556 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.976681 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 
06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.981916 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 06:55:25 crc kubenswrapper[4971]: I0320 06:55:25.999466 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.999431928 podStartE2EDuration="16.999431928s" podCreationTimestamp="2026-03-20 06:55:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:55:25.9933922 +0000 UTC m=+347.973266408" watchObservedRunningTime="2026-03-20 06:55:25.999431928 +0000 UTC m=+347.979306096" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.025490 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.067917 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.118095 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.157664 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.182642 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.235365 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.293857 4971 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.293946 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.463437 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.493542 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.666020 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 06:55:26 crc kubenswrapper[4971]: I0320 06:55:26.947548 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.001773 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.174510 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.192653 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 
06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.193529 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.271050 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.460665 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.540928 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.594856 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.631492 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.632221 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.688530 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.729772 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.767859 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.785435 4971 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.795668 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.873079 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 06:55:27 crc kubenswrapper[4971]: I0320 06:55:27.955669 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.032379 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.036068 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.044459 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.314927 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.391522 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.488730 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.549467 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.590084 4971 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.590937 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.598252 4971 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.743321 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.744847 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.750972 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.764444 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 06:55:28 crc kubenswrapper[4971]: I0320 06:55:28.775733 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.070017 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.222319 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.412714 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.462424 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.464664 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.472719 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.580370 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.644367 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.741090 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.787468 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.834938 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.871715 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.883573 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 06:55:29 crc kubenswrapper[4971]: 
I0320 06:55:29.908217 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.933050 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 06:55:29 crc kubenswrapper[4971]: I0320 06:55:29.982194 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 06:55:30 crc kubenswrapper[4971]: I0320 06:55:30.091422 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 06:55:30 crc kubenswrapper[4971]: I0320 06:55:30.102213 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 06:55:30 crc kubenswrapper[4971]: I0320 06:55:30.209182 4971 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 06:55:30 crc kubenswrapper[4971]: I0320 06:55:30.373160 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 06:55:30 crc kubenswrapper[4971]: I0320 06:55:30.428009 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 06:55:30 crc kubenswrapper[4971]: I0320 06:55:30.575361 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 06:55:30 crc kubenswrapper[4971]: I0320 06:55:30.592054 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 06:55:30 crc kubenswrapper[4971]: I0320 06:55:30.641968 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 06:55:30 
crc kubenswrapper[4971]: I0320 06:55:30.792132 4971 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.053593 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.071338 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.105967 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.307940 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.339826 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.377369 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.510930 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.595968 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.675894 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.683460 4971 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.689377 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.763340 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.765769 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.767573 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.817294 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.820943 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 06:55:31 crc kubenswrapper[4971]: I0320 06:55:31.976883 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.003271 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.015373 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.164416 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 
06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.203889 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.206132 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.214319 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.378096 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.401487 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.405251 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.434058 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.439464 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.517328 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.519971 4971 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.520321 4971 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://81160d8b9b99c40a04803f0ec0b34a5c0f5a96810fdfcb72e98762652ab14461" gracePeriod=5 Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.588503 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.810942 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 06:55:32 crc kubenswrapper[4971]: I0320 06:55:32.864544 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.118969 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.223059 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.234370 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.349120 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.476885 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.557756 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.580283 4971 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.681529 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.717726 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.719153 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.722809 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.888486 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.892118 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 06:55:33 crc kubenswrapper[4971]: I0320 06:55:33.997694 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.075465 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.113659 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.247396 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 06:55:34 crc 
kubenswrapper[4971]: I0320 06:55:34.259243 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.480978 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.536259 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.619334 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.628837 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.696972 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.719334 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.730446 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.779068 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.864817 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 06:55:34 crc kubenswrapper[4971]: I0320 06:55:34.895780 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.084816 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.229436 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.317961 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.319384 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.406861 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.509213 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.565829 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.588652 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.588659 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.672444 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 06:55:35 
crc kubenswrapper[4971]: I0320 06:55:35.690481 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.782422 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.806869 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.868989 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.888450 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 06:55:35 crc kubenswrapper[4971]: I0320 06:55:35.937465 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.139889 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.294028 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.294540 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.294867 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.295134 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.296172 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"40c2d56981346e68b2ddb440b1abb4e9ca9532880dfc35ea0b1e47068e678919"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.296645 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://40c2d56981346e68b2ddb440b1abb4e9ca9532880dfc35ea0b1e47068e678919" gracePeriod=30 Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.440860 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.484459 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.658203 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.719597 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.811957 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.868021 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 06:55:36 crc kubenswrapper[4971]: I0320 06:55:36.886254 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 06:55:37 crc kubenswrapper[4971]: I0320 06:55:37.073244 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 06:55:37 crc kubenswrapper[4971]: I0320 06:55:37.304759 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 06:55:37 crc kubenswrapper[4971]: I0320 06:55:37.564713 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 06:55:37 crc kubenswrapper[4971]: I0320 06:55:37.754921 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 06:55:37 crc kubenswrapper[4971]: I0320 06:55:37.998986 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 06:55:37 crc kubenswrapper[4971]: I0320 06:55:37.999082 4971 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="81160d8b9b99c40a04803f0ec0b34a5c0f5a96810fdfcb72e98762652ab14461" exitCode=137 Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.072826 4971 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.110506 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.127730 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.127819 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.130891 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.268348 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.268473 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.268527 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 06:55:38 crc 
kubenswrapper[4971]: I0320 06:55:38.268587 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.268590 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.268681 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.268773 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.268849 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.268899 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.269289 4971 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.269327 4971 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.269349 4971 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.269371 4971 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.281353 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.364199 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.371150 4971 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.593411 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.727180 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.743680 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.744023 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.757661 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.759548 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.759599 4971 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3938a600-5c63-4b6b-b047-6efac62b8314" Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 
06:55:38.763386 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 06:55:38 crc kubenswrapper[4971]: I0320 06:55:38.763425 4971 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3938a600-5c63-4b6b-b047-6efac62b8314" Mar 20 06:55:39 crc kubenswrapper[4971]: I0320 06:55:39.009698 4971 scope.go:117] "RemoveContainer" containerID="81160d8b9b99c40a04803f0ec0b34a5c0f5a96810fdfcb72e98762652ab14461" Mar 20 06:55:39 crc kubenswrapper[4971]: I0320 06:55:39.009725 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 06:55:40 crc kubenswrapper[4971]: I0320 06:55:40.150763 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 06:56:07 crc kubenswrapper[4971]: I0320 06:56:07.211480 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 06:56:07 crc kubenswrapper[4971]: I0320 06:56:07.215377 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 06:56:07 crc kubenswrapper[4971]: I0320 06:56:07.216335 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 06:56:07 crc kubenswrapper[4971]: I0320 06:56:07.216434 4971 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="40c2d56981346e68b2ddb440b1abb4e9ca9532880dfc35ea0b1e47068e678919" exitCode=137 Mar 20 06:56:07 crc 
kubenswrapper[4971]: I0320 06:56:07.216485 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"40c2d56981346e68b2ddb440b1abb4e9ca9532880dfc35ea0b1e47068e678919"} Mar 20 06:56:07 crc kubenswrapper[4971]: I0320 06:56:07.216535 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9a16366a65a6847a33bbbcf274585dcb2377c11f7898b4bdb44d75c155acdacf"} Mar 20 06:56:07 crc kubenswrapper[4971]: I0320 06:56:07.216577 4971 scope.go:117] "RemoveContainer" containerID="892e128c2264ce375823bf22386e28688c7c163c1a8ab217c984a26dd61506d9" Mar 20 06:56:08 crc kubenswrapper[4971]: I0320 06:56:08.228083 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 06:56:08 crc kubenswrapper[4971]: I0320 06:56:08.230379 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 06:56:09 crc kubenswrapper[4971]: I0320 06:56:09.789161 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:56:16 crc kubenswrapper[4971]: I0320 06:56:16.293713 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:56:16 crc kubenswrapper[4971]: I0320 06:56:16.303544 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:56:19 crc kubenswrapper[4971]: I0320 
06:56:19.793095 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.327671 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566496-rzfqw"] Mar 20 06:56:26 crc kubenswrapper[4971]: E0320 06:56:26.328466 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" containerName="installer" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.328483 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" containerName="installer" Mar 20 06:56:26 crc kubenswrapper[4971]: E0320 06:56:26.328516 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.328525 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.328663 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c00f1ed-a0fb-4e01-b022-b9fb5578ce33" containerName="installer" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.328675 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.329138 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566496-rzfqw" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.331023 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.335924 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566496-rzfqw"] Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.337355 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.337748 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.433577 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/f2239507-962f-4ce1-82fa-6ca4a2ddbb93-kube-api-access-7tkcb\") pod \"auto-csr-approver-29566496-rzfqw\" (UID: \"f2239507-962f-4ce1-82fa-6ca4a2ddbb93\") " pod="openshift-infra/auto-csr-approver-29566496-rzfqw" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.534775 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/f2239507-962f-4ce1-82fa-6ca4a2ddbb93-kube-api-access-7tkcb\") pod \"auto-csr-approver-29566496-rzfqw\" (UID: \"f2239507-962f-4ce1-82fa-6ca4a2ddbb93\") " pod="openshift-infra/auto-csr-approver-29566496-rzfqw" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.557142 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/f2239507-962f-4ce1-82fa-6ca4a2ddbb93-kube-api-access-7tkcb\") pod \"auto-csr-approver-29566496-rzfqw\" (UID: \"f2239507-962f-4ce1-82fa-6ca4a2ddbb93\") " 
pod="openshift-infra/auto-csr-approver-29566496-rzfqw" Mar 20 06:56:26 crc kubenswrapper[4971]: I0320 06:56:26.645646 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566496-rzfqw" Mar 20 06:56:27 crc kubenswrapper[4971]: I0320 06:56:27.068936 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566496-rzfqw"] Mar 20 06:56:27 crc kubenswrapper[4971]: I0320 06:56:27.378709 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566496-rzfqw" event={"ID":"f2239507-962f-4ce1-82fa-6ca4a2ddbb93","Type":"ContainerStarted","Data":"3c59510731d8114372b791fcdef5fcbce32f727a174b40b98cef96f1426d4dc5"} Mar 20 06:56:28 crc kubenswrapper[4971]: I0320 06:56:28.386823 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566496-rzfqw" event={"ID":"f2239507-962f-4ce1-82fa-6ca4a2ddbb93","Type":"ContainerStarted","Data":"bb793c8c531d465ffcd9e9dbe5cca7a0bb21632b645211ed847bff198be4a22e"} Mar 20 06:56:28 crc kubenswrapper[4971]: I0320 06:56:28.407026 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566496-rzfqw" podStartSLOduration=1.548597738 podStartE2EDuration="2.406998227s" podCreationTimestamp="2026-03-20 06:56:26 +0000 UTC" firstStartedPulling="2026-03-20 06:56:27.086247961 +0000 UTC m=+409.066122109" lastFinishedPulling="2026-03-20 06:56:27.94464846 +0000 UTC m=+409.924522598" observedRunningTime="2026-03-20 06:56:28.405953229 +0000 UTC m=+410.385827377" watchObservedRunningTime="2026-03-20 06:56:28.406998227 +0000 UTC m=+410.386872365" Mar 20 06:56:29 crc kubenswrapper[4971]: I0320 06:56:29.406382 4971 generic.go:334] "Generic (PLEG): container finished" podID="f2239507-962f-4ce1-82fa-6ca4a2ddbb93" containerID="bb793c8c531d465ffcd9e9dbe5cca7a0bb21632b645211ed847bff198be4a22e" exitCode=0 Mar 20 06:56:29 crc 
kubenswrapper[4971]: I0320 06:56:29.406463 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566496-rzfqw" event={"ID":"f2239507-962f-4ce1-82fa-6ca4a2ddbb93","Type":"ContainerDied","Data":"bb793c8c531d465ffcd9e9dbe5cca7a0bb21632b645211ed847bff198be4a22e"}
Mar 20 06:56:30 crc kubenswrapper[4971]: I0320 06:56:30.702891 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566496-rzfqw"
Mar 20 06:56:30 crc kubenswrapper[4971]: I0320 06:56:30.795269 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/f2239507-962f-4ce1-82fa-6ca4a2ddbb93-kube-api-access-7tkcb\") pod \"f2239507-962f-4ce1-82fa-6ca4a2ddbb93\" (UID: \"f2239507-962f-4ce1-82fa-6ca4a2ddbb93\") "
Mar 20 06:56:30 crc kubenswrapper[4971]: I0320 06:56:30.836637 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2239507-962f-4ce1-82fa-6ca4a2ddbb93-kube-api-access-7tkcb" (OuterVolumeSpecName: "kube-api-access-7tkcb") pod "f2239507-962f-4ce1-82fa-6ca4a2ddbb93" (UID: "f2239507-962f-4ce1-82fa-6ca4a2ddbb93"). InnerVolumeSpecName "kube-api-access-7tkcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:56:30 crc kubenswrapper[4971]: I0320 06:56:30.896787 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/f2239507-962f-4ce1-82fa-6ca4a2ddbb93-kube-api-access-7tkcb\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:31 crc kubenswrapper[4971]: I0320 06:56:31.420867 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566496-rzfqw" event={"ID":"f2239507-962f-4ce1-82fa-6ca4a2ddbb93","Type":"ContainerDied","Data":"3c59510731d8114372b791fcdef5fcbce32f727a174b40b98cef96f1426d4dc5"}
Mar 20 06:56:31 crc kubenswrapper[4971]: I0320 06:56:31.420912 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c59510731d8114372b791fcdef5fcbce32f727a174b40b98cef96f1426d4dc5"
Mar 20 06:56:31 crc kubenswrapper[4971]: I0320 06:56:31.420979 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566496-rzfqw"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.283350 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2ljbf"]
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.284413 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2ljbf" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerName="registry-server" containerID="cri-o://f7cf8472aed6728bae861e17601f80f11b0faea13807991178f4f3913b3a2e6c" gracePeriod=30
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.300146 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v9l2k"]
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.300534 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v9l2k" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerName="registry-server" containerID="cri-o://116fedb9b26edd4b4aacd6e32cd947751c32a64fff4e5df1ffe669e760f2ad5e" gracePeriod=30
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.316481 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bj9mx"]
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.316819 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" podUID="d17d7658-ce33-4a3c-826a-1a0361e73629" containerName="marketplace-operator" containerID="cri-o://856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df" gracePeriod=30
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.323247 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9l5fd"]
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.323520 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9l5fd" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerName="registry-server" containerID="cri-o://f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726" gracePeriod=30
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.345037 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q4lsj"]
Mar 20 06:56:37 crc kubenswrapper[4971]: E0320 06:56:37.345399 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2239507-962f-4ce1-82fa-6ca4a2ddbb93" containerName="oc"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.345422 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2239507-962f-4ce1-82fa-6ca4a2ddbb93" containerName="oc"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.345565 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2239507-962f-4ce1-82fa-6ca4a2ddbb93" containerName="oc"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.346144 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.351168 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4w7kq"]
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.351581 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4w7kq" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="registry-server" containerID="cri-o://678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339" gracePeriod=30
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.359878 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q4lsj"]
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.399464 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c34c021e-b961-4f87-a749-9e023f6b8c93-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.400027 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64l4p\" (UniqueName: \"kubernetes.io/projected/c34c021e-b961-4f87-a749-9e023f6b8c93-kube-api-access-64l4p\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.400080 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c34c021e-b961-4f87-a749-9e023f6b8c93-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.501661 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64l4p\" (UniqueName: \"kubernetes.io/projected/c34c021e-b961-4f87-a749-9e023f6b8c93-kube-api-access-64l4p\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.501728 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c34c021e-b961-4f87-a749-9e023f6b8c93-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.501766 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c34c021e-b961-4f87-a749-9e023f6b8c93-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.503153 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c34c021e-b961-4f87-a749-9e023f6b8c93-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.510383 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c34c021e-b961-4f87-a749-9e023f6b8c93-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.520098 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64l4p\" (UniqueName: \"kubernetes.io/projected/c34c021e-b961-4f87-a749-9e023f6b8c93-kube-api-access-64l4p\") pod \"marketplace-operator-79b997595-q4lsj\" (UID: \"c34c021e-b961-4f87-a749-9e023f6b8c93\") " pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:37 crc kubenswrapper[4971]: I0320 06:56:37.668539 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.182333 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q4lsj"]
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.306829 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9l5fd"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.388043 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.397453 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4w7kq"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.442724 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-trusted-ca\") pod \"d17d7658-ce33-4a3c-826a-1a0361e73629\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.442789 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45bwf\" (UniqueName: \"kubernetes.io/projected/7cb5ccdd-a645-458a-9961-461bd99c3c79-kube-api-access-45bwf\") pod \"7cb5ccdd-a645-458a-9961-461bd99c3c79\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.442823 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqpn2\" (UniqueName: \"kubernetes.io/projected/d17d7658-ce33-4a3c-826a-1a0361e73629-kube-api-access-vqpn2\") pod \"d17d7658-ce33-4a3c-826a-1a0361e73629\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.443595 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-utilities\") pod \"7cb5ccdd-a645-458a-9961-461bd99c3c79\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.446516 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-utilities" (OuterVolumeSpecName: "utilities") pod "7cb5ccdd-a645-458a-9961-461bd99c3c79" (UID: "7cb5ccdd-a645-458a-9961-461bd99c3c79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.446634 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-operator-metrics\") pod \"d17d7658-ce33-4a3c-826a-1a0361e73629\" (UID: \"d17d7658-ce33-4a3c-826a-1a0361e73629\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.446681 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-catalog-content\") pod \"7cb5ccdd-a645-458a-9961-461bd99c3c79\" (UID: \"7cb5ccdd-a645-458a-9961-461bd99c3c79\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.447097 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d17d7658-ce33-4a3c-826a-1a0361e73629" (UID: "d17d7658-ce33-4a3c-826a-1a0361e73629"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.447586 4971 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.447658 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.455181 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb5ccdd-a645-458a-9961-461bd99c3c79-kube-api-access-45bwf" (OuterVolumeSpecName: "kube-api-access-45bwf") pod "7cb5ccdd-a645-458a-9961-461bd99c3c79" (UID: "7cb5ccdd-a645-458a-9961-461bd99c3c79"). InnerVolumeSpecName "kube-api-access-45bwf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.457482 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d17d7658-ce33-4a3c-826a-1a0361e73629" (UID: "d17d7658-ce33-4a3c-826a-1a0361e73629"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.459562 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17d7658-ce33-4a3c-826a-1a0361e73629-kube-api-access-vqpn2" (OuterVolumeSpecName: "kube-api-access-vqpn2") pod "d17d7658-ce33-4a3c-826a-1a0361e73629" (UID: "d17d7658-ce33-4a3c-826a-1a0361e73629"). InnerVolumeSpecName "kube-api-access-vqpn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.466399 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerID="f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726" exitCode=0
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.466478 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l5fd" event={"ID":"7cb5ccdd-a645-458a-9961-461bd99c3c79","Type":"ContainerDied","Data":"f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.466511 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9l5fd" event={"ID":"7cb5ccdd-a645-458a-9961-461bd99c3c79","Type":"ContainerDied","Data":"5924892065e4aaed905b963e626884c0a765023864f54bc0888e0c89bb1d5d33"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.466547 4971 scope.go:117] "RemoveContainer" containerID="f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.466740 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9l5fd"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.481013 4971 generic.go:334] "Generic (PLEG): container finished" podID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerID="678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339" exitCode=0
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.481110 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4w7kq" event={"ID":"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4","Type":"ContainerDied","Data":"678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.481145 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4w7kq" event={"ID":"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4","Type":"ContainerDied","Data":"ab601b960ab81dfd31f653f166f776eb6235c78316b509bc75f19e8757f35180"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.481224 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4w7kq"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.486524 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj" event={"ID":"c34c021e-b961-4f87-a749-9e023f6b8c93","Type":"ContainerStarted","Data":"7606cec3679bbf6b8913d8a9dcf57b38d5ee385fdfeb751f20f7c08b87f66f1d"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.486574 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj" event={"ID":"c34c021e-b961-4f87-a749-9e023f6b8c93","Type":"ContainerStarted","Data":"34792ddd95a2e3d0e07019b9be616364550b76184b593677ea998468f50bc2ae"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.487136 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.489743 4971 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q4lsj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body=
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.489971 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj" podUID="c34c021e-b961-4f87-a749-9e023f6b8c93" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.494341 4971 generic.go:334] "Generic (PLEG): container finished" podID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerID="f7cf8472aed6728bae861e17601f80f11b0faea13807991178f4f3913b3a2e6c" exitCode=0
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.494480 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbf" event={"ID":"264fdafe-b375-48d7-95d7-62d72c752b5d","Type":"ContainerDied","Data":"f7cf8472aed6728bae861e17601f80f11b0faea13807991178f4f3913b3a2e6c"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.494524 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbf" event={"ID":"264fdafe-b375-48d7-95d7-62d72c752b5d","Type":"ContainerDied","Data":"da571780fd9d1ad968bd2335431ed0b4691d9fa09c738f495a8e89b41f81a2ad"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.494567 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da571780fd9d1ad968bd2335431ed0b4691d9fa09c738f495a8e89b41f81a2ad"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.498132 4971 generic.go:334] "Generic (PLEG): container finished" podID="d17d7658-ce33-4a3c-826a-1a0361e73629" containerID="856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df" exitCode=0
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.498200 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.498218 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cb5ccdd-a645-458a-9961-461bd99c3c79" (UID: "7cb5ccdd-a645-458a-9961-461bd99c3c79"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.498251 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" event={"ID":"d17d7658-ce33-4a3c-826a-1a0361e73629","Type":"ContainerDied","Data":"856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.498285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bj9mx" event={"ID":"d17d7658-ce33-4a3c-826a-1a0361e73629","Type":"ContainerDied","Data":"964360ce05a4ce669a32572f50f65373f705c414575fbfeb10241253155b829e"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.508086 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj" podStartSLOduration=1.508064219 podStartE2EDuration="1.508064219s" podCreationTimestamp="2026-03-20 06:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:56:38.505446938 +0000 UTC m=+420.485321076" watchObservedRunningTime="2026-03-20 06:56:38.508064219 +0000 UTC m=+420.487938357"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.515809 4971 scope.go:117] "RemoveContainer" containerID="df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.515971 4971 generic.go:334] "Generic (PLEG): container finished" podID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerID="116fedb9b26edd4b4aacd6e32cd947751c32a64fff4e5df1ffe669e760f2ad5e" exitCode=0
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.516016 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9l2k" event={"ID":"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac","Type":"ContainerDied","Data":"116fedb9b26edd4b4aacd6e32cd947751c32a64fff4e5df1ffe669e760f2ad5e"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.516074 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9l2k" event={"ID":"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac","Type":"ContainerDied","Data":"b1812ad0fae5c04cf8bcdc3c6ef3f48f0baac2e21956d0426c7f58a3a63bb9e6"}
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.516093 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1812ad0fae5c04cf8bcdc3c6ef3f48f0baac2e21956d0426c7f58a3a63bb9e6"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.517675 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ljbf"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.517794 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9l2k"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.554772 4971 scope.go:117] "RemoveContainer" containerID="fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.554998 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-utilities\") pod \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.555055 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4mz5\" (UniqueName: \"kubernetes.io/projected/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-kube-api-access-l4mz5\") pod \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.555171 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-catalog-content\") pod \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\" (UID: \"0a637a4b-c5ec-46a5-8b5c-84a56a112ec4\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.555426 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cb5ccdd-a645-458a-9961-461bd99c3c79-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.555445 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45bwf\" (UniqueName: \"kubernetes.io/projected/7cb5ccdd-a645-458a-9961-461bd99c3c79-kube-api-access-45bwf\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.555457 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqpn2\" (UniqueName: \"kubernetes.io/projected/d17d7658-ce33-4a3c-826a-1a0361e73629-kube-api-access-vqpn2\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.555467 4971 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d17d7658-ce33-4a3c-826a-1a0361e73629-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.557124 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-utilities" (OuterVolumeSpecName: "utilities") pod "0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" (UID: "0a637a4b-c5ec-46a5-8b5c-84a56a112ec4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.562794 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-kube-api-access-l4mz5" (OuterVolumeSpecName: "kube-api-access-l4mz5") pod "0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" (UID: "0a637a4b-c5ec-46a5-8b5c-84a56a112ec4"). InnerVolumeSpecName "kube-api-access-l4mz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.587043 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bj9mx"]
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.596112 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bj9mx"]
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.598531 4971 scope.go:117] "RemoveContainer" containerID="f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726"
Mar 20 06:56:38 crc kubenswrapper[4971]: E0320 06:56:38.600358 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726\": container with ID starting with f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726 not found: ID does not exist" containerID="f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.600401 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726"} err="failed to get container status \"f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726\": rpc error: code = NotFound desc = could not find container \"f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726\": container with ID starting with f23ca37f77065dcd97cdf30d765a6bf59f2c8968a0a0c987fead046d7d339726 not found: ID does not exist"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.600433 4971 scope.go:117] "RemoveContainer" containerID="df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb"
Mar 20 06:56:38 crc kubenswrapper[4971]: E0320 06:56:38.600857 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb\": container with ID starting with df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb not found: ID does not exist" containerID="df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.600876 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb"} err="failed to get container status \"df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb\": rpc error: code = NotFound desc = could not find container \"df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb\": container with ID starting with df63d190e7088729d2022627e8412dac36b3ab035e48217f40dca613d036f4cb not found: ID does not exist"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.600891 4971 scope.go:117] "RemoveContainer" containerID="fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9"
Mar 20 06:56:38 crc kubenswrapper[4971]: E0320 06:56:38.602420 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9\": container with ID starting with fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9 not found: ID does not exist" containerID="fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.602447 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9"} err="failed to get container status \"fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9\": rpc error: code = NotFound desc = could not find container \"fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9\": container with ID starting with fd3dd99ec954b86266a54ec40c2a7102416297351bf28ce76faa24dc70d7dbc9 not found: ID does not exist"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.602463 4971 scope.go:117] "RemoveContainer" containerID="678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.617867 4971 scope.go:117] "RemoveContainer" containerID="67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.633715 4971 scope.go:117] "RemoveContainer" containerID="07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.650200 4971 scope.go:117] "RemoveContainer" containerID="678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339"
Mar 20 06:56:38 crc kubenswrapper[4971]: E0320 06:56:38.650673 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339\": container with ID starting with 678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339 not found: ID does not exist" containerID="678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.650709 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339"} err="failed to get container status \"678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339\": rpc error: code = NotFound desc = could not find container \"678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339\": container with ID starting with 678df1b724815327e4996b78dd609092217a98e85e5b6e63b55b67e5dd3ba339 not found: ID does not exist"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.650741 4971 scope.go:117] "RemoveContainer" containerID="67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0"
Mar 20 06:56:38 crc kubenswrapper[4971]: E0320 06:56:38.650974 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0\": container with ID starting with 67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0 not found: ID does not exist" containerID="67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.650998 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0"} err="failed to get container status \"67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0\": rpc error: code = NotFound desc = could not find container \"67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0\": container with ID starting with 67e4e52e7920ffef5b5a9e2617ba879991173b0b523eabce69df599f95d6bad0 not found: ID does not exist"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.651015 4971 scope.go:117] "RemoveContainer" containerID="07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049"
Mar 20 06:56:38 crc kubenswrapper[4971]: E0320 06:56:38.651396 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049\": container with ID starting with 07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049 not found: ID does not exist" containerID="07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.651419 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049"} err="failed to get container status \"07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049\": rpc error: code = NotFound desc = could not find container \"07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049\": container with ID starting with 07eb1b4d633a4abf7466cf937403cb2f33ba6a40cf7da38689bbef1b3f4f6049 not found: ID does not exist"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.651432 4971 scope.go:117] "RemoveContainer" containerID="856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df"
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.656914 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbnvj\" (UniqueName: \"kubernetes.io/projected/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-kube-api-access-hbnvj\") pod \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.657050 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-catalog-content\") pod \"264fdafe-b375-48d7-95d7-62d72c752b5d\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.657101 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-utilities\") pod \"264fdafe-b375-48d7-95d7-62d72c752b5d\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.657153 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-catalog-content\") pod \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.657188 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc876\" (UniqueName: \"kubernetes.io/projected/264fdafe-b375-48d7-95d7-62d72c752b5d-kube-api-access-rc876\") pod \"264fdafe-b375-48d7-95d7-62d72c752b5d\" (UID: \"264fdafe-b375-48d7-95d7-62d72c752b5d\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.657245 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-utilities\") pod \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\" (UID: \"83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac\") "
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.657555 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4mz5\" (UniqueName: \"kubernetes.io/projected/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-kube-api-access-l4mz5\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.657598 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.658669 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-utilities" (OuterVolumeSpecName: "utilities") pod "264fdafe-b375-48d7-95d7-62d72c752b5d" (UID: "264fdafe-b375-48d7-95d7-62d72c752b5d"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.658731 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-utilities" (OuterVolumeSpecName: "utilities") pod "83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" (UID: "83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.659370 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-kube-api-access-hbnvj" (OuterVolumeSpecName: "kube-api-access-hbnvj") pod "83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" (UID: "83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac"). InnerVolumeSpecName "kube-api-access-hbnvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.661046 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264fdafe-b375-48d7-95d7-62d72c752b5d-kube-api-access-rc876" (OuterVolumeSpecName: "kube-api-access-rc876") pod "264fdafe-b375-48d7-95d7-62d72c752b5d" (UID: "264fdafe-b375-48d7-95d7-62d72c752b5d"). InnerVolumeSpecName "kube-api-access-rc876". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.673197 4971 scope.go:117] "RemoveContainer" containerID="856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df" Mar 20 06:56:38 crc kubenswrapper[4971]: E0320 06:56:38.673796 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df\": container with ID starting with 856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df not found: ID does not exist" containerID="856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.673923 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df"} err="failed to get container status \"856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df\": rpc error: code = NotFound desc = could not find container \"856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df\": container with ID starting with 856d87d0848cb1c7117a4ecee26fc78281ed69f1d642a687df533058dc1114df not found: ID does not exist" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.726871 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "264fdafe-b375-48d7-95d7-62d72c752b5d" (UID: "264fdafe-b375-48d7-95d7-62d72c752b5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.736231 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" (UID: "0a637a4b-c5ec-46a5-8b5c-84a56a112ec4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.738211 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" (UID: "83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.745500 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17d7658-ce33-4a3c-826a-1a0361e73629" path="/var/lib/kubelet/pods/d17d7658-ce33-4a3c-826a-1a0361e73629/volumes" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.758721 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.758796 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.758820 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc876\" (UniqueName: \"kubernetes.io/projected/264fdafe-b375-48d7-95d7-62d72c752b5d-kube-api-access-rc876\") on node \"crc\" DevicePath 
\"\"" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.758834 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.758846 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbnvj\" (UniqueName: \"kubernetes.io/projected/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac-kube-api-access-hbnvj\") on node \"crc\" DevicePath \"\"" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.758862 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.758874 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/264fdafe-b375-48d7-95d7-62d72c752b5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.794760 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9l5fd"] Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.798066 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9l5fd"] Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.826863 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4w7kq"] Mar 20 06:56:38 crc kubenswrapper[4971]: I0320 06:56:38.829959 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4w7kq"] Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500404 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wztzb"] Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 
06:56:39.500640 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerName="extract-utilities" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500654 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerName="extract-utilities" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500664 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerName="extract-utilities" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500670 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerName="extract-utilities" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500679 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500686 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500696 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500712 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500724 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerName="extract-content" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500730 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerName="extract-content" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 
06:56:39.500740 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500745 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500754 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="extract-content" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500760 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="extract-content" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500768 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerName="extract-content" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500774 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerName="extract-content" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500782 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerName="extract-content" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500787 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerName="extract-content" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500798 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500803 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 
06:56:39.500812 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="extract-utilities" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500819 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="extract-utilities" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500845 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17d7658-ce33-4a3c-826a-1a0361e73629" containerName="marketplace-operator" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500851 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17d7658-ce33-4a3c-826a-1a0361e73629" containerName="marketplace-operator" Mar 20 06:56:39 crc kubenswrapper[4971]: E0320 06:56:39.500861 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerName="extract-utilities" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500867 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerName="extract-utilities" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500950 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500957 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500971 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17d7658-ce33-4a3c-826a-1a0361e73629" containerName="marketplace-operator" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.500979 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" containerName="registry-server" Mar 20 06:56:39 crc 
kubenswrapper[4971]: I0320 06:56:39.500987 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" containerName="registry-server" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.501725 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.504809 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.520759 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ljbf" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.520811 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9l2k" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.522350 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wztzb"] Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.525121 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q4lsj" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.558244 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2ljbf"] Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.561074 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2ljbf"] Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.569117 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7kl\" (UniqueName: \"kubernetes.io/projected/44282ec6-bf9c-4091-bb86-2e491e577076-kube-api-access-gj7kl\") pod 
\"redhat-marketplace-wztzb\" (UID: \"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.569164 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44282ec6-bf9c-4091-bb86-2e491e577076-utilities\") pod \"redhat-marketplace-wztzb\" (UID: \"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.569204 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44282ec6-bf9c-4091-bb86-2e491e577076-catalog-content\") pod \"redhat-marketplace-wztzb\" (UID: \"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.591196 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v9l2k"] Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.591359 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v9l2k"] Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.670198 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44282ec6-bf9c-4091-bb86-2e491e577076-catalog-content\") pod \"redhat-marketplace-wztzb\" (UID: \"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.670456 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7kl\" (UniqueName: \"kubernetes.io/projected/44282ec6-bf9c-4091-bb86-2e491e577076-kube-api-access-gj7kl\") pod \"redhat-marketplace-wztzb\" (UID: 
\"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.670499 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44282ec6-bf9c-4091-bb86-2e491e577076-utilities\") pod \"redhat-marketplace-wztzb\" (UID: \"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.671174 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44282ec6-bf9c-4091-bb86-2e491e577076-catalog-content\") pod \"redhat-marketplace-wztzb\" (UID: \"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.671749 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44282ec6-bf9c-4091-bb86-2e491e577076-utilities\") pod \"redhat-marketplace-wztzb\" (UID: \"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.691421 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7kl\" (UniqueName: \"kubernetes.io/projected/44282ec6-bf9c-4091-bb86-2e491e577076-kube-api-access-gj7kl\") pod \"redhat-marketplace-wztzb\" (UID: \"44282ec6-bf9c-4091-bb86-2e491e577076\") " pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:39 crc kubenswrapper[4971]: I0320 06:56:39.815718 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wztzb" Mar 20 06:56:40 crc kubenswrapper[4971]: I0320 06:56:40.237394 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wztzb"] Mar 20 06:56:40 crc kubenswrapper[4971]: W0320 06:56:40.244450 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44282ec6_bf9c_4091_bb86_2e491e577076.slice/crio-660b5179afd1f8058d6b02942db42b24de23ab70683b4464e360eff2831f21f7 WatchSource:0}: Error finding container 660b5179afd1f8058d6b02942db42b24de23ab70683b4464e360eff2831f21f7: Status 404 returned error can't find the container with id 660b5179afd1f8058d6b02942db42b24de23ab70683b4464e360eff2831f21f7 Mar 20 06:56:40 crc kubenswrapper[4971]: I0320 06:56:40.528548 4971 generic.go:334] "Generic (PLEG): container finished" podID="44282ec6-bf9c-4091-bb86-2e491e577076" containerID="d3889c1beaad707eef4141724ee40feb408ca270875d6a421ae534d4c61436ee" exitCode=0 Mar 20 06:56:40 crc kubenswrapper[4971]: I0320 06:56:40.528644 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wztzb" event={"ID":"44282ec6-bf9c-4091-bb86-2e491e577076","Type":"ContainerDied","Data":"d3889c1beaad707eef4141724ee40feb408ca270875d6a421ae534d4c61436ee"} Mar 20 06:56:40 crc kubenswrapper[4971]: I0320 06:56:40.529031 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wztzb" event={"ID":"44282ec6-bf9c-4091-bb86-2e491e577076","Type":"ContainerStarted","Data":"660b5179afd1f8058d6b02942db42b24de23ab70683b4464e360eff2831f21f7"} Mar 20 06:56:40 crc kubenswrapper[4971]: I0320 06:56:40.745925 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a637a4b-c5ec-46a5-8b5c-84a56a112ec4" path="/var/lib/kubelet/pods/0a637a4b-c5ec-46a5-8b5c-84a56a112ec4/volumes" Mar 20 06:56:40 crc kubenswrapper[4971]: I0320 
06:56:40.746724 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="264fdafe-b375-48d7-95d7-62d72c752b5d" path="/var/lib/kubelet/pods/264fdafe-b375-48d7-95d7-62d72c752b5d/volumes" Mar 20 06:56:40 crc kubenswrapper[4971]: I0320 06:56:40.747432 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb5ccdd-a645-458a-9961-461bd99c3c79" path="/var/lib/kubelet/pods/7cb5ccdd-a645-458a-9961-461bd99c3c79/volumes" Mar 20 06:56:40 crc kubenswrapper[4971]: I0320 06:56:40.748561 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac" path="/var/lib/kubelet/pods/83cbdad9-020f-4a1b-8f5f-ca7d56d3cfac/volumes" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.707867 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ktk7v"] Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.709244 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.711907 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.728147 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktk7v"] Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.800747 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghrhr\" (UniqueName: \"kubernetes.io/projected/03cfe007-db9f-4403-86b5-cd107e145c1a-kube-api-access-ghrhr\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.800798 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-utilities\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.800857 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-catalog-content\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.902195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-catalog-content\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.902276 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghrhr\" (UniqueName: \"kubernetes.io/projected/03cfe007-db9f-4403-86b5-cd107e145c1a-kube-api-access-ghrhr\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.902299 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-utilities\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.902779 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-utilities\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.903036 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-catalog-content\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.908053 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xp27t"] Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.911412 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xp27t" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.917956 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xp27t"] Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.920440 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 06:56:41 crc kubenswrapper[4971]: I0320 06:56:41.935388 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghrhr\" (UniqueName: \"kubernetes.io/projected/03cfe007-db9f-4403-86b5-cd107e145c1a-kube-api-access-ghrhr\") pod \"community-operators-ktk7v\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") " pod="openshift-marketplace/community-operators-ktk7v" Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.004130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d1cb53af-ca9d-494f-a994-dbe7117881db-catalog-content\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.004274 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkklk\" (UniqueName: \"kubernetes.io/projected/d1cb53af-ca9d-494f-a994-dbe7117881db-kube-api-access-bkklk\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.004323 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cb53af-ca9d-494f-a994-dbe7117881db-utilities\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.027640 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktk7v"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.105759 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cb53af-ca9d-494f-a994-dbe7117881db-catalog-content\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.105831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkklk\" (UniqueName: \"kubernetes.io/projected/d1cb53af-ca9d-494f-a994-dbe7117881db-kube-api-access-bkklk\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.105859 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cb53af-ca9d-494f-a994-dbe7117881db-utilities\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.106374 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cb53af-ca9d-494f-a994-dbe7117881db-utilities\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.106369 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cb53af-ca9d-494f-a994-dbe7117881db-catalog-content\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.141764 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkklk\" (UniqueName: \"kubernetes.io/projected/d1cb53af-ca9d-494f-a994-dbe7117881db-kube-api-access-bkklk\") pod \"redhat-operators-xp27t\" (UID: \"d1cb53af-ca9d-494f-a994-dbe7117881db\") " pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.292126 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.468490 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ktk7v"]
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.524712 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xp27t"]
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.556080 4971 generic.go:334] "Generic (PLEG): container finished" podID="44282ec6-bf9c-4091-bb86-2e491e577076" containerID="aa056d580727888b77e7d359ffccc0429b6936fc37dfa05261cb6c944a0e0de3" exitCode=0
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.556158 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wztzb" event={"ID":"44282ec6-bf9c-4091-bb86-2e491e577076","Type":"ContainerDied","Data":"aa056d580727888b77e7d359ffccc0429b6936fc37dfa05261cb6c944a0e0de3"}
Mar 20 06:56:42 crc kubenswrapper[4971]: I0320 06:56:42.562878 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktk7v" event={"ID":"03cfe007-db9f-4403-86b5-cd107e145c1a","Type":"ContainerStarted","Data":"979e0ace61d21c228304ee91272b3f5334ca82724b07bd2f68247f7690012a50"}
Mar 20 06:56:43 crc kubenswrapper[4971]: I0320 06:56:43.571263 4971 generic.go:334] "Generic (PLEG): container finished" podID="d1cb53af-ca9d-494f-a994-dbe7117881db" containerID="a859277aa75c2fbdc680b92c5961558f164c95b3f218c5bcb2b53b3d8d4d6f35" exitCode=0
Mar 20 06:56:43 crc kubenswrapper[4971]: I0320 06:56:43.571319 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp27t" event={"ID":"d1cb53af-ca9d-494f-a994-dbe7117881db","Type":"ContainerDied","Data":"a859277aa75c2fbdc680b92c5961558f164c95b3f218c5bcb2b53b3d8d4d6f35"}
Mar 20 06:56:43 crc kubenswrapper[4971]: I0320 06:56:43.571367 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp27t" event={"ID":"d1cb53af-ca9d-494f-a994-dbe7117881db","Type":"ContainerStarted","Data":"444d438f606b18b20819df8a02121fb9addafcf5ab3279d65fbc41cecebccbc9"}
Mar 20 06:56:43 crc kubenswrapper[4971]: I0320 06:56:43.579371 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wztzb" event={"ID":"44282ec6-bf9c-4091-bb86-2e491e577076","Type":"ContainerStarted","Data":"da0fa5da958caf02b9e852ca2c1d882bb6ad516bb82d7e7a44b82490f820755f"}
Mar 20 06:56:43 crc kubenswrapper[4971]: I0320 06:56:43.584108 4971 generic.go:334] "Generic (PLEG): container finished" podID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerID="4d4ff9df969b72ff953b8f5446293968bf1e6b252bcf392fb26864f556fecaeb" exitCode=0
Mar 20 06:56:43 crc kubenswrapper[4971]: I0320 06:56:43.584146 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktk7v" event={"ID":"03cfe007-db9f-4403-86b5-cd107e145c1a","Type":"ContainerDied","Data":"4d4ff9df969b72ff953b8f5446293968bf1e6b252bcf392fb26864f556fecaeb"}
Mar 20 06:56:43 crc kubenswrapper[4971]: I0320 06:56:43.629958 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wztzb" podStartSLOduration=1.850045709 podStartE2EDuration="4.629928955s" podCreationTimestamp="2026-03-20 06:56:39 +0000 UTC" firstStartedPulling="2026-03-20 06:56:40.5303386 +0000 UTC m=+422.510212738" lastFinishedPulling="2026-03-20 06:56:43.310221816 +0000 UTC m=+425.290095984" observedRunningTime="2026-03-20 06:56:43.624501018 +0000 UTC m=+425.604375166" watchObservedRunningTime="2026-03-20 06:56:43.629928955 +0000 UTC m=+425.609803113"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.107180 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m2l48"]
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.111477 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.114291 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m2l48"]
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.114723 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.237934 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4e2982-1f40-434d-b05f-baaa0c805845-catalog-content\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.238006 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfw2h\" (UniqueName: \"kubernetes.io/projected/ca4e2982-1f40-434d-b05f-baaa0c805845-kube-api-access-lfw2h\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.238398 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4e2982-1f40-434d-b05f-baaa0c805845-utilities\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.340062 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4e2982-1f40-434d-b05f-baaa0c805845-catalog-content\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.340627 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfw2h\" (UniqueName: \"kubernetes.io/projected/ca4e2982-1f40-434d-b05f-baaa0c805845-kube-api-access-lfw2h\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.340706 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4e2982-1f40-434d-b05f-baaa0c805845-utilities\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.340936 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca4e2982-1f40-434d-b05f-baaa0c805845-catalog-content\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.341283 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca4e2982-1f40-434d-b05f-baaa0c805845-utilities\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.370602 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfw2h\" (UniqueName: \"kubernetes.io/projected/ca4e2982-1f40-434d-b05f-baaa0c805845-kube-api-access-lfw2h\") pod \"certified-operators-m2l48\" (UID: \"ca4e2982-1f40-434d-b05f-baaa0c805845\") " pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.431306 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.595725 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktk7v" event={"ID":"03cfe007-db9f-4403-86b5-cd107e145c1a","Type":"ContainerStarted","Data":"7190523d839dd731cc4a99d3805728490092af7e318af62fc5597246e1201a8c"}
Mar 20 06:56:44 crc kubenswrapper[4971]: I0320 06:56:44.923292 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m2l48"]
Mar 20 06:56:44 crc kubenswrapper[4971]: W0320 06:56:44.941052 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca4e2982_1f40_434d_b05f_baaa0c805845.slice/crio-c19d9cee40d919c99a9167e36faa4b98ffd363d7620cbbb173a313f551791dc9 WatchSource:0}: Error finding container c19d9cee40d919c99a9167e36faa4b98ffd363d7620cbbb173a313f551791dc9: Status 404 returned error can't find the container with id c19d9cee40d919c99a9167e36faa4b98ffd363d7620cbbb173a313f551791dc9
Mar 20 06:56:45 crc kubenswrapper[4971]: I0320 06:56:45.611583 4971 generic.go:334] "Generic (PLEG): container finished" podID="ca4e2982-1f40-434d-b05f-baaa0c805845" containerID="8343ee0588801698887108f8e0ebd9b152596a7f6b86a981dbcec5339252b00a" exitCode=0
Mar 20 06:56:45 crc kubenswrapper[4971]: I0320 06:56:45.611727 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2l48" event={"ID":"ca4e2982-1f40-434d-b05f-baaa0c805845","Type":"ContainerDied","Data":"8343ee0588801698887108f8e0ebd9b152596a7f6b86a981dbcec5339252b00a"}
Mar 20 06:56:45 crc kubenswrapper[4971]: I0320 06:56:45.611797 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2l48" event={"ID":"ca4e2982-1f40-434d-b05f-baaa0c805845","Type":"ContainerStarted","Data":"c19d9cee40d919c99a9167e36faa4b98ffd363d7620cbbb173a313f551791dc9"}
Mar 20 06:56:45 crc kubenswrapper[4971]: I0320 06:56:45.614676 4971 generic.go:334] "Generic (PLEG): container finished" podID="d1cb53af-ca9d-494f-a994-dbe7117881db" containerID="bfb1f56709eb0b101862cbb49f6e96a6aee898e723a4909fe40c96c8d56f6160" exitCode=0
Mar 20 06:56:45 crc kubenswrapper[4971]: I0320 06:56:45.614745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp27t" event={"ID":"d1cb53af-ca9d-494f-a994-dbe7117881db","Type":"ContainerDied","Data":"bfb1f56709eb0b101862cbb49f6e96a6aee898e723a4909fe40c96c8d56f6160"}
Mar 20 06:56:45 crc kubenswrapper[4971]: I0320 06:56:45.617851 4971 generic.go:334] "Generic (PLEG): container finished" podID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerID="7190523d839dd731cc4a99d3805728490092af7e318af62fc5597246e1201a8c" exitCode=0
Mar 20 06:56:45 crc kubenswrapper[4971]: I0320 06:56:45.617889 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktk7v" event={"ID":"03cfe007-db9f-4403-86b5-cd107e145c1a","Type":"ContainerDied","Data":"7190523d839dd731cc4a99d3805728490092af7e318af62fc5597246e1201a8c"}
Mar 20 06:56:46 crc kubenswrapper[4971]: I0320 06:56:46.627720 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktk7v" event={"ID":"03cfe007-db9f-4403-86b5-cd107e145c1a","Type":"ContainerStarted","Data":"b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a"}
Mar 20 06:56:46 crc kubenswrapper[4971]: I0320 06:56:46.630165 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp27t" event={"ID":"d1cb53af-ca9d-494f-a994-dbe7117881db","Type":"ContainerStarted","Data":"b00d9e33aec962beb9a4bdea51a4c0ade31899fc36d6f28526170ebce3506e02"}
Mar 20 06:56:46 crc kubenswrapper[4971]: I0320 06:56:46.632361 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2l48" event={"ID":"ca4e2982-1f40-434d-b05f-baaa0c805845","Type":"ContainerStarted","Data":"bf4bfca7c9a1fe2b8f0ca16f57c9271bed53f681ae55ab20cb224b26a316d14e"}
Mar 20 06:56:46 crc kubenswrapper[4971]: I0320 06:56:46.655033 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ktk7v" podStartSLOduration=3.206472839 podStartE2EDuration="5.654977901s" podCreationTimestamp="2026-03-20 06:56:41 +0000 UTC" firstStartedPulling="2026-03-20 06:56:43.585482652 +0000 UTC m=+425.565356790" lastFinishedPulling="2026-03-20 06:56:46.033987704 +0000 UTC m=+428.013861852" observedRunningTime="2026-03-20 06:56:46.651798405 +0000 UTC m=+428.631672553" watchObservedRunningTime="2026-03-20 06:56:46.654977901 +0000 UTC m=+428.634852049"
Mar 20 06:56:46 crc kubenswrapper[4971]: I0320 06:56:46.675101 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xp27t" podStartSLOduration=3.031734843 podStartE2EDuration="5.675049944s" podCreationTimestamp="2026-03-20 06:56:41 +0000 UTC" firstStartedPulling="2026-03-20 06:56:43.574518226 +0000 UTC m=+425.554392404" lastFinishedPulling="2026-03-20 06:56:46.217833347 +0000 UTC m=+428.197707505" observedRunningTime="2026-03-20 06:56:46.672501315 +0000 UTC m=+428.652375463" watchObservedRunningTime="2026-03-20 06:56:46.675049944 +0000 UTC m=+428.654924082"
Mar 20 06:56:47 crc kubenswrapper[4971]: I0320 06:56:47.640856 4971 generic.go:334] "Generic (PLEG): container finished" podID="ca4e2982-1f40-434d-b05f-baaa0c805845" containerID="bf4bfca7c9a1fe2b8f0ca16f57c9271bed53f681ae55ab20cb224b26a316d14e" exitCode=0
Mar 20 06:56:47 crc kubenswrapper[4971]: I0320 06:56:47.641759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2l48" event={"ID":"ca4e2982-1f40-434d-b05f-baaa0c805845","Type":"ContainerDied","Data":"bf4bfca7c9a1fe2b8f0ca16f57c9271bed53f681ae55ab20cb224b26a316d14e"}
Mar 20 06:56:48 crc kubenswrapper[4971]: I0320 06:56:48.650348 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m2l48" event={"ID":"ca4e2982-1f40-434d-b05f-baaa0c805845","Type":"ContainerStarted","Data":"b81daddccea710da2cd489813ce81d07ba5b0b497cfbace94453cfde4968aa9a"}
Mar 20 06:56:48 crc kubenswrapper[4971]: I0320 06:56:48.674437 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m2l48" podStartSLOduration=2.163513997 podStartE2EDuration="4.674409236s" podCreationTimestamp="2026-03-20 06:56:44 +0000 UTC" firstStartedPulling="2026-03-20 06:56:45.614535647 +0000 UTC m=+427.594409785" lastFinishedPulling="2026-03-20 06:56:48.125430866 +0000 UTC m=+430.105305024" observedRunningTime="2026-03-20 06:56:48.671266091 +0000 UTC m=+430.651140239" watchObservedRunningTime="2026-03-20 06:56:48.674409236 +0000 UTC m=+430.654283384"
Mar 20 06:56:49 crc kubenswrapper[4971]: I0320 06:56:49.817155 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wztzb"
Mar 20 06:56:49 crc kubenswrapper[4971]: I0320 06:56:49.817220 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wztzb"
Mar 20 06:56:49 crc kubenswrapper[4971]: I0320 06:56:49.880980 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wztzb"
Mar 20 06:56:50 crc kubenswrapper[4971]: I0320 06:56:50.163152 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 06:56:50 crc kubenswrapper[4971]: I0320 06:56:50.163239 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 06:56:50 crc kubenswrapper[4971]: I0320 06:56:50.756998 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wztzb"
Mar 20 06:56:52 crc kubenswrapper[4971]: I0320 06:56:52.028106 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ktk7v"
Mar 20 06:56:52 crc kubenswrapper[4971]: I0320 06:56:52.028171 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ktk7v"
Mar 20 06:56:52 crc kubenswrapper[4971]: I0320 06:56:52.082993 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ktk7v"
Mar 20 06:56:52 crc kubenswrapper[4971]: I0320 06:56:52.292446 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:52 crc kubenswrapper[4971]: I0320 06:56:52.292519 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:56:52 crc kubenswrapper[4971]: I0320 06:56:52.742043 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ktk7v"
Mar 20 06:56:53 crc kubenswrapper[4971]: I0320 06:56:53.352420 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xp27t" podUID="d1cb53af-ca9d-494f-a994-dbe7117881db" containerName="registry-server" probeResult="failure" output=<
Mar 20 06:56:53 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 20 06:56:53 crc kubenswrapper[4971]: >
Mar 20 06:56:54 crc kubenswrapper[4971]: I0320 06:56:54.432257 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:54 crc kubenswrapper[4971]: I0320 06:56:54.433331 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:54 crc kubenswrapper[4971]: I0320 06:56:54.476630 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:54 crc kubenswrapper[4971]: I0320 06:56:54.759150 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m2l48"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.039696 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wk4fg"]
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.040924 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.063638 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wk4fg"]
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.159846 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg59b\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-kube-api-access-hg59b\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.159903 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-registry-certificates\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.159940 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.159968 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-registry-tls\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.159997 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.160128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-bound-sa-token\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.160197 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-trusted-ca\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.160414 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.193038 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.262652 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg59b\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-kube-api-access-hg59b\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.262761 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-registry-certificates\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.262822 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.262869 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-registry-tls\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.262915 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.262958 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-bound-sa-token\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.263944 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-trusted-ca\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.264986 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-registry-certificates\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.265554 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.265963 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-trusted-ca\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.276781 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-registry-tls\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.281165 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.288193 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-bound-sa-token\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.292101 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg59b\" (UniqueName: \"kubernetes.io/projected/a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b-kube-api-access-hg59b\") pod \"image-registry-66df7c8f76-wk4fg\" (UID: \"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b\") " pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.400664 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.679339 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wk4fg"]
Mar 20 06:56:58 crc kubenswrapper[4971]: I0320 06:56:58.748298 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg" event={"ID":"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b","Type":"ContainerStarted","Data":"549000c625c3dbf885cc1873f99869629b065cf76807caaf8eccb89a48b7c4b2"}
Mar 20 06:56:59 crc kubenswrapper[4971]: I0320 06:56:59.743819 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg" event={"ID":"a9af7c0d-8ba0-468f-b5d6-4d16cdb76b8b","Type":"ContainerStarted","Data":"d41650df13ce1089b9be9e7ef54e7381fa366e105976f6cc16891968d4841dce"}
Mar 20 06:56:59 crc kubenswrapper[4971]: I0320 06:56:59.744247 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:56:59 crc kubenswrapper[4971]: I0320 06:56:59.766007 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg" podStartSLOduration=1.765985911 podStartE2EDuration="1.765985911s" podCreationTimestamp="2026-03-20 06:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 06:56:59.763699699 +0000 UTC m=+441.743573877" watchObservedRunningTime="2026-03-20 06:56:59.765985911 +0000 UTC m=+441.745860049"
Mar 20 06:57:02 crc kubenswrapper[4971]: I0320 06:57:02.356438 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:57:02 crc kubenswrapper[4971]: I0320 06:57:02.413894 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xp27t"
Mar 20 06:57:18 crc kubenswrapper[4971]: I0320 06:57:18.415260 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wk4fg"
Mar 20 06:57:18 crc kubenswrapper[4971]: I0320 06:57:18.515034 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q8mn4"]
Mar 20 06:57:20 crc kubenswrapper[4971]: I0320 06:57:20.162296 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 06:57:20 crc kubenswrapper[4971]: I0320 06:57:20.162844 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 06:57:43 crc kubenswrapper[4971]: I0320 06:57:43.579374 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" podUID="4e2cb781-6f5e-493d-a811-b88b641fcda2" containerName="registry" containerID="cri-o://32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4" gracePeriod=30
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.017624 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.107562 4971 generic.go:334] "Generic (PLEG): container finished" podID="4e2cb781-6f5e-493d-a811-b88b641fcda2" containerID="32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4" exitCode=0
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.107625 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" event={"ID":"4e2cb781-6f5e-493d-a811-b88b641fcda2","Type":"ContainerDied","Data":"32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4"}
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.107656 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4" event={"ID":"4e2cb781-6f5e-493d-a811-b88b641fcda2","Type":"ContainerDied","Data":"26ab5aa38fae1ea488aa974054cab4a21d79c628df54ae22f8c7189dd6ae9304"}
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.107676 4971 scope.go:117] "RemoveContainer" containerID="32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4"
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.107804 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q8mn4"
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.130669 4971 scope.go:117] "RemoveContainer" containerID="32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4"
Mar 20 06:57:44 crc kubenswrapper[4971]: E0320 06:57:44.131104 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4\": container with ID starting with 32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4 not found: ID does not exist" containerID="32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4"
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.131140 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4"} err="failed to get container status \"32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4\": rpc error: code = NotFound desc = could not find container \"32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4\": container with ID starting with 32593b7b0e143baed9f8d1279ffc4059b088e0b8564179764076ed2ab59378f4 not found: ID does not exist"
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.189220 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e2cb781-6f5e-493d-a811-b88b641fcda2-installation-pull-secrets\") pod \"4e2cb781-6f5e-493d-a811-b88b641fcda2\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") "
Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.189259 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-certificates\") pod
\"4e2cb781-6f5e-493d-a811-b88b641fcda2\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.189292 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-bound-sa-token\") pod \"4e2cb781-6f5e-493d-a811-b88b641fcda2\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.189323 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-trusted-ca\") pod \"4e2cb781-6f5e-493d-a811-b88b641fcda2\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.190143 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4e2cb781-6f5e-493d-a811-b88b641fcda2" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.190180 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4e2cb781-6f5e-493d-a811-b88b641fcda2" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.190409 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-tls\") pod \"4e2cb781-6f5e-493d-a811-b88b641fcda2\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.190548 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4e2cb781-6f5e-493d-a811-b88b641fcda2\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.190593 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e2cb781-6f5e-493d-a811-b88b641fcda2-ca-trust-extracted\") pod \"4e2cb781-6f5e-493d-a811-b88b641fcda2\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.190649 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57dzz\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-kube-api-access-57dzz\") pod \"4e2cb781-6f5e-493d-a811-b88b641fcda2\" (UID: \"4e2cb781-6f5e-493d-a811-b88b641fcda2\") " Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.190914 4971 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.190940 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/4e2cb781-6f5e-493d-a811-b88b641fcda2-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.195256 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2cb781-6f5e-493d-a811-b88b641fcda2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4e2cb781-6f5e-493d-a811-b88b641fcda2" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.199053 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-kube-api-access-57dzz" (OuterVolumeSpecName: "kube-api-access-57dzz") pod "4e2cb781-6f5e-493d-a811-b88b641fcda2" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2"). InnerVolumeSpecName "kube-api-access-57dzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.199267 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4e2cb781-6f5e-493d-a811-b88b641fcda2" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.199749 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4e2cb781-6f5e-493d-a811-b88b641fcda2" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.202962 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4e2cb781-6f5e-493d-a811-b88b641fcda2" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.208277 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e2cb781-6f5e-493d-a811-b88b641fcda2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4e2cb781-6f5e-493d-a811-b88b641fcda2" (UID: "4e2cb781-6f5e-493d-a811-b88b641fcda2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.291910 4971 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.291962 4971 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e2cb781-6f5e-493d-a811-b88b641fcda2-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.291976 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57dzz\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-kube-api-access-57dzz\") on node \"crc\" DevicePath \"\"" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.291991 4971 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/4e2cb781-6f5e-493d-a811-b88b641fcda2-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.292008 4971 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e2cb781-6f5e-493d-a811-b88b641fcda2-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.445684 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q8mn4"] Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.451125 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q8mn4"] Mar 20 06:57:44 crc kubenswrapper[4971]: I0320 06:57:44.741411 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2cb781-6f5e-493d-a811-b88b641fcda2" path="/var/lib/kubelet/pods/4e2cb781-6f5e-493d-a811-b88b641fcda2/volumes" Mar 20 06:57:50 crc kubenswrapper[4971]: I0320 06:57:50.162678 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 06:57:50 crc kubenswrapper[4971]: I0320 06:57:50.163301 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 06:57:50 crc kubenswrapper[4971]: I0320 06:57:50.163378 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 06:57:50 crc kubenswrapper[4971]: 
I0320 06:57:50.164370 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ddd443d54f717c3c4871b4cb928359d37429421fd7c4e0ea4b56ed63d3ac0038"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 06:57:50 crc kubenswrapper[4971]: I0320 06:57:50.164494 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://ddd443d54f717c3c4871b4cb928359d37429421fd7c4e0ea4b56ed63d3ac0038" gracePeriod=600 Mar 20 06:57:51 crc kubenswrapper[4971]: I0320 06:57:51.166485 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="ddd443d54f717c3c4871b4cb928359d37429421fd7c4e0ea4b56ed63d3ac0038" exitCode=0 Mar 20 06:57:51 crc kubenswrapper[4971]: I0320 06:57:51.166590 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"ddd443d54f717c3c4871b4cb928359d37429421fd7c4e0ea4b56ed63d3ac0038"} Mar 20 06:57:51 crc kubenswrapper[4971]: I0320 06:57:51.166978 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"7ab1be4a624c5deceee25ffdac450b581a2b15b47c089d46af6f08d49779add2"} Mar 20 06:57:51 crc kubenswrapper[4971]: I0320 06:57:51.167004 4971 scope.go:117] "RemoveContainer" containerID="b6545ca2d913195f4ebed66a5c9050bc7198c2d2281fea7c3293a5d188beb2e4" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.156521 4971 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29566498-ps9kt"] Mar 20 06:58:00 crc kubenswrapper[4971]: E0320 06:58:00.159869 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2cb781-6f5e-493d-a811-b88b641fcda2" containerName="registry" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.160055 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2cb781-6f5e-493d-a811-b88b641fcda2" containerName="registry" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.160537 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2cb781-6f5e-493d-a811-b88b641fcda2" containerName="registry" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.161531 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566498-ps9kt" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.169035 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566498-ps9kt"] Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.214086 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.214119 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.214097 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.258325 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wcgq\" (UniqueName: \"kubernetes.io/projected/32828d9c-7153-40b0-872c-62b986ae0c39-kube-api-access-7wcgq\") pod \"auto-csr-approver-29566498-ps9kt\" (UID: \"32828d9c-7153-40b0-872c-62b986ae0c39\") " pod="openshift-infra/auto-csr-approver-29566498-ps9kt" Mar 20 06:58:00 
crc kubenswrapper[4971]: I0320 06:58:00.360685 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wcgq\" (UniqueName: \"kubernetes.io/projected/32828d9c-7153-40b0-872c-62b986ae0c39-kube-api-access-7wcgq\") pod \"auto-csr-approver-29566498-ps9kt\" (UID: \"32828d9c-7153-40b0-872c-62b986ae0c39\") " pod="openshift-infra/auto-csr-approver-29566498-ps9kt" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.398498 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wcgq\" (UniqueName: \"kubernetes.io/projected/32828d9c-7153-40b0-872c-62b986ae0c39-kube-api-access-7wcgq\") pod \"auto-csr-approver-29566498-ps9kt\" (UID: \"32828d9c-7153-40b0-872c-62b986ae0c39\") " pod="openshift-infra/auto-csr-approver-29566498-ps9kt" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.544066 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566498-ps9kt" Mar 20 06:58:00 crc kubenswrapper[4971]: I0320 06:58:00.858984 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566498-ps9kt"] Mar 20 06:58:01 crc kubenswrapper[4971]: I0320 06:58:01.256705 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566498-ps9kt" event={"ID":"32828d9c-7153-40b0-872c-62b986ae0c39","Type":"ContainerStarted","Data":"2c455d29924d06243d20b6837a8a9596079e38b0fecdcd44d978345a11b7038e"} Mar 20 06:58:02 crc kubenswrapper[4971]: I0320 06:58:02.268315 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566498-ps9kt" event={"ID":"32828d9c-7153-40b0-872c-62b986ae0c39","Type":"ContainerStarted","Data":"33723ccdab253f2b230302903a0643bb3062d17242c781857e2471d0eaad95c9"} Mar 20 06:58:02 crc kubenswrapper[4971]: I0320 06:58:02.290410 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29566498-ps9kt" podStartSLOduration=1.506292658 podStartE2EDuration="2.290391345s" podCreationTimestamp="2026-03-20 06:58:00 +0000 UTC" firstStartedPulling="2026-03-20 06:58:00.876387598 +0000 UTC m=+502.856261776" lastFinishedPulling="2026-03-20 06:58:01.660486325 +0000 UTC m=+503.640360463" observedRunningTime="2026-03-20 06:58:02.285728598 +0000 UTC m=+504.265602766" watchObservedRunningTime="2026-03-20 06:58:02.290391345 +0000 UTC m=+504.270265493" Mar 20 06:58:03 crc kubenswrapper[4971]: I0320 06:58:03.279247 4971 generic.go:334] "Generic (PLEG): container finished" podID="32828d9c-7153-40b0-872c-62b986ae0c39" containerID="33723ccdab253f2b230302903a0643bb3062d17242c781857e2471d0eaad95c9" exitCode=0 Mar 20 06:58:03 crc kubenswrapper[4971]: I0320 06:58:03.279303 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566498-ps9kt" event={"ID":"32828d9c-7153-40b0-872c-62b986ae0c39","Type":"ContainerDied","Data":"33723ccdab253f2b230302903a0643bb3062d17242c781857e2471d0eaad95c9"} Mar 20 06:58:04 crc kubenswrapper[4971]: I0320 06:58:04.610049 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566498-ps9kt" Mar 20 06:58:04 crc kubenswrapper[4971]: I0320 06:58:04.728527 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wcgq\" (UniqueName: \"kubernetes.io/projected/32828d9c-7153-40b0-872c-62b986ae0c39-kube-api-access-7wcgq\") pod \"32828d9c-7153-40b0-872c-62b986ae0c39\" (UID: \"32828d9c-7153-40b0-872c-62b986ae0c39\") " Mar 20 06:58:04 crc kubenswrapper[4971]: I0320 06:58:04.740469 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32828d9c-7153-40b0-872c-62b986ae0c39-kube-api-access-7wcgq" (OuterVolumeSpecName: "kube-api-access-7wcgq") pod "32828d9c-7153-40b0-872c-62b986ae0c39" (UID: "32828d9c-7153-40b0-872c-62b986ae0c39"). InnerVolumeSpecName "kube-api-access-7wcgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 06:58:04 crc kubenswrapper[4971]: I0320 06:58:04.830486 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wcgq\" (UniqueName: \"kubernetes.io/projected/32828d9c-7153-40b0-872c-62b986ae0c39-kube-api-access-7wcgq\") on node \"crc\" DevicePath \"\"" Mar 20 06:58:05 crc kubenswrapper[4971]: I0320 06:58:05.301356 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566498-ps9kt" event={"ID":"32828d9c-7153-40b0-872c-62b986ae0c39","Type":"ContainerDied","Data":"2c455d29924d06243d20b6837a8a9596079e38b0fecdcd44d978345a11b7038e"} Mar 20 06:58:05 crc kubenswrapper[4971]: I0320 06:58:05.302122 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c455d29924d06243d20b6837a8a9596079e38b0fecdcd44d978345a11b7038e" Mar 20 06:58:05 crc kubenswrapper[4971]: I0320 06:58:05.301513 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566498-ps9kt" Mar 20 06:58:05 crc kubenswrapper[4971]: I0320 06:58:05.370065 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566492-86snb"] Mar 20 06:58:05 crc kubenswrapper[4971]: I0320 06:58:05.376184 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566492-86snb"] Mar 20 06:58:06 crc kubenswrapper[4971]: I0320 06:58:06.744501 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e766c0e7-ca96-4f32-9e14-81a3d5bbd389" path="/var/lib/kubelet/pods/e766c0e7-ca96-4f32-9e14-81a3d5bbd389/volumes" Mar 20 06:59:39 crc kubenswrapper[4971]: I0320 06:59:39.125664 4971 scope.go:117] "RemoveContainer" containerID="f9a27fd69e2b216942baa90f4594ff32169f4f4bb3008b3e41f132bb22efa1c7" Mar 20 06:59:39 crc kubenswrapper[4971]: I0320 06:59:39.159520 4971 scope.go:117] "RemoveContainer" containerID="9e1fad1a4a953dace5c5dda053036316c7aa7e1a995eb1ed059b27498c585fa1" Mar 20 06:59:50 crc kubenswrapper[4971]: I0320 06:59:50.162410 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 06:59:50 crc kubenswrapper[4971]: I0320 06:59:50.163102 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.152845 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566500-qwvlj"] Mar 20 07:00:00 crc kubenswrapper[4971]: 
E0320 07:00:00.154228 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32828d9c-7153-40b0-872c-62b986ae0c39" containerName="oc" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.154268 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="32828d9c-7153-40b0-872c-62b986ae0c39" containerName="oc" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.154496 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="32828d9c-7153-40b0-872c-62b986ae0c39" containerName="oc" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.155274 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566500-qwvlj" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.158589 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.159072 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.163881 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.165160 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn"] Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.171378 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.175116 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.175375 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.182861 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566500-qwvlj"] Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.201848 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6da4e6ff-eae8-421a-81e8-694388cbb4d3-secret-volume\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.201925 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl926\" (UniqueName: \"kubernetes.io/projected/6da4e6ff-eae8-421a-81e8-694388cbb4d3-kube-api-access-bl926\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.201997 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twn8h\" (UniqueName: \"kubernetes.io/projected/6f1a8924-50a9-4115-85ac-e24d81d9693e-kube-api-access-twn8h\") pod \"auto-csr-approver-29566500-qwvlj\" (UID: \"6f1a8924-50a9-4115-85ac-e24d81d9693e\") " 
pod="openshift-infra/auto-csr-approver-29566500-qwvlj" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.202019 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6da4e6ff-eae8-421a-81e8-694388cbb4d3-config-volume\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.207406 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn"] Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.303050 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twn8h\" (UniqueName: \"kubernetes.io/projected/6f1a8924-50a9-4115-85ac-e24d81d9693e-kube-api-access-twn8h\") pod \"auto-csr-approver-29566500-qwvlj\" (UID: \"6f1a8924-50a9-4115-85ac-e24d81d9693e\") " pod="openshift-infra/auto-csr-approver-29566500-qwvlj" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.303099 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6da4e6ff-eae8-421a-81e8-694388cbb4d3-config-volume\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.303156 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6da4e6ff-eae8-421a-81e8-694388cbb4d3-secret-volume\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 
07:00:00.303199 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl926\" (UniqueName: \"kubernetes.io/projected/6da4e6ff-eae8-421a-81e8-694388cbb4d3-kube-api-access-bl926\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.304771 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6da4e6ff-eae8-421a-81e8-694388cbb4d3-config-volume\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.318529 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6da4e6ff-eae8-421a-81e8-694388cbb4d3-secret-volume\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.330038 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl926\" (UniqueName: \"kubernetes.io/projected/6da4e6ff-eae8-421a-81e8-694388cbb4d3-kube-api-access-bl926\") pod \"collect-profiles-29566500-lqrrn\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.333202 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twn8h\" (UniqueName: \"kubernetes.io/projected/6f1a8924-50a9-4115-85ac-e24d81d9693e-kube-api-access-twn8h\") pod \"auto-csr-approver-29566500-qwvlj\" (UID: \"6f1a8924-50a9-4115-85ac-e24d81d9693e\") " 
pod="openshift-infra/auto-csr-approver-29566500-qwvlj" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.502506 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566500-qwvlj" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.520956 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.786144 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566500-qwvlj"] Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.804144 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:00:00 crc kubenswrapper[4971]: I0320 07:00:00.834032 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn"] Mar 20 07:00:00 crc kubenswrapper[4971]: W0320 07:00:00.839321 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da4e6ff_eae8_421a_81e8_694388cbb4d3.slice/crio-789ec1a457f91fd93ef0712c83e5691f2677f95dde0b9a1b0ac87e9676c7e23d WatchSource:0}: Error finding container 789ec1a457f91fd93ef0712c83e5691f2677f95dde0b9a1b0ac87e9676c7e23d: Status 404 returned error can't find the container with id 789ec1a457f91fd93ef0712c83e5691f2677f95dde0b9a1b0ac87e9676c7e23d Mar 20 07:00:01 crc kubenswrapper[4971]: I0320 07:00:01.205453 4971 generic.go:334] "Generic (PLEG): container finished" podID="6da4e6ff-eae8-421a-81e8-694388cbb4d3" containerID="54c16493493392922f01a08812121dfe702a5cd7ce82156b39195a9f80a7e0bf" exitCode=0 Mar 20 07:00:01 crc kubenswrapper[4971]: I0320 07:00:01.205524 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" 
event={"ID":"6da4e6ff-eae8-421a-81e8-694388cbb4d3","Type":"ContainerDied","Data":"54c16493493392922f01a08812121dfe702a5cd7ce82156b39195a9f80a7e0bf"} Mar 20 07:00:01 crc kubenswrapper[4971]: I0320 07:00:01.205561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" event={"ID":"6da4e6ff-eae8-421a-81e8-694388cbb4d3","Type":"ContainerStarted","Data":"789ec1a457f91fd93ef0712c83e5691f2677f95dde0b9a1b0ac87e9676c7e23d"} Mar 20 07:00:01 crc kubenswrapper[4971]: I0320 07:00:01.207383 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566500-qwvlj" event={"ID":"6f1a8924-50a9-4115-85ac-e24d81d9693e","Type":"ContainerStarted","Data":"9c401fb676d7215212d64706401c17a2bcc47dae0b660934073be1091f01905a"} Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.454408 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.532471 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6da4e6ff-eae8-421a-81e8-694388cbb4d3-config-volume\") pod \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.532544 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6da4e6ff-eae8-421a-81e8-694388cbb4d3-secret-volume\") pod \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.532708 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl926\" (UniqueName: 
\"kubernetes.io/projected/6da4e6ff-eae8-421a-81e8-694388cbb4d3-kube-api-access-bl926\") pod \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\" (UID: \"6da4e6ff-eae8-421a-81e8-694388cbb4d3\") " Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.533480 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da4e6ff-eae8-421a-81e8-694388cbb4d3-config-volume" (OuterVolumeSpecName: "config-volume") pod "6da4e6ff-eae8-421a-81e8-694388cbb4d3" (UID: "6da4e6ff-eae8-421a-81e8-694388cbb4d3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.541349 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da4e6ff-eae8-421a-81e8-694388cbb4d3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6da4e6ff-eae8-421a-81e8-694388cbb4d3" (UID: "6da4e6ff-eae8-421a-81e8-694388cbb4d3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.541854 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da4e6ff-eae8-421a-81e8-694388cbb4d3-kube-api-access-bl926" (OuterVolumeSpecName: "kube-api-access-bl926") pod "6da4e6ff-eae8-421a-81e8-694388cbb4d3" (UID: "6da4e6ff-eae8-421a-81e8-694388cbb4d3"). InnerVolumeSpecName "kube-api-access-bl926". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.634424 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6da4e6ff-eae8-421a-81e8-694388cbb4d3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.634471 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6da4e6ff-eae8-421a-81e8-694388cbb4d3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:00:02 crc kubenswrapper[4971]: I0320 07:00:02.634491 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl926\" (UniqueName: \"kubernetes.io/projected/6da4e6ff-eae8-421a-81e8-694388cbb4d3-kube-api-access-bl926\") on node \"crc\" DevicePath \"\"" Mar 20 07:00:03 crc kubenswrapper[4971]: I0320 07:00:03.229717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" event={"ID":"6da4e6ff-eae8-421a-81e8-694388cbb4d3","Type":"ContainerDied","Data":"789ec1a457f91fd93ef0712c83e5691f2677f95dde0b9a1b0ac87e9676c7e23d"} Mar 20 07:00:03 crc kubenswrapper[4971]: I0320 07:00:03.229765 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="789ec1a457f91fd93ef0712c83e5691f2677f95dde0b9a1b0ac87e9676c7e23d" Mar 20 07:00:03 crc kubenswrapper[4971]: I0320 07:00:03.229823 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn" Mar 20 07:00:18 crc kubenswrapper[4971]: I0320 07:00:18.338848 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f1a8924-50a9-4115-85ac-e24d81d9693e" containerID="d761685f3e8e47bc450dba900f3ddc295cc90799b8183ad527be9d582da75158" exitCode=0 Mar 20 07:00:18 crc kubenswrapper[4971]: I0320 07:00:18.339797 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566500-qwvlj" event={"ID":"6f1a8924-50a9-4115-85ac-e24d81d9693e","Type":"ContainerDied","Data":"d761685f3e8e47bc450dba900f3ddc295cc90799b8183ad527be9d582da75158"} Mar 20 07:00:19 crc kubenswrapper[4971]: I0320 07:00:19.653021 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566500-qwvlj" Mar 20 07:00:19 crc kubenswrapper[4971]: I0320 07:00:19.798957 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twn8h\" (UniqueName: \"kubernetes.io/projected/6f1a8924-50a9-4115-85ac-e24d81d9693e-kube-api-access-twn8h\") pod \"6f1a8924-50a9-4115-85ac-e24d81d9693e\" (UID: \"6f1a8924-50a9-4115-85ac-e24d81d9693e\") " Mar 20 07:00:19 crc kubenswrapper[4971]: I0320 07:00:19.808494 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1a8924-50a9-4115-85ac-e24d81d9693e-kube-api-access-twn8h" (OuterVolumeSpecName: "kube-api-access-twn8h") pod "6f1a8924-50a9-4115-85ac-e24d81d9693e" (UID: "6f1a8924-50a9-4115-85ac-e24d81d9693e"). InnerVolumeSpecName "kube-api-access-twn8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:00:19 crc kubenswrapper[4971]: I0320 07:00:19.901145 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twn8h\" (UniqueName: \"kubernetes.io/projected/6f1a8924-50a9-4115-85ac-e24d81d9693e-kube-api-access-twn8h\") on node \"crc\" DevicePath \"\"" Mar 20 07:00:20 crc kubenswrapper[4971]: I0320 07:00:20.162687 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:00:20 crc kubenswrapper[4971]: I0320 07:00:20.162787 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:00:20 crc kubenswrapper[4971]: I0320 07:00:20.362713 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566500-qwvlj" event={"ID":"6f1a8924-50a9-4115-85ac-e24d81d9693e","Type":"ContainerDied","Data":"9c401fb676d7215212d64706401c17a2bcc47dae0b660934073be1091f01905a"} Mar 20 07:00:20 crc kubenswrapper[4971]: I0320 07:00:20.363067 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c401fb676d7215212d64706401c17a2bcc47dae0b660934073be1091f01905a" Mar 20 07:00:20 crc kubenswrapper[4971]: I0320 07:00:20.362873 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566500-qwvlj" Mar 20 07:00:20 crc kubenswrapper[4971]: I0320 07:00:20.743950 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566494-7drpt"] Mar 20 07:00:20 crc kubenswrapper[4971]: I0320 07:00:20.746382 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566494-7drpt"] Mar 20 07:00:22 crc kubenswrapper[4971]: I0320 07:00:22.745764 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38760c08-459a-4637-92a1-a1c2701532a9" path="/var/lib/kubelet/pods/38760c08-459a-4637-92a1-a1c2701532a9/volumes" Mar 20 07:00:39 crc kubenswrapper[4971]: I0320 07:00:39.196689 4971 scope.go:117] "RemoveContainer" containerID="87b1cbf2c46265829095cd0cec5e8fd3b8fd0923653ce3d06b332ca892c8fcc1" Mar 20 07:00:39 crc kubenswrapper[4971]: I0320 07:00:39.258401 4971 scope.go:117] "RemoveContainer" containerID="f7cf8472aed6728bae861e17601f80f11b0faea13807991178f4f3913b3a2e6c" Mar 20 07:00:39 crc kubenswrapper[4971]: I0320 07:00:39.284756 4971 scope.go:117] "RemoveContainer" containerID="556c498a26b4cd281aca3104fc11b7597318fd39b653dcbd18f5fdf4b79c6f79" Mar 20 07:00:39 crc kubenswrapper[4971]: I0320 07:00:39.316727 4971 scope.go:117] "RemoveContainer" containerID="a19aa3fbc9b4daa4dde72bfb37611062726d118915be79bcaef422bbb8a3b2c3" Mar 20 07:00:39 crc kubenswrapper[4971]: I0320 07:00:39.346882 4971 scope.go:117] "RemoveContainer" containerID="116fedb9b26edd4b4aacd6e32cd947751c32a64fff4e5df1ffe669e760f2ad5e" Mar 20 07:00:39 crc kubenswrapper[4971]: I0320 07:00:39.369019 4971 scope.go:117] "RemoveContainer" containerID="be0670ae7db133b8af1e304dc2821ff335cb40c1560ca24d5cbf0aa96842e67f" Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.163168 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.164054 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.164106 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.164821 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ab1be4a624c5deceee25ffdac450b581a2b15b47c089d46af6f08d49779add2"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.164886 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://7ab1be4a624c5deceee25ffdac450b581a2b15b47c089d46af6f08d49779add2" gracePeriod=600 Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.580461 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="7ab1be4a624c5deceee25ffdac450b581a2b15b47c089d46af6f08d49779add2" exitCode=0 Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.580546 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"7ab1be4a624c5deceee25ffdac450b581a2b15b47c089d46af6f08d49779add2"} Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.581266 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"d08c37f4d49289256792c5435adc7c946fcecf7a8b553aaa8277a80f90bfc57b"} Mar 20 07:00:50 crc kubenswrapper[4971]: I0320 07:00:50.581309 4971 scope.go:117] "RemoveContainer" containerID="ddd443d54f717c3c4871b4cb928359d37429421fd7c4e0ea4b56ed63d3ac0038" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.150649 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566502-k7wqs"] Mar 20 07:02:00 crc kubenswrapper[4971]: E0320 07:02:00.151434 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da4e6ff-eae8-421a-81e8-694388cbb4d3" containerName="collect-profiles" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.151455 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da4e6ff-eae8-421a-81e8-694388cbb4d3" containerName="collect-profiles" Mar 20 07:02:00 crc kubenswrapper[4971]: E0320 07:02:00.151492 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1a8924-50a9-4115-85ac-e24d81d9693e" containerName="oc" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.151504 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1a8924-50a9-4115-85ac-e24d81d9693e" containerName="oc" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.151723 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1a8924-50a9-4115-85ac-e24d81d9693e" containerName="oc" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.151741 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da4e6ff-eae8-421a-81e8-694388cbb4d3" 
containerName="collect-profiles" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.152293 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.155886 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.156656 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.156931 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.167198 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566502-k7wqs"] Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.255720 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvp7\" (UniqueName: \"kubernetes.io/projected/183ddb6f-4d37-4c8b-bbc8-3d152c9f968c-kube-api-access-7dvp7\") pod \"auto-csr-approver-29566502-k7wqs\" (UID: \"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c\") " pod="openshift-infra/auto-csr-approver-29566502-k7wqs" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.357833 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvp7\" (UniqueName: \"kubernetes.io/projected/183ddb6f-4d37-4c8b-bbc8-3d152c9f968c-kube-api-access-7dvp7\") pod \"auto-csr-approver-29566502-k7wqs\" (UID: \"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c\") " pod="openshift-infra/auto-csr-approver-29566502-k7wqs" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.392798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvp7\" (UniqueName: 
\"kubernetes.io/projected/183ddb6f-4d37-4c8b-bbc8-3d152c9f968c-kube-api-access-7dvp7\") pod \"auto-csr-approver-29566502-k7wqs\" (UID: \"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c\") " pod="openshift-infra/auto-csr-approver-29566502-k7wqs" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.472904 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" Mar 20 07:02:00 crc kubenswrapper[4971]: I0320 07:02:00.803055 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566502-k7wqs"] Mar 20 07:02:01 crc kubenswrapper[4971]: I0320 07:02:01.264745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" event={"ID":"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c","Type":"ContainerStarted","Data":"0fe745d68644d9af6e11b8fbea54a9eba3cab7ff7cb7f9a47ec1efb249f2163f"} Mar 20 07:02:02 crc kubenswrapper[4971]: I0320 07:02:02.274828 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" event={"ID":"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c","Type":"ContainerStarted","Data":"2e2c9ebd82302778b695f45805ace259226f22720b36a474e1b17992e988632e"} Mar 20 07:02:02 crc kubenswrapper[4971]: I0320 07:02:02.292737 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" podStartSLOduration=1.407691094 podStartE2EDuration="2.292716613s" podCreationTimestamp="2026-03-20 07:02:00 +0000 UTC" firstStartedPulling="2026-03-20 07:02:00.812866149 +0000 UTC m=+742.792740297" lastFinishedPulling="2026-03-20 07:02:01.697891638 +0000 UTC m=+743.677765816" observedRunningTime="2026-03-20 07:02:02.288312847 +0000 UTC m=+744.268186985" watchObservedRunningTime="2026-03-20 07:02:02.292716613 +0000 UTC m=+744.272590761" Mar 20 07:02:03 crc kubenswrapper[4971]: I0320 07:02:03.283351 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="183ddb6f-4d37-4c8b-bbc8-3d152c9f968c" containerID="2e2c9ebd82302778b695f45805ace259226f22720b36a474e1b17992e988632e" exitCode=0 Mar 20 07:02:03 crc kubenswrapper[4971]: I0320 07:02:03.283435 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" event={"ID":"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c","Type":"ContainerDied","Data":"2e2c9ebd82302778b695f45805ace259226f22720b36a474e1b17992e988632e"} Mar 20 07:02:04 crc kubenswrapper[4971]: I0320 07:02:04.628340 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" Mar 20 07:02:04 crc kubenswrapper[4971]: I0320 07:02:04.716587 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dvp7\" (UniqueName: \"kubernetes.io/projected/183ddb6f-4d37-4c8b-bbc8-3d152c9f968c-kube-api-access-7dvp7\") pod \"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c\" (UID: \"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c\") " Mar 20 07:02:04 crc kubenswrapper[4971]: I0320 07:02:04.723004 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183ddb6f-4d37-4c8b-bbc8-3d152c9f968c-kube-api-access-7dvp7" (OuterVolumeSpecName: "kube-api-access-7dvp7") pod "183ddb6f-4d37-4c8b-bbc8-3d152c9f968c" (UID: "183ddb6f-4d37-4c8b-bbc8-3d152c9f968c"). InnerVolumeSpecName "kube-api-access-7dvp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:02:04 crc kubenswrapper[4971]: I0320 07:02:04.818836 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dvp7\" (UniqueName: \"kubernetes.io/projected/183ddb6f-4d37-4c8b-bbc8-3d152c9f968c-kube-api-access-7dvp7\") on node \"crc\" DevicePath \"\"" Mar 20 07:02:05 crc kubenswrapper[4971]: I0320 07:02:05.297680 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" event={"ID":"183ddb6f-4d37-4c8b-bbc8-3d152c9f968c","Type":"ContainerDied","Data":"0fe745d68644d9af6e11b8fbea54a9eba3cab7ff7cb7f9a47ec1efb249f2163f"} Mar 20 07:02:05 crc kubenswrapper[4971]: I0320 07:02:05.297733 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fe745d68644d9af6e11b8fbea54a9eba3cab7ff7cb7f9a47ec1efb249f2163f" Mar 20 07:02:05 crc kubenswrapper[4971]: I0320 07:02:05.297808 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566502-k7wqs" Mar 20 07:02:05 crc kubenswrapper[4971]: I0320 07:02:05.359670 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566496-rzfqw"] Mar 20 07:02:05 crc kubenswrapper[4971]: I0320 07:02:05.364141 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566496-rzfqw"] Mar 20 07:02:06 crc kubenswrapper[4971]: I0320 07:02:06.743072 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2239507-962f-4ce1-82fa-6ca4a2ddbb93" path="/var/lib/kubelet/pods/f2239507-962f-4ce1-82fa-6ca4a2ddbb93/volumes" Mar 20 07:02:39 crc kubenswrapper[4971]: I0320 07:02:39.484925 4971 scope.go:117] "RemoveContainer" containerID="bb793c8c531d465ffcd9e9dbe5cca7a0bb21632b645211ed847bff198be4a22e" Mar 20 07:02:50 crc kubenswrapper[4971]: I0320 07:02:50.162314 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:02:50 crc kubenswrapper[4971]: I0320 07:02:50.163189 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:03:20 crc kubenswrapper[4971]: I0320 07:03:20.162375 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:03:20 crc kubenswrapper[4971]: I0320 07:03:20.163088 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:03:34 crc kubenswrapper[4971]: I0320 07:03:34.972109 4971 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 07:03:50 crc kubenswrapper[4971]: I0320 07:03:50.162826 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:03:50 crc kubenswrapper[4971]: I0320 07:03:50.163553 4971 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:03:50 crc kubenswrapper[4971]: I0320 07:03:50.163690 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:03:50 crc kubenswrapper[4971]: I0320 07:03:50.164699 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d08c37f4d49289256792c5435adc7c946fcecf7a8b553aaa8277a80f90bfc57b"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:03:50 crc kubenswrapper[4971]: I0320 07:03:50.164798 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://d08c37f4d49289256792c5435adc7c946fcecf7a8b553aaa8277a80f90bfc57b" gracePeriod=600 Mar 20 07:03:51 crc kubenswrapper[4971]: I0320 07:03:51.049931 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="d08c37f4d49289256792c5435adc7c946fcecf7a8b553aaa8277a80f90bfc57b" exitCode=0 Mar 20 07:03:51 crc kubenswrapper[4971]: I0320 07:03:51.050000 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"d08c37f4d49289256792c5435adc7c946fcecf7a8b553aaa8277a80f90bfc57b"} Mar 20 07:03:51 crc kubenswrapper[4971]: I0320 07:03:51.050335 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"b3da7217c29a31428564d7212c3a93202314dd80c28b100287c20a984795e9b1"} Mar 20 07:03:51 crc kubenswrapper[4971]: I0320 07:03:51.050364 4971 scope.go:117] "RemoveContainer" containerID="7ab1be4a624c5deceee25ffdac450b581a2b15b47c089d46af6f08d49779add2" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.195478 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566504-6mpbh"] Mar 20 07:04:00 crc kubenswrapper[4971]: E0320 07:04:00.196068 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183ddb6f-4d37-4c8b-bbc8-3d152c9f968c" containerName="oc" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.196079 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="183ddb6f-4d37-4c8b-bbc8-3d152c9f968c" containerName="oc" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.196173 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="183ddb6f-4d37-4c8b-bbc8-3d152c9f968c" containerName="oc" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.196524 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566504-6mpbh" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.205961 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566504-6mpbh"] Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.210468 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.210959 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.211254 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.296640 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88kpc\" (UniqueName: \"kubernetes.io/projected/132e6589-0f2b-4b4d-9059-eadbe86f6219-kube-api-access-88kpc\") pod \"auto-csr-approver-29566504-6mpbh\" (UID: \"132e6589-0f2b-4b4d-9059-eadbe86f6219\") " pod="openshift-infra/auto-csr-approver-29566504-6mpbh" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.398369 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88kpc\" (UniqueName: \"kubernetes.io/projected/132e6589-0f2b-4b4d-9059-eadbe86f6219-kube-api-access-88kpc\") pod \"auto-csr-approver-29566504-6mpbh\" (UID: \"132e6589-0f2b-4b4d-9059-eadbe86f6219\") " pod="openshift-infra/auto-csr-approver-29566504-6mpbh" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.417007 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88kpc\" (UniqueName: \"kubernetes.io/projected/132e6589-0f2b-4b4d-9059-eadbe86f6219-kube-api-access-88kpc\") pod \"auto-csr-approver-29566504-6mpbh\" (UID: \"132e6589-0f2b-4b4d-9059-eadbe86f6219\") " 
pod="openshift-infra/auto-csr-approver-29566504-6mpbh" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.523331 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566504-6mpbh" Mar 20 07:04:00 crc kubenswrapper[4971]: I0320 07:04:00.765067 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566504-6mpbh"] Mar 20 07:04:01 crc kubenswrapper[4971]: I0320 07:04:01.110483 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566504-6mpbh" event={"ID":"132e6589-0f2b-4b4d-9059-eadbe86f6219","Type":"ContainerStarted","Data":"aa73ef8e67a8fe77a7948337319ddb4216ce3e0b186a38f9ddaaacd1de94ae96"} Mar 20 07:04:02 crc kubenswrapper[4971]: I0320 07:04:02.116245 4971 generic.go:334] "Generic (PLEG): container finished" podID="132e6589-0f2b-4b4d-9059-eadbe86f6219" containerID="d18203a999ce539dd70c56ece8133269ba47a469beca44ebad7eac7b0328a465" exitCode=0 Mar 20 07:04:02 crc kubenswrapper[4971]: I0320 07:04:02.116282 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566504-6mpbh" event={"ID":"132e6589-0f2b-4b4d-9059-eadbe86f6219","Type":"ContainerDied","Data":"d18203a999ce539dd70c56ece8133269ba47a469beca44ebad7eac7b0328a465"} Mar 20 07:04:03 crc kubenswrapper[4971]: I0320 07:04:03.396773 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566504-6mpbh" Mar 20 07:04:03 crc kubenswrapper[4971]: I0320 07:04:03.539835 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88kpc\" (UniqueName: \"kubernetes.io/projected/132e6589-0f2b-4b4d-9059-eadbe86f6219-kube-api-access-88kpc\") pod \"132e6589-0f2b-4b4d-9059-eadbe86f6219\" (UID: \"132e6589-0f2b-4b4d-9059-eadbe86f6219\") " Mar 20 07:04:03 crc kubenswrapper[4971]: I0320 07:04:03.546345 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132e6589-0f2b-4b4d-9059-eadbe86f6219-kube-api-access-88kpc" (OuterVolumeSpecName: "kube-api-access-88kpc") pod "132e6589-0f2b-4b4d-9059-eadbe86f6219" (UID: "132e6589-0f2b-4b4d-9059-eadbe86f6219"). InnerVolumeSpecName "kube-api-access-88kpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:04:03 crc kubenswrapper[4971]: I0320 07:04:03.641734 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88kpc\" (UniqueName: \"kubernetes.io/projected/132e6589-0f2b-4b4d-9059-eadbe86f6219-kube-api-access-88kpc\") on node \"crc\" DevicePath \"\"" Mar 20 07:04:04 crc kubenswrapper[4971]: I0320 07:04:04.129371 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566504-6mpbh" event={"ID":"132e6589-0f2b-4b4d-9059-eadbe86f6219","Type":"ContainerDied","Data":"aa73ef8e67a8fe77a7948337319ddb4216ce3e0b186a38f9ddaaacd1de94ae96"} Mar 20 07:04:04 crc kubenswrapper[4971]: I0320 07:04:04.129412 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa73ef8e67a8fe77a7948337319ddb4216ce3e0b186a38f9ddaaacd1de94ae96" Mar 20 07:04:04 crc kubenswrapper[4971]: I0320 07:04:04.129447 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566504-6mpbh" Mar 20 07:04:04 crc kubenswrapper[4971]: I0320 07:04:04.468725 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566498-ps9kt"] Mar 20 07:04:04 crc kubenswrapper[4971]: I0320 07:04:04.471598 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566498-ps9kt"] Mar 20 07:04:04 crc kubenswrapper[4971]: I0320 07:04:04.743223 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32828d9c-7153-40b0-872c-62b986ae0c39" path="/var/lib/kubelet/pods/32828d9c-7153-40b0-872c-62b986ae0c39/volumes" Mar 20 07:04:39 crc kubenswrapper[4971]: I0320 07:04:39.592381 4971 scope.go:117] "RemoveContainer" containerID="33723ccdab253f2b230302903a0643bb3062d17242c781857e2471d0eaad95c9" Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.842906 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fhlwk"] Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.844515 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovn-controller" containerID="cri-o://f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703" gracePeriod=30 Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.844556 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="nbdb" containerID="cri-o://547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90" gracePeriod=30 Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.844684 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" 
containerName="sbdb" containerID="cri-o://4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77" gracePeriod=30 Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.844679 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308" gracePeriod=30 Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.844738 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="northd" containerID="cri-o://421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936" gracePeriod=30 Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.844796 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovn-acl-logging" containerID="cri-o://44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195" gracePeriod=30 Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.844786 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kube-rbac-proxy-node" containerID="cri-o://178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa" gracePeriod=30 Mar 20 07:05:01 crc kubenswrapper[4971]: I0320 07:05:01.918452 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" containerID="cri-o://e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" gracePeriod=30 Mar 20 07:05:02 crc 
kubenswrapper[4971]: I0320 07:05:02.203866 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/3.log" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.206541 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovn-acl-logging/0.log" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.207171 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovn-controller/0.log" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.207680 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254186 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-etc-openvswitch\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254238 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-ovn-kubernetes\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254277 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-env-overrides\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 
07:05:02.254356 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-systemd-units\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-node-log\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254385 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254415 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-config\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254448 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254477 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254536 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-node-log" (OuterVolumeSpecName: "node-log") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254588 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7vn4\" (UniqueName: \"kubernetes.io/projected/7f624a08-810b-4915-b3bf-1e19d3e6cace-kube-api-access-j7vn4\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254894 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-script-lib\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254928 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovn-node-metrics-cert\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: 
\"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.254948 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-slash\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255102 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-ovn\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255126 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-netd\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255175 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255209 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-slash" (OuterVolumeSpecName: "host-slash") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255245 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255266 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-kubelet\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255279 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255290 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-systemd\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255273 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255306 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255514 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255823 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-netns\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255852 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-var-lib-openvswitch\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). 
InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255988 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.255986 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256106 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-bin\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256132 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-openvswitch\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256160 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod 
"7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256190 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-log-socket\") pod \"7f624a08-810b-4915-b3bf-1e19d3e6cace\" (UID: \"7f624a08-810b-4915-b3bf-1e19d3e6cace\") " Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256204 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256290 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-log-socket" (OuterVolumeSpecName: "log-socket") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256741 4971 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256761 4971 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256773 4971 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256812 4971 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256827 4971 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256838 4971 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256849 4971 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256861 4971 
reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256899 4971 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256911 4971 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256922 4971 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256933 4971 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256943 4971 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256980 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.256991 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.257898 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.262475 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.262591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f624a08-810b-4915-b3bf-1e19d3e6cace-kube-api-access-j7vn4" (OuterVolumeSpecName: "kube-api-access-j7vn4") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "kube-api-access-j7vn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280245 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4vqnp"] Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280473 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kubecfg-setup" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280493 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kubecfg-setup" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280509 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="sbdb" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280518 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="sbdb" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280530 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280539 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280548 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kube-rbac-proxy-node" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280557 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kube-rbac-proxy-node" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280566 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" 
containerName="ovn-acl-logging" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280575 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovn-acl-logging" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280586 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="northd" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280594 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="northd" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280625 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280634 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280645 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280655 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280666 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132e6589-0f2b-4b4d-9059-eadbe86f6219" containerName="oc" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280674 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="132e6589-0f2b-4b4d-9059-eadbe86f6219" containerName="oc" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280688 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovn-controller" Mar 20 07:05:02 
crc kubenswrapper[4971]: I0320 07:05:02.280696 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovn-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280708 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280717 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280728 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="nbdb" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280737 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="nbdb" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280749 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280756 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.280770 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280778 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280910 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="132e6589-0f2b-4b4d-9059-eadbe86f6219" containerName="oc" Mar 20 07:05:02 crc 
kubenswrapper[4971]: I0320 07:05:02.280922 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280935 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280944 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="sbdb" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280953 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280964 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="northd" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280973 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovn-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280985 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="nbdb" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.280997 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="kube-rbac-proxy-node" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.281009 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.281019 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.281031 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovn-acl-logging" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.281205 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7f624a08-810b-4915-b3bf-1e19d3e6cace" (UID: "7f624a08-810b-4915-b3bf-1e19d3e6cace"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.281422 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerName="ovnkube-controller" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.286497 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.358544 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-log-socket\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.358831 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-systemd\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.358915 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-etc-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.358984 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5xq7\" (UniqueName: \"kubernetes.io/projected/dd26c3c1-c3bf-4a19-815b-ad14562be91c-kube-api-access-v5xq7\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359061 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-node-log\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359132 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359206 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovnkube-script-lib\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 
07:05:02.359303 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-cni-netd\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359388 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-slash\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359464 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359550 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359638 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovn-node-metrics-cert\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359718 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-ovn\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359797 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-systemd-units\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359864 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-cni-bin\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359930 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-env-overrides\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.359995 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-kubelet\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.360065 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovnkube-config\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.360141 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-var-lib-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.360213 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-run-netns\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.360339 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7vn4\" (UniqueName: \"kubernetes.io/projected/7f624a08-810b-4915-b3bf-1e19d3e6cace-kube-api-access-j7vn4\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.360428 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f624a08-810b-4915-b3bf-1e19d3e6cace-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.360482 4971 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.360537 4971 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.360592 4971 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f624a08-810b-4915-b3bf-1e19d3e6cace-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.462096 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-systemd\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.462303 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-systemd\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.462559 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-etc-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.462917 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xq7\" 
(UniqueName: \"kubernetes.io/projected/dd26c3c1-c3bf-4a19-815b-ad14562be91c-kube-api-access-v5xq7\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.462829 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-etc-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.463378 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-node-log\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.464029 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-node-log\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.464215 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.464479 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovnkube-script-lib\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.464699 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-cni-netd\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.464927 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-slash\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.465213 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.465394 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.464872 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-cni-netd\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.464414 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-run-ovn-kubernetes\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.465162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-slash\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.465851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.466077 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.466171 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovn-node-metrics-cert\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.467726 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-ovn\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.468051 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovnkube-script-lib\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.467833 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-run-ovn\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.468393 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-systemd-units\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.468475 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-systemd-units\") pod \"ovnkube-node-4vqnp\" (UID: 
\"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.468715 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-cni-bin\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.468951 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-cni-bin\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.469169 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-env-overrides\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.469295 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-kubelet\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.469369 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovnkube-config\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 
07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.469427 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-var-lib-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.469515 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-kubelet\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.469586 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-run-netns\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.469534 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-host-run-netns\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.469832 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-log-socket\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.470002 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-log-socket\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.470136 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd26c3c1-c3bf-4a19-815b-ad14562be91c-var-lib-openvswitch\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.470782 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovnkube-config\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.471569 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd26c3c1-c3bf-4a19-815b-ad14562be91c-ovn-node-metrics-cert\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.472454 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd26c3c1-c3bf-4a19-815b-ad14562be91c-env-overrides\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.490906 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5xq7\" (UniqueName: 
\"kubernetes.io/projected/dd26c3c1-c3bf-4a19-815b-ad14562be91c-kube-api-access-v5xq7\") pod \"ovnkube-node-4vqnp\" (UID: \"dd26c3c1-c3bf-4a19-815b-ad14562be91c\") " pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.555784 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovnkube-controller/3.log" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.559323 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovn-acl-logging/0.log" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.560217 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fhlwk_7f624a08-810b-4915-b3bf-1e19d3e6cace/ovn-controller/0.log" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.560936 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" exitCode=0 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.560986 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77" exitCode=0 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561002 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90" exitCode=0 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561015 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936" exitCode=0 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561029 
4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308" exitCode=0 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561041 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa" exitCode=0 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561053 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195" exitCode=143 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561065 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f624a08-810b-4915-b3bf-1e19d3e6cace" containerID="f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703" exitCode=143 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561126 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561156 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561196 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561217 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561233 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561255 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561272 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} Mar 20 07:05:02 crc 
kubenswrapper[4971]: I0320 07:05:02.561292 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561306 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561316 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561326 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561334 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561343 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561352 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561362 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} Mar 20 07:05:02 crc 
kubenswrapper[4971]: I0320 07:05:02.561372 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561386 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561400 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561411 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561420 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561429 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561438 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561447 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561455 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561465 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561474 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561483 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561496 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561510 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561522 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561533 4971 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561542 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561551 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561560 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561568 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561577 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561586 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561594 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.562369 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fhlwk" event={"ID":"7f624a08-810b-4915-b3bf-1e19d3e6cace","Type":"ContainerDied","Data":"8395fd76aef46e29db331cfcb866ba29a8fcb755dff4a6f9919b871fb3a5356f"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.561845 4971 scope.go:117] "RemoveContainer" containerID="e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564268 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564293 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564435 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564477 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564486 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564530 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564541 4971 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564550 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564558 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.564567 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.566971 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/2.log" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.568662 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/1.log" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.568746 4971 generic.go:334] "Generic (PLEG): container finished" podID="f11eaf57-f83a-4974-adb5-7a59b11555b0" containerID="acb2940eb72cee89ebb03b5dd72a025b5c78fc535358f2cae2578c588d6380f0" exitCode=2 Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.568819 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lwlhn" event={"ID":"f11eaf57-f83a-4974-adb5-7a59b11555b0","Type":"ContainerDied","Data":"acb2940eb72cee89ebb03b5dd72a025b5c78fc535358f2cae2578c588d6380f0"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.568852 4971 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec"} Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.569810 4971 scope.go:117] "RemoveContainer" containerID="acb2940eb72cee89ebb03b5dd72a025b5c78fc535358f2cae2578c588d6380f0" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.611114 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.616889 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.627269 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fhlwk"] Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.632897 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fhlwk"] Mar 20 07:05:02 crc kubenswrapper[4971]: W0320 07:05:02.647438 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd26c3c1_c3bf_4a19_815b_ad14562be91c.slice/crio-6ca526d52fde7b22fd8be8ce08097aaad9d5818573bbed60020bc974388e5fbc WatchSource:0}: Error finding container 6ca526d52fde7b22fd8be8ce08097aaad9d5818573bbed60020bc974388e5fbc: Status 404 returned error can't find the container with id 6ca526d52fde7b22fd8be8ce08097aaad9d5818573bbed60020bc974388e5fbc Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.659534 4971 scope.go:117] "RemoveContainer" containerID="4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.682848 4971 scope.go:117] "RemoveContainer" containerID="547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90" Mar 20 07:05:02 crc 
kubenswrapper[4971]: I0320 07:05:02.707801 4971 scope.go:117] "RemoveContainer" containerID="421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.739854 4971 scope.go:117] "RemoveContainer" containerID="20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.743710 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f624a08-810b-4915-b3bf-1e19d3e6cace" path="/var/lib/kubelet/pods/7f624a08-810b-4915-b3bf-1e19d3e6cace/volumes" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.757478 4971 scope.go:117] "RemoveContainer" containerID="178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.784508 4971 scope.go:117] "RemoveContainer" containerID="44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.832109 4971 scope.go:117] "RemoveContainer" containerID="f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.872528 4971 scope.go:117] "RemoveContainer" containerID="2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.892743 4971 scope.go:117] "RemoveContainer" containerID="e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.893270 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": container with ID starting with e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed not found: ID does not exist" containerID="e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 
07:05:02.893310 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} err="failed to get container status \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": rpc error: code = NotFound desc = could not find container \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": container with ID starting with e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.893336 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.893744 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": container with ID starting with d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e not found: ID does not exist" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.893838 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} err="failed to get container status \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": rpc error: code = NotFound desc = could not find container \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": container with ID starting with d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.893913 4971 scope.go:117] "RemoveContainer" containerID="4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77" Mar 20 07:05:02 crc 
kubenswrapper[4971]: E0320 07:05:02.894473 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": container with ID starting with 4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77 not found: ID does not exist" containerID="4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.894540 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} err="failed to get container status \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": rpc error: code = NotFound desc = could not find container \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": container with ID starting with 4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.894596 4971 scope.go:117] "RemoveContainer" containerID="547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.895075 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": container with ID starting with 547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90 not found: ID does not exist" containerID="547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.895103 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} err="failed to get container status 
\"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": rpc error: code = NotFound desc = could not find container \"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": container with ID starting with 547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.895121 4971 scope.go:117] "RemoveContainer" containerID="421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.895372 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": container with ID starting with 421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936 not found: ID does not exist" containerID="421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.895457 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} err="failed to get container status \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": rpc error: code = NotFound desc = could not find container \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": container with ID starting with 421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.895529 4971 scope.go:117] "RemoveContainer" containerID="20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.895861 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": container with ID starting with 20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308 not found: ID does not exist" containerID="20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.895886 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} err="failed to get container status \"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": rpc error: code = NotFound desc = could not find container \"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": container with ID starting with 20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.895902 4971 scope.go:117] "RemoveContainer" containerID="178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.896184 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": container with ID starting with 178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa not found: ID does not exist" containerID="178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.896285 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} err="failed to get container status \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": rpc error: code = NotFound desc = could not find container \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": container with ID 
starting with 178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.896351 4971 scope.go:117] "RemoveContainer" containerID="44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.896695 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": container with ID starting with 44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195 not found: ID does not exist" containerID="44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.896719 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} err="failed to get container status \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": rpc error: code = NotFound desc = could not find container \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": container with ID starting with 44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.896733 4971 scope.go:117] "RemoveContainer" containerID="f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.897067 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": container with ID starting with f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703 not found: ID does not exist" containerID="f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703" Mar 20 
07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.897096 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} err="failed to get container status \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": rpc error: code = NotFound desc = could not find container \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": container with ID starting with f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.897120 4971 scope.go:117] "RemoveContainer" containerID="2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd" Mar 20 07:05:02 crc kubenswrapper[4971]: E0320 07:05:02.897360 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": container with ID starting with 2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd not found: ID does not exist" containerID="2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.897439 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} err="failed to get container status \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": rpc error: code = NotFound desc = could not find container \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": container with ID starting with 2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.897501 4971 scope.go:117] "RemoveContainer" 
containerID="e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.897811 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} err="failed to get container status \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": rpc error: code = NotFound desc = could not find container \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": container with ID starting with e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.897884 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.898220 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} err="failed to get container status \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": rpc error: code = NotFound desc = could not find container \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": container with ID starting with d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.898246 4971 scope.go:117] "RemoveContainer" containerID="4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.898516 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} err="failed to get container status \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": rpc error: code = NotFound desc = could 
not find container \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": container with ID starting with 4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.898541 4971 scope.go:117] "RemoveContainer" containerID="547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.898797 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} err="failed to get container status \"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": rpc error: code = NotFound desc = could not find container \"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": container with ID starting with 547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.898820 4971 scope.go:117] "RemoveContainer" containerID="421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.899121 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} err="failed to get container status \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": rpc error: code = NotFound desc = could not find container \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": container with ID starting with 421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.899143 4971 scope.go:117] "RemoveContainer" containerID="20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 
07:05:02.899467 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} err="failed to get container status \"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": rpc error: code = NotFound desc = could not find container \"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": container with ID starting with 20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.899494 4971 scope.go:117] "RemoveContainer" containerID="178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.899959 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} err="failed to get container status \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": rpc error: code = NotFound desc = could not find container \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": container with ID starting with 178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.899982 4971 scope.go:117] "RemoveContainer" containerID="44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.900231 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} err="failed to get container status \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": rpc error: code = NotFound desc = could not find container \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": container with ID starting with 
44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.900256 4971 scope.go:117] "RemoveContainer" containerID="f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.900591 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} err="failed to get container status \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": rpc error: code = NotFound desc = could not find container \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": container with ID starting with f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.900630 4971 scope.go:117] "RemoveContainer" containerID="2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.900920 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} err="failed to get container status \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": rpc error: code = NotFound desc = could not find container \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": container with ID starting with 2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.900942 4971 scope.go:117] "RemoveContainer" containerID="e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.901220 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} err="failed to get container status \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": rpc error: code = NotFound desc = could not find container \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": container with ID starting with e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.901242 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.901517 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} err="failed to get container status \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": rpc error: code = NotFound desc = could not find container \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": container with ID starting with d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.901538 4971 scope.go:117] "RemoveContainer" containerID="4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.901910 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} err="failed to get container status \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": rpc error: code = NotFound desc = could not find container \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": container with ID starting with 4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77 not found: ID does not 
exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.901933 4971 scope.go:117] "RemoveContainer" containerID="547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.902245 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} err="failed to get container status \"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": rpc error: code = NotFound desc = could not find container \"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": container with ID starting with 547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.902269 4971 scope.go:117] "RemoveContainer" containerID="421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.902553 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} err="failed to get container status \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": rpc error: code = NotFound desc = could not find container \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": container with ID starting with 421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.902670 4971 scope.go:117] "RemoveContainer" containerID="20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.904034 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} err="failed to get container status 
\"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": rpc error: code = NotFound desc = could not find container \"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": container with ID starting with 20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.904132 4971 scope.go:117] "RemoveContainer" containerID="178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.904480 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} err="failed to get container status \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": rpc error: code = NotFound desc = could not find container \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": container with ID starting with 178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.904506 4971 scope.go:117] "RemoveContainer" containerID="44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.904751 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} err="failed to get container status \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": rpc error: code = NotFound desc = could not find container \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": container with ID starting with 44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.904775 4971 scope.go:117] "RemoveContainer" 
containerID="f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.905040 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} err="failed to get container status \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": rpc error: code = NotFound desc = could not find container \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": container with ID starting with f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.905066 4971 scope.go:117] "RemoveContainer" containerID="2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.905333 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} err="failed to get container status \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": rpc error: code = NotFound desc = could not find container \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": container with ID starting with 2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.905431 4971 scope.go:117] "RemoveContainer" containerID="e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.905779 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} err="failed to get container status \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": rpc error: code = NotFound desc = could 
not find container \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": container with ID starting with e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.905806 4971 scope.go:117] "RemoveContainer" containerID="d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.906079 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e"} err="failed to get container status \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": rpc error: code = NotFound desc = could not find container \"d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e\": container with ID starting with d333225cc7556a04bf2b3cf747cb0645ef0b657641dc10e0e91d59ecd504b63e not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.906176 4971 scope.go:117] "RemoveContainer" containerID="4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.906509 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77"} err="failed to get container status \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": rpc error: code = NotFound desc = could not find container \"4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77\": container with ID starting with 4691ea9fd75102b278413b4cf2d045ddad7f06b55243e481ec5ff36f6891cd77 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.906533 4971 scope.go:117] "RemoveContainer" containerID="547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 
07:05:02.906800 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90"} err="failed to get container status \"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": rpc error: code = NotFound desc = could not find container \"547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90\": container with ID starting with 547b376608a6a765aea3f53af2c12c6eea952bf8fcdb34be1a94fc79d7c84b90 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.906900 4971 scope.go:117] "RemoveContainer" containerID="421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.907215 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936"} err="failed to get container status \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": rpc error: code = NotFound desc = could not find container \"421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936\": container with ID starting with 421a7ad4b3dde0de7b751159a385832075d84609031c7def22b4aa02874d2936 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.907238 4971 scope.go:117] "RemoveContainer" containerID="20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.907471 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308"} err="failed to get container status \"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": rpc error: code = NotFound desc = could not find container \"20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308\": container with ID starting with 
20d0d42b820b46ff3173d56e03dedec6fd3fd360fe144467d8eb6f785d597308 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.907568 4971 scope.go:117] "RemoveContainer" containerID="178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.907797 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa"} err="failed to get container status \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": rpc error: code = NotFound desc = could not find container \"178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa\": container with ID starting with 178ce0bca73aef60296595c16024b62f27dc16d226d6ed8d102bb88b3b4477aa not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.907819 4971 scope.go:117] "RemoveContainer" containerID="44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.908095 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195"} err="failed to get container status \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": rpc error: code = NotFound desc = could not find container \"44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195\": container with ID starting with 44a1ae4297291e48978592b82e56c27206a61f8789bc5312c5dd22d06b34c195 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.908193 4971 scope.go:117] "RemoveContainer" containerID="f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.908511 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703"} err="failed to get container status \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": rpc error: code = NotFound desc = could not find container \"f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703\": container with ID starting with f30450686629f21239e3247309d8fab4f41bacff80fdc6d456d970dde1a7e703 not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.908534 4971 scope.go:117] "RemoveContainer" containerID="2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.908762 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd"} err="failed to get container status \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": rpc error: code = NotFound desc = could not find container \"2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd\": container with ID starting with 2d1acf1903daa267c3666101da094f18b60f477c6c665a463073ed19b243f4cd not found: ID does not exist" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.908784 4971 scope.go:117] "RemoveContainer" containerID="e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed" Mar 20 07:05:02 crc kubenswrapper[4971]: I0320 07:05:02.909040 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed"} err="failed to get container status \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": rpc error: code = NotFound desc = could not find container \"e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed\": container with ID starting with e085ffc1a6eb1489d8650aa6291b0fdfa7bd675c8cd3e76bf2c7939ebcf256ed not found: ID does not 
exist" Mar 20 07:05:03 crc kubenswrapper[4971]: I0320 07:05:03.578486 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/2.log" Mar 20 07:05:03 crc kubenswrapper[4971]: I0320 07:05:03.579306 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/1.log" Mar 20 07:05:03 crc kubenswrapper[4971]: I0320 07:05:03.579395 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lwlhn" event={"ID":"f11eaf57-f83a-4974-adb5-7a59b11555b0","Type":"ContainerStarted","Data":"28b7581e4e67d235785e0dd8c44585c137117ee6a9eb5991079ac15d118df59d"} Mar 20 07:05:03 crc kubenswrapper[4971]: I0320 07:05:03.581092 4971 generic.go:334] "Generic (PLEG): container finished" podID="dd26c3c1-c3bf-4a19-815b-ad14562be91c" containerID="899e50bda0ca10250b2b11c8c02b0b6a1bc47402df580ec6dca223715a9084fe" exitCode=0 Mar 20 07:05:03 crc kubenswrapper[4971]: I0320 07:05:03.581273 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerDied","Data":"899e50bda0ca10250b2b11c8c02b0b6a1bc47402df580ec6dca223715a9084fe"} Mar 20 07:05:03 crc kubenswrapper[4971]: I0320 07:05:03.581377 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"6ca526d52fde7b22fd8be8ce08097aaad9d5818573bbed60020bc974388e5fbc"} Mar 20 07:05:04 crc kubenswrapper[4971]: I0320 07:05:04.590566 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"111964c3011b824164b701b8cb1ae169280fdee8add5a87f3ab1094ae1f88c68"} Mar 20 07:05:04 crc kubenswrapper[4971]: I0320 
07:05:04.591176 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"9f7e39766676cc2dc9ebedf203c4b3900ee197db2a363aedb43ea8e8cf00f0e9"} Mar 20 07:05:04 crc kubenswrapper[4971]: I0320 07:05:04.591189 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"038da1758d04b15a82b942a9ad8860e0f9de6ed38afaf402291c69b5dada7cac"} Mar 20 07:05:04 crc kubenswrapper[4971]: I0320 07:05:04.591199 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"9eee81864bfc031d81ee5dd1b195ae6864584268c9519260a47eae016190dc9c"} Mar 20 07:05:04 crc kubenswrapper[4971]: I0320 07:05:04.591210 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"964d8bbb849bbcdae01274c0e4821630aa781f4ddec8e6b421bfd4349ebf23cb"} Mar 20 07:05:04 crc kubenswrapper[4971]: I0320 07:05:04.591221 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"a5742c66f170933227e15ea150757695e1f060bc73c770c341dc0be87ec86de8"} Mar 20 07:05:06 crc kubenswrapper[4971]: I0320 07:05:06.612856 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"7d477f7e1aeeb5718095f820bc063b87e09ede5d5a8cb9516408c3bcaf7c7ced"} Mar 20 07:05:10 crc kubenswrapper[4971]: I0320 07:05:10.645404 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" event={"ID":"dd26c3c1-c3bf-4a19-815b-ad14562be91c","Type":"ContainerStarted","Data":"8b8e3983d353e60befe210eeb48c2b8594877c7b923ead120b3e5581f3e7b47d"} Mar 20 07:05:10 crc kubenswrapper[4971]: I0320 07:05:10.645895 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:10 crc kubenswrapper[4971]: I0320 07:05:10.679036 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" podStartSLOduration=8.679017766 podStartE2EDuration="8.679017766s" podCreationTimestamp="2026-03-20 07:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:05:10.675200558 +0000 UTC m=+932.655074716" watchObservedRunningTime="2026-03-20 07:05:10.679017766 +0000 UTC m=+932.658891904" Mar 20 07:05:10 crc kubenswrapper[4971]: I0320 07:05:10.688788 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.100546 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c7m56"] Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.101554 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.113216 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c7m56"] Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.202925 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-catalog-content\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.202989 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkhjm\" (UniqueName: \"kubernetes.io/projected/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-kube-api-access-rkhjm\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.203201 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-utilities\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.305399 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-catalog-content\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.305757 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rkhjm\" (UniqueName: \"kubernetes.io/projected/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-kube-api-access-rkhjm\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.305808 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-utilities\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.306184 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-catalog-content\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.307126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-utilities\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.328438 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkhjm\" (UniqueName: \"kubernetes.io/projected/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-kube-api-access-rkhjm\") pod \"certified-operators-c7m56\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.416428 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: E0320 07:05:11.471197 4971 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-c7m56_openshift-marketplace_75cfa995-1433-4fa6-a3dd-a8346a19e0fb_0(7a305944ecd1ceeea6f4d9e2d8833b57d4fe7c45c8191e9da274a0e5e011b5f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:05:11 crc kubenswrapper[4971]: E0320 07:05:11.471283 4971 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-c7m56_openshift-marketplace_75cfa995-1433-4fa6-a3dd-a8346a19e0fb_0(7a305944ecd1ceeea6f4d9e2d8833b57d4fe7c45c8191e9da274a0e5e011b5f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: E0320 07:05:11.471304 4971 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-c7m56_openshift-marketplace_75cfa995-1433-4fa6-a3dd-a8346a19e0fb_0(7a305944ecd1ceeea6f4d9e2d8833b57d4fe7c45c8191e9da274a0e5e011b5f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: E0320 07:05:11.471343 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-c7m56_openshift-marketplace(75cfa995-1433-4fa6-a3dd-a8346a19e0fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-c7m56_openshift-marketplace(75cfa995-1433-4fa6-a3dd-a8346a19e0fb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-c7m56_openshift-marketplace_75cfa995-1433-4fa6-a3dd-a8346a19e0fb_0(7a305944ecd1ceeea6f4d9e2d8833b57d4fe7c45c8191e9da274a0e5e011b5f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/certified-operators-c7m56" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.651457 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.651523 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.652042 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.652317 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: E0320 07:05:11.690101 4971 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-c7m56_openshift-marketplace_75cfa995-1433-4fa6-a3dd-a8346a19e0fb_0(149243e3b22208b2e16ea317dc8a0966847a14baf1422c34ef9b71c368919f05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:05:11 crc kubenswrapper[4971]: E0320 07:05:11.690485 4971 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-c7m56_openshift-marketplace_75cfa995-1433-4fa6-a3dd-a8346a19e0fb_0(149243e3b22208b2e16ea317dc8a0966847a14baf1422c34ef9b71c368919f05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: E0320 07:05:11.690564 4971 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-c7m56_openshift-marketplace_75cfa995-1433-4fa6-a3dd-a8346a19e0fb_0(149243e3b22208b2e16ea317dc8a0966847a14baf1422c34ef9b71c368919f05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:11 crc kubenswrapper[4971]: E0320 07:05:11.690676 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-c7m56_openshift-marketplace(75cfa995-1433-4fa6-a3dd-a8346a19e0fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-c7m56_openshift-marketplace(75cfa995-1433-4fa6-a3dd-a8346a19e0fb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-c7m56_openshift-marketplace_75cfa995-1433-4fa6-a3dd-a8346a19e0fb_0(149243e3b22208b2e16ea317dc8a0966847a14baf1422c34ef9b71c368919f05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/certified-operators-c7m56" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.691349 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.796989 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-l5xpx"] Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.797617 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.799631 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.799856 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.800312 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.800645 4971 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-k8hht" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.806869 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-l5xpx"] Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.912261 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8e8977ac-1b93-49d2-8f42-20e64d909d7b-crc-storage\") pod \"crc-storage-crc-l5xpx\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.912311 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvgcm\" (UniqueName: \"kubernetes.io/projected/8e8977ac-1b93-49d2-8f42-20e64d909d7b-kube-api-access-tvgcm\") pod \"crc-storage-crc-l5xpx\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:11 crc kubenswrapper[4971]: I0320 07:05:11.912566 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8e8977ac-1b93-49d2-8f42-20e64d909d7b-node-mnt\") pod \"crc-storage-crc-l5xpx\" (UID: 
\"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.014042 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8e8977ac-1b93-49d2-8f42-20e64d909d7b-crc-storage\") pod \"crc-storage-crc-l5xpx\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.014090 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvgcm\" (UniqueName: \"kubernetes.io/projected/8e8977ac-1b93-49d2-8f42-20e64d909d7b-kube-api-access-tvgcm\") pod \"crc-storage-crc-l5xpx\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.014154 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8e8977ac-1b93-49d2-8f42-20e64d909d7b-node-mnt\") pod \"crc-storage-crc-l5xpx\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.014415 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8e8977ac-1b93-49d2-8f42-20e64d909d7b-node-mnt\") pod \"crc-storage-crc-l5xpx\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.014823 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8e8977ac-1b93-49d2-8f42-20e64d909d7b-crc-storage\") pod \"crc-storage-crc-l5xpx\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.030745 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvgcm\" (UniqueName: \"kubernetes.io/projected/8e8977ac-1b93-49d2-8f42-20e64d909d7b-kube-api-access-tvgcm\") pod \"crc-storage-crc-l5xpx\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.111827 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: E0320 07:05:12.133285 4971 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-l5xpx_crc-storage_8e8977ac-1b93-49d2-8f42-20e64d909d7b_0(65e42eda12976580a6bc4a54db97e50a7626708f0b6101d65fa281e25c7e2643): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:05:12 crc kubenswrapper[4971]: E0320 07:05:12.133357 4971 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-l5xpx_crc-storage_8e8977ac-1b93-49d2-8f42-20e64d909d7b_0(65e42eda12976580a6bc4a54db97e50a7626708f0b6101d65fa281e25c7e2643): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: E0320 07:05:12.133382 4971 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-l5xpx_crc-storage_8e8977ac-1b93-49d2-8f42-20e64d909d7b_0(65e42eda12976580a6bc4a54db97e50a7626708f0b6101d65fa281e25c7e2643): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: E0320 07:05:12.133426 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-l5xpx_crc-storage(8e8977ac-1b93-49d2-8f42-20e64d909d7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-l5xpx_crc-storage(8e8977ac-1b93-49d2-8f42-20e64d909d7b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-l5xpx_crc-storage_8e8977ac-1b93-49d2-8f42-20e64d909d7b_0(65e42eda12976580a6bc4a54db97e50a7626708f0b6101d65fa281e25c7e2643): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-l5xpx" podUID="8e8977ac-1b93-49d2-8f42-20e64d909d7b" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.657205 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: I0320 07:05:12.659968 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: E0320 07:05:12.708546 4971 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-l5xpx_crc-storage_8e8977ac-1b93-49d2-8f42-20e64d909d7b_0(c319317f342fe2f5adfcf4d9a13287a207ad922e2ae80adcbd1cacf976bdedb9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 07:05:12 crc kubenswrapper[4971]: E0320 07:05:12.708669 4971 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-l5xpx_crc-storage_8e8977ac-1b93-49d2-8f42-20e64d909d7b_0(c319317f342fe2f5adfcf4d9a13287a207ad922e2ae80adcbd1cacf976bdedb9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: E0320 07:05:12.708706 4971 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-l5xpx_crc-storage_8e8977ac-1b93-49d2-8f42-20e64d909d7b_0(c319317f342fe2f5adfcf4d9a13287a207ad922e2ae80adcbd1cacf976bdedb9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:12 crc kubenswrapper[4971]: E0320 07:05:12.708805 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-l5xpx_crc-storage(8e8977ac-1b93-49d2-8f42-20e64d909d7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-l5xpx_crc-storage(8e8977ac-1b93-49d2-8f42-20e64d909d7b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-l5xpx_crc-storage_8e8977ac-1b93-49d2-8f42-20e64d909d7b_0(c319317f342fe2f5adfcf4d9a13287a207ad922e2ae80adcbd1cacf976bdedb9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-l5xpx" podUID="8e8977ac-1b93-49d2-8f42-20e64d909d7b" Mar 20 07:05:24 crc kubenswrapper[4971]: I0320 07:05:24.731867 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:24 crc kubenswrapper[4971]: I0320 07:05:24.732887 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:25 crc kubenswrapper[4971]: I0320 07:05:25.174977 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c7m56"] Mar 20 07:05:25 crc kubenswrapper[4971]: I0320 07:05:25.749084 4971 generic.go:334] "Generic (PLEG): container finished" podID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerID="508a59d9682f5b44678dfc8da426063b34b583b8b8f17c15988232e4e9fbc0e3" exitCode=0 Mar 20 07:05:25 crc kubenswrapper[4971]: I0320 07:05:25.749164 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7m56" event={"ID":"75cfa995-1433-4fa6-a3dd-a8346a19e0fb","Type":"ContainerDied","Data":"508a59d9682f5b44678dfc8da426063b34b583b8b8f17c15988232e4e9fbc0e3"} Mar 20 07:05:25 crc kubenswrapper[4971]: I0320 07:05:25.749376 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7m56" event={"ID":"75cfa995-1433-4fa6-a3dd-a8346a19e0fb","Type":"ContainerStarted","Data":"6d6161a526514ba453567d7db99b0567df8a24681717fa3e768b62ab94254059"} Mar 20 07:05:25 crc kubenswrapper[4971]: I0320 07:05:25.753736 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:05:26 crc kubenswrapper[4971]: I0320 07:05:26.731859 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:26 crc kubenswrapper[4971]: I0320 07:05:26.732771 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:27 crc kubenswrapper[4971]: I0320 07:05:27.181160 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-l5xpx"] Mar 20 07:05:27 crc kubenswrapper[4971]: I0320 07:05:27.762659 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l5xpx" event={"ID":"8e8977ac-1b93-49d2-8f42-20e64d909d7b","Type":"ContainerStarted","Data":"cf7f060a81fd4c690f65b4a8aa6e1876ec67a5e9e7cd8ca37811bd6ca23e96a5"} Mar 20 07:05:27 crc kubenswrapper[4971]: I0320 07:05:27.765681 4971 generic.go:334] "Generic (PLEG): container finished" podID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerID="b57f3c872398945f93678f8f34d38196caefd9ac541620805336adcf893268d4" exitCode=0 Mar 20 07:05:27 crc kubenswrapper[4971]: I0320 07:05:27.765744 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7m56" event={"ID":"75cfa995-1433-4fa6-a3dd-a8346a19e0fb","Type":"ContainerDied","Data":"b57f3c872398945f93678f8f34d38196caefd9ac541620805336adcf893268d4"} Mar 20 07:05:28 crc kubenswrapper[4971]: I0320 07:05:28.774182 4971 generic.go:334] "Generic (PLEG): container finished" podID="8e8977ac-1b93-49d2-8f42-20e64d909d7b" containerID="1df822e674560188b6562afdacdcc6638e60307322f9a4bcc81dd72d14db5b00" exitCode=0 Mar 20 07:05:28 crc kubenswrapper[4971]: I0320 07:05:28.774291 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l5xpx" event={"ID":"8e8977ac-1b93-49d2-8f42-20e64d909d7b","Type":"ContainerDied","Data":"1df822e674560188b6562afdacdcc6638e60307322f9a4bcc81dd72d14db5b00"} Mar 20 07:05:29 crc kubenswrapper[4971]: I0320 07:05:29.783306 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7m56" event={"ID":"75cfa995-1433-4fa6-a3dd-a8346a19e0fb","Type":"ContainerStarted","Data":"d3aab746675f9b56b86297dc8e49aa6f2e9b9e31a865416db5c8144b58f971a1"} 
Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.037708 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.053633 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c7m56" podStartSLOduration=15.377001006 podStartE2EDuration="19.053579371s" podCreationTimestamp="2026-03-20 07:05:11 +0000 UTC" firstStartedPulling="2026-03-20 07:05:25.751716636 +0000 UTC m=+947.731590814" lastFinishedPulling="2026-03-20 07:05:29.428295041 +0000 UTC m=+951.408169179" observedRunningTime="2026-03-20 07:05:29.807177006 +0000 UTC m=+951.787051144" watchObservedRunningTime="2026-03-20 07:05:30.053579371 +0000 UTC m=+952.033453509" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.091751 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8e8977ac-1b93-49d2-8f42-20e64d909d7b-node-mnt\") pod \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.091830 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8e8977ac-1b93-49d2-8f42-20e64d909d7b-crc-storage\") pod \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.091869 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e8977ac-1b93-49d2-8f42-20e64d909d7b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "8e8977ac-1b93-49d2-8f42-20e64d909d7b" (UID: "8e8977ac-1b93-49d2-8f42-20e64d909d7b"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.092212 4971 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8e8977ac-1b93-49d2-8f42-20e64d909d7b-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.106927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8977ac-1b93-49d2-8f42-20e64d909d7b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "8e8977ac-1b93-49d2-8f42-20e64d909d7b" (UID: "8e8977ac-1b93-49d2-8f42-20e64d909d7b"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.192759 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvgcm\" (UniqueName: \"kubernetes.io/projected/8e8977ac-1b93-49d2-8f42-20e64d909d7b-kube-api-access-tvgcm\") pod \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\" (UID: \"8e8977ac-1b93-49d2-8f42-20e64d909d7b\") " Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.192967 4971 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8e8977ac-1b93-49d2-8f42-20e64d909d7b-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.197385 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8977ac-1b93-49d2-8f42-20e64d909d7b-kube-api-access-tvgcm" (OuterVolumeSpecName: "kube-api-access-tvgcm") pod "8e8977ac-1b93-49d2-8f42-20e64d909d7b" (UID: "8e8977ac-1b93-49d2-8f42-20e64d909d7b"). InnerVolumeSpecName "kube-api-access-tvgcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.293733 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvgcm\" (UniqueName: \"kubernetes.io/projected/8e8977ac-1b93-49d2-8f42-20e64d909d7b-kube-api-access-tvgcm\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.790647 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l5xpx" event={"ID":"8e8977ac-1b93-49d2-8f42-20e64d909d7b","Type":"ContainerDied","Data":"cf7f060a81fd4c690f65b4a8aa6e1876ec67a5e9e7cd8ca37811bd6ca23e96a5"} Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.790957 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf7f060a81fd4c690f65b4a8aa6e1876ec67a5e9e7cd8ca37811bd6ca23e96a5" Mar 20 07:05:30 crc kubenswrapper[4971]: I0320 07:05:30.790665 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l5xpx" Mar 20 07:05:31 crc kubenswrapper[4971]: I0320 07:05:31.417014 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:31 crc kubenswrapper[4971]: I0320 07:05:31.417061 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:31 crc kubenswrapper[4971]: I0320 07:05:31.468299 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:32 crc kubenswrapper[4971]: I0320 07:05:32.649568 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4vqnp" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.887823 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6"] Mar 20 07:05:37 crc kubenswrapper[4971]: E0320 07:05:37.888418 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8977ac-1b93-49d2-8f42-20e64d909d7b" containerName="storage" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.888437 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8977ac-1b93-49d2-8f42-20e64d909d7b" containerName="storage" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.888653 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8977ac-1b93-49d2-8f42-20e64d909d7b" containerName="storage" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.889864 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.892119 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb5mv\" (UniqueName: \"kubernetes.io/projected/cd481b29-3182-4bea-bc4d-60f01f877aa3-kube-api-access-fb5mv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.892178 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.892308 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.893013 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.898309 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6"] Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.993669 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.994160 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb5mv\" (UniqueName: \"kubernetes.io/projected/cd481b29-3182-4bea-bc4d-60f01f877aa3-kube-api-access-fb5mv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.994218 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-bundle\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.994369 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:37 crc kubenswrapper[4971]: I0320 07:05:37.994739 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:38 crc kubenswrapper[4971]: I0320 07:05:38.019574 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb5mv\" (UniqueName: \"kubernetes.io/projected/cd481b29-3182-4bea-bc4d-60f01f877aa3-kube-api-access-fb5mv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:38 crc kubenswrapper[4971]: I0320 07:05:38.207970 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:38 crc kubenswrapper[4971]: I0320 07:05:38.683894 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6"] Mar 20 07:05:38 crc kubenswrapper[4971]: W0320 07:05:38.690136 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd481b29_3182_4bea_bc4d_60f01f877aa3.slice/crio-249714855c8df026ae072104a2da7a2c72ea31b00827d98cd3d9c225815186f1 WatchSource:0}: Error finding container 249714855c8df026ae072104a2da7a2c72ea31b00827d98cd3d9c225815186f1: Status 404 returned error can't find the container with id 249714855c8df026ae072104a2da7a2c72ea31b00827d98cd3d9c225815186f1 Mar 20 07:05:38 crc kubenswrapper[4971]: I0320 07:05:38.841583 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" event={"ID":"cd481b29-3182-4bea-bc4d-60f01f877aa3","Type":"ContainerStarted","Data":"249714855c8df026ae072104a2da7a2c72ea31b00827d98cd3d9c225815186f1"} Mar 20 07:05:39 crc kubenswrapper[4971]: I0320 07:05:39.664228 4971 scope.go:117] "RemoveContainer" containerID="b387ac17402e80963231386ddb3272ea2e435161b374b91196b8fa406f8868ec" Mar 20 07:05:39 crc kubenswrapper[4971]: I0320 07:05:39.853487 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lwlhn_f11eaf57-f83a-4974-adb5-7a59b11555b0/kube-multus/2.log" Mar 20 07:05:39 crc kubenswrapper[4971]: I0320 07:05:39.856751 4971 generic.go:334] "Generic (PLEG): container finished" podID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerID="cddab7b225196f2feeccf5d29e98b22c00df9daefa7ce095899de0ed2f266a98" exitCode=0 Mar 20 07:05:39 crc kubenswrapper[4971]: I0320 07:05:39.856803 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" event={"ID":"cd481b29-3182-4bea-bc4d-60f01f877aa3","Type":"ContainerDied","Data":"cddab7b225196f2feeccf5d29e98b22c00df9daefa7ce095899de0ed2f266a98"} Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.015042 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-flhqw"] Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.017779 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.040722 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-flhqw"] Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.048211 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974sk\" (UniqueName: \"kubernetes.io/projected/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-kube-api-access-974sk\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.048360 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-catalog-content\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.048457 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-utilities\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" 
Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.149309 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-utilities\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.149366 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974sk\" (UniqueName: \"kubernetes.io/projected/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-kube-api-access-974sk\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.149406 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-catalog-content\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.149881 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-utilities\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.149925 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-catalog-content\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.176550 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974sk\" (UniqueName: \"kubernetes.io/projected/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-kube-api-access-974sk\") pod \"redhat-operators-flhqw\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.384245 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.590950 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-flhqw"] Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.864921 4971 generic.go:334] "Generic (PLEG): container finished" podID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerID="cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837" exitCode=0 Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.864961 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flhqw" event={"ID":"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909","Type":"ContainerDied","Data":"cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837"} Mar 20 07:05:40 crc kubenswrapper[4971]: I0320 07:05:40.864985 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flhqw" event={"ID":"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909","Type":"ContainerStarted","Data":"99b28f947fc39bba7677836dbf8e9bd7b8c87141447c9870672503318c580a4c"} Mar 20 07:05:41 crc kubenswrapper[4971]: I0320 07:05:41.460969 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:41 crc kubenswrapper[4971]: I0320 07:05:41.875767 4971 generic.go:334] "Generic (PLEG): container finished" podID="cd481b29-3182-4bea-bc4d-60f01f877aa3" 
containerID="f3d762f887195ae7d5e71bdb9b9149184b9bc12072cfe56cd3b3c6af8b56ee66" exitCode=0 Mar 20 07:05:41 crc kubenswrapper[4971]: I0320 07:05:41.875840 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" event={"ID":"cd481b29-3182-4bea-bc4d-60f01f877aa3","Type":"ContainerDied","Data":"f3d762f887195ae7d5e71bdb9b9149184b9bc12072cfe56cd3b3c6af8b56ee66"} Mar 20 07:05:42 crc kubenswrapper[4971]: I0320 07:05:42.886041 4971 generic.go:334] "Generic (PLEG): container finished" podID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerID="454353eb8b82b564f79b5500c43cf8891e8fd6464908bd99d9ba42e7eba009f5" exitCode=0 Mar 20 07:05:42 crc kubenswrapper[4971]: I0320 07:05:42.886163 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" event={"ID":"cd481b29-3182-4bea-bc4d-60f01f877aa3","Type":"ContainerDied","Data":"454353eb8b82b564f79b5500c43cf8891e8fd6464908bd99d9ba42e7eba009f5"} Mar 20 07:05:42 crc kubenswrapper[4971]: I0320 07:05:42.888906 4971 generic.go:334] "Generic (PLEG): container finished" podID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerID="a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18" exitCode=0 Mar 20 07:05:42 crc kubenswrapper[4971]: I0320 07:05:42.888963 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flhqw" event={"ID":"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909","Type":"ContainerDied","Data":"a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18"} Mar 20 07:05:43 crc kubenswrapper[4971]: I0320 07:05:43.900034 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flhqw" event={"ID":"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909","Type":"ContainerStarted","Data":"23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f"} Mar 20 07:05:43 crc kubenswrapper[4971]: 
I0320 07:05:43.922154 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-flhqw" podStartSLOduration=2.544119294 podStartE2EDuration="4.922138737s" podCreationTimestamp="2026-03-20 07:05:39 +0000 UTC" firstStartedPulling="2026-03-20 07:05:40.887485169 +0000 UTC m=+962.867359307" lastFinishedPulling="2026-03-20 07:05:43.265504612 +0000 UTC m=+965.245378750" observedRunningTime="2026-03-20 07:05:43.921013188 +0000 UTC m=+965.900887336" watchObservedRunningTime="2026-03-20 07:05:43.922138737 +0000 UTC m=+965.902012885" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.190568 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.204074 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb5mv\" (UniqueName: \"kubernetes.io/projected/cd481b29-3182-4bea-bc4d-60f01f877aa3-kube-api-access-fb5mv\") pod \"cd481b29-3182-4bea-bc4d-60f01f877aa3\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.211490 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd481b29-3182-4bea-bc4d-60f01f877aa3-kube-api-access-fb5mv" (OuterVolumeSpecName: "kube-api-access-fb5mv") pod "cd481b29-3182-4bea-bc4d-60f01f877aa3" (UID: "cd481b29-3182-4bea-bc4d-60f01f877aa3"). InnerVolumeSpecName "kube-api-access-fb5mv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.304886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-util\") pod \"cd481b29-3182-4bea-bc4d-60f01f877aa3\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.304973 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-bundle\") pod \"cd481b29-3182-4bea-bc4d-60f01f877aa3\" (UID: \"cd481b29-3182-4bea-bc4d-60f01f877aa3\") " Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.305696 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb5mv\" (UniqueName: \"kubernetes.io/projected/cd481b29-3182-4bea-bc4d-60f01f877aa3-kube-api-access-fb5mv\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.309785 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-bundle" (OuterVolumeSpecName: "bundle") pod "cd481b29-3182-4bea-bc4d-60f01f877aa3" (UID: "cd481b29-3182-4bea-bc4d-60f01f877aa3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.406857 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.493971 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-util" (OuterVolumeSpecName: "util") pod "cd481b29-3182-4bea-bc4d-60f01f877aa3" (UID: "cd481b29-3182-4bea-bc4d-60f01f877aa3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.507861 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd481b29-3182-4bea-bc4d-60f01f877aa3-util\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.599368 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c7m56"] Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.599644 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c7m56" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerName="registry-server" containerID="cri-o://d3aab746675f9b56b86297dc8e49aa6f2e9b9e31a865416db5c8144b58f971a1" gracePeriod=2 Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.911323 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.913587 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6" event={"ID":"cd481b29-3182-4bea-bc4d-60f01f877aa3","Type":"ContainerDied","Data":"249714855c8df026ae072104a2da7a2c72ea31b00827d98cd3d9c225815186f1"} Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.913662 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="249714855c8df026ae072104a2da7a2c72ea31b00827d98cd3d9c225815186f1" Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.915414 4971 generic.go:334] "Generic (PLEG): container finished" podID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerID="d3aab746675f9b56b86297dc8e49aa6f2e9b9e31a865416db5c8144b58f971a1" exitCode=0 Mar 20 07:05:44 crc kubenswrapper[4971]: I0320 07:05:44.915533 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7m56" event={"ID":"75cfa995-1433-4fa6-a3dd-a8346a19e0fb","Type":"ContainerDied","Data":"d3aab746675f9b56b86297dc8e49aa6f2e9b9e31a865416db5c8144b58f971a1"} Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.056508 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.215804 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-utilities\") pod \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.215959 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkhjm\" (UniqueName: \"kubernetes.io/projected/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-kube-api-access-rkhjm\") pod \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.215985 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-catalog-content\") pod \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\" (UID: \"75cfa995-1433-4fa6-a3dd-a8346a19e0fb\") " Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.216476 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-utilities" (OuterVolumeSpecName: "utilities") pod "75cfa995-1433-4fa6-a3dd-a8346a19e0fb" (UID: "75cfa995-1433-4fa6-a3dd-a8346a19e0fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.222721 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-kube-api-access-rkhjm" (OuterVolumeSpecName: "kube-api-access-rkhjm") pod "75cfa995-1433-4fa6-a3dd-a8346a19e0fb" (UID: "75cfa995-1433-4fa6-a3dd-a8346a19e0fb"). InnerVolumeSpecName "kube-api-access-rkhjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.272835 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75cfa995-1433-4fa6-a3dd-a8346a19e0fb" (UID: "75cfa995-1433-4fa6-a3dd-a8346a19e0fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.317523 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkhjm\" (UniqueName: \"kubernetes.io/projected/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-kube-api-access-rkhjm\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.317568 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.317583 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cfa995-1433-4fa6-a3dd-a8346a19e0fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.922730 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c7m56" event={"ID":"75cfa995-1433-4fa6-a3dd-a8346a19e0fb","Type":"ContainerDied","Data":"6d6161a526514ba453567d7db99b0567df8a24681717fa3e768b62ab94254059"} Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.922789 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c7m56" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.923975 4971 scope.go:117] "RemoveContainer" containerID="d3aab746675f9b56b86297dc8e49aa6f2e9b9e31a865416db5c8144b58f971a1" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.942882 4971 scope.go:117] "RemoveContainer" containerID="b57f3c872398945f93678f8f34d38196caefd9ac541620805336adcf893268d4" Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.951467 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c7m56"] Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.958129 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c7m56"] Mar 20 07:05:45 crc kubenswrapper[4971]: I0320 07:05:45.971661 4971 scope.go:117] "RemoveContainer" containerID="508a59d9682f5b44678dfc8da426063b34b583b8b8f17c15988232e4e9fbc0e3" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.546930 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-j886c"] Mar 20 07:05:46 crc kubenswrapper[4971]: E0320 07:05:46.547173 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerName="pull" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547191 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerName="pull" Mar 20 07:05:46 crc kubenswrapper[4971]: E0320 07:05:46.547209 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerName="util" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547218 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerName="util" Mar 20 07:05:46 crc kubenswrapper[4971]: E0320 07:05:46.547233 4971 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerName="extract-content" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547242 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerName="extract-content" Mar 20 07:05:46 crc kubenswrapper[4971]: E0320 07:05:46.547258 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerName="registry-server" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547266 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerName="registry-server" Mar 20 07:05:46 crc kubenswrapper[4971]: E0320 07:05:46.547280 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerName="extract" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547287 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerName="extract" Mar 20 07:05:46 crc kubenswrapper[4971]: E0320 07:05:46.547302 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerName="extract-utilities" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547310 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerName="extract-utilities" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547419 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" containerName="registry-server" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547440 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd481b29-3182-4bea-bc4d-60f01f877aa3" containerName="extract" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.547890 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-j886c" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.550003 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-b94n6" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.550164 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.552934 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.567370 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-j886c"] Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.732684 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxfnq\" (UniqueName: \"kubernetes.io/projected/2659746d-8c1c-475b-ad3b-f57197f14895-kube-api-access-sxfnq\") pod \"nmstate-operator-796d4cfff4-j886c\" (UID: \"2659746d-8c1c-475b-ad3b-f57197f14895\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-j886c" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.740301 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75cfa995-1433-4fa6-a3dd-a8346a19e0fb" path="/var/lib/kubelet/pods/75cfa995-1433-4fa6-a3dd-a8346a19e0fb/volumes" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.834120 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxfnq\" (UniqueName: \"kubernetes.io/projected/2659746d-8c1c-475b-ad3b-f57197f14895-kube-api-access-sxfnq\") pod \"nmstate-operator-796d4cfff4-j886c\" (UID: \"2659746d-8c1c-475b-ad3b-f57197f14895\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-j886c" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.855137 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxfnq\" (UniqueName: \"kubernetes.io/projected/2659746d-8c1c-475b-ad3b-f57197f14895-kube-api-access-sxfnq\") pod \"nmstate-operator-796d4cfff4-j886c\" (UID: \"2659746d-8c1c-475b-ad3b-f57197f14895\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-j886c" Mar 20 07:05:46 crc kubenswrapper[4971]: I0320 07:05:46.864499 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-j886c" Mar 20 07:05:47 crc kubenswrapper[4971]: I0320 07:05:47.097132 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-j886c"] Mar 20 07:05:47 crc kubenswrapper[4971]: I0320 07:05:47.950594 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-j886c" event={"ID":"2659746d-8c1c-475b-ad3b-f57197f14895","Type":"ContainerStarted","Data":"5f38075bff7d263a4d26582a9de5f0dd6447d33dfae5d482fc1135137d8e6663"} Mar 20 07:05:50 crc kubenswrapper[4971]: I0320 07:05:50.161915 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:05:50 crc kubenswrapper[4971]: I0320 07:05:50.162270 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:05:50 crc kubenswrapper[4971]: I0320 07:05:50.384775 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:50 
crc kubenswrapper[4971]: I0320 07:05:50.384843 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:05:50 crc kubenswrapper[4971]: I0320 07:05:50.969920 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-j886c" event={"ID":"2659746d-8c1c-475b-ad3b-f57197f14895","Type":"ContainerStarted","Data":"8af2c2cd8cb9c9db62e59519f35595ee89074a5fe041b4a2ec474659b03231c7"} Mar 20 07:05:50 crc kubenswrapper[4971]: I0320 07:05:50.994049 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-j886c" podStartSLOduration=1.829357492 podStartE2EDuration="4.994016885s" podCreationTimestamp="2026-03-20 07:05:46 +0000 UTC" firstStartedPulling="2026-03-20 07:05:47.105625844 +0000 UTC m=+969.085500012" lastFinishedPulling="2026-03-20 07:05:50.270285267 +0000 UTC m=+972.250159405" observedRunningTime="2026-03-20 07:05:50.987777105 +0000 UTC m=+972.967651283" watchObservedRunningTime="2026-03-20 07:05:50.994016885 +0000 UTC m=+972.973891053" Mar 20 07:05:51 crc kubenswrapper[4971]: I0320 07:05:51.447321 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-flhqw" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="registry-server" probeResult="failure" output=< Mar 20 07:05:51 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 07:05:51 crc kubenswrapper[4971]: > Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.023025 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.024160 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.028020 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-7gwd7" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.028884 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.029631 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.033626 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.038949 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.052863 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-smvq8"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.053641 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.081705 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.163532 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.164329 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.165876 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.166966 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.167280 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gcznw" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.176132 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.206547 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-nmstate-lock\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.206583 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rzh\" (UniqueName: \"kubernetes.io/projected/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-kube-api-access-p6rzh\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.206625 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/81500b95-85cb-41cc-bee8-d3d56d47ff4d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rtfrt\" (UID: \"81500b95-85cb-41cc-bee8-d3d56d47ff4d\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.206785 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-dbus-socket\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.206851 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92rd\" (UniqueName: \"kubernetes.io/projected/81500b95-85cb-41cc-bee8-d3d56d47ff4d-kube-api-access-q92rd\") pod \"nmstate-webhook-5f558f5558-rtfrt\" (UID: \"81500b95-85cb-41cc-bee8-d3d56d47ff4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.206882 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-ovs-socket\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.206904 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld8x8\" (UniqueName: \"kubernetes.io/projected/ae395f28-6667-46bf-81c3-af54d7c1c743-kube-api-access-ld8x8\") pod \"nmstate-metrics-9b8c8685d-c8zms\" (UID: \"ae395f28-6667-46bf-81c3-af54d7c1c743\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308157 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-dbus-socket\") pod 
\"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308216 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q92rd\" (UniqueName: \"kubernetes.io/projected/81500b95-85cb-41cc-bee8-d3d56d47ff4d-kube-api-access-q92rd\") pod \"nmstate-webhook-5f558f5558-rtfrt\" (UID: \"81500b95-85cb-41cc-bee8-d3d56d47ff4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308265 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-ovs-socket\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308284 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld8x8\" (UniqueName: \"kubernetes.io/projected/ae395f28-6667-46bf-81c3-af54d7c1c743-kube-api-access-ld8x8\") pod \"nmstate-metrics-9b8c8685d-c8zms\" (UID: \"ae395f28-6667-46bf-81c3-af54d7c1c743\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308309 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308335 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308362 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-nmstate-lock\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308379 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rzh\" (UniqueName: \"kubernetes.io/projected/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-kube-api-access-p6rzh\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308390 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-ovs-socket\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308397 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ls7g\" (UniqueName: \"kubernetes.io/projected/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-kube-api-access-2ls7g\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308583 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" 
(UniqueName: \"kubernetes.io/secret/81500b95-85cb-41cc-bee8-d3d56d47ff4d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rtfrt\" (UID: \"81500b95-85cb-41cc-bee8-d3d56d47ff4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308631 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-nmstate-lock\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.308597 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-dbus-socket\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.315399 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/81500b95-85cb-41cc-bee8-d3d56d47ff4d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rtfrt\" (UID: \"81500b95-85cb-41cc-bee8-d3d56d47ff4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.325181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92rd\" (UniqueName: \"kubernetes.io/projected/81500b95-85cb-41cc-bee8-d3d56d47ff4d-kube-api-access-q92rd\") pod \"nmstate-webhook-5f558f5558-rtfrt\" (UID: \"81500b95-85cb-41cc-bee8-d3d56d47ff4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.328062 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rzh\" (UniqueName: 
\"kubernetes.io/projected/b8bca7dc-fb19-45b9-8bad-b2ee90232d46-kube-api-access-p6rzh\") pod \"nmstate-handler-smvq8\" (UID: \"b8bca7dc-fb19-45b9-8bad-b2ee90232d46\") " pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.330727 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld8x8\" (UniqueName: \"kubernetes.io/projected/ae395f28-6667-46bf-81c3-af54d7c1c743-kube-api-access-ld8x8\") pod \"nmstate-metrics-9b8c8685d-c8zms\" (UID: \"ae395f28-6667-46bf-81c3-af54d7c1c743\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.359087 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.367030 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d67db987b-kdls5"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.367690 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.382297 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.393216 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.409468 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.409861 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.409902 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ls7g\" (UniqueName: \"kubernetes.io/projected/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-kube-api-access-2ls7g\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.409870 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d67db987b-kdls5"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.411288 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.415123 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.427145 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ls7g\" (UniqueName: \"kubernetes.io/projected/0fc8c692-e8ea-4d20-91a2-41c340d9cb66-kube-api-access-2ls7g\") pod \"nmstate-console-plugin-86f58fcf4-zl9t9\" (UID: \"0fc8c692-e8ea-4d20-91a2-41c340d9cb66\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: W0320 07:05:52.455346 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8bca7dc_fb19_45b9_8bad_b2ee90232d46.slice/crio-17a3cc5625ae8a20ed00c42a68deaf08bf73000288fd3d894b64ecf047da418b WatchSource:0}: Error finding container 17a3cc5625ae8a20ed00c42a68deaf08bf73000288fd3d894b64ecf047da418b: Status 404 returned error can't find the container with id 17a3cc5625ae8a20ed00c42a68deaf08bf73000288fd3d894b64ecf047da418b Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.491423 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.512083 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-oauth-config\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.512129 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-service-ca\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.512148 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-oauth-serving-cert\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.512184 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-465rb\" (UniqueName: \"kubernetes.io/projected/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-kube-api-access-465rb\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.512213 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-serving-cert\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.512227 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-config\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.512243 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-trusted-ca-bundle\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.613294 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-465rb\" (UniqueName: \"kubernetes.io/projected/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-kube-api-access-465rb\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.613686 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-serving-cert\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.613710 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-config\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.613780 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-trusted-ca-bundle\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.614776 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-config\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.614823 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-trusted-ca-bundle\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.617580 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-serving-cert\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.617952 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-oauth-config\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.618444 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-service-ca\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.618482 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-oauth-serving-cert\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.619166 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-service-ca\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.620005 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-oauth-serving-cert\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.621339 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-console-oauth-config\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.629739 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-465rb\" (UniqueName: \"kubernetes.io/projected/7f8eae5f-60d8-4190-a798-b0a53b2a2be1-kube-api-access-465rb\") pod \"console-6d67db987b-kdls5\" (UID: \"7f8eae5f-60d8-4190-a798-b0a53b2a2be1\") " pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.740986 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9"] Mar 20 07:05:52 crc kubenswrapper[4971]: W0320 07:05:52.742954 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fc8c692_e8ea_4d20_91a2_41c340d9cb66.slice/crio-02620a343104095dadb8dc3da6dcc4467aee53a4d2d3312f58d85bc22db9f276 WatchSource:0}: Error finding container 02620a343104095dadb8dc3da6dcc4467aee53a4d2d3312f58d85bc22db9f276: Status 404 returned error can't find the container with id 02620a343104095dadb8dc3da6dcc4467aee53a4d2d3312f58d85bc22db9f276 Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.752115 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.848615 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms"] Mar 20 07:05:52 crc kubenswrapper[4971]: W0320 07:05:52.855702 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae395f28_6667_46bf_81c3_af54d7c1c743.slice/crio-3b2a8c86b380f6617b434b66ea3415693da4798b85c0662ed77c51d2fada9d22 WatchSource:0}: Error finding container 3b2a8c86b380f6617b434b66ea3415693da4798b85c0662ed77c51d2fada9d22: Status 404 returned error can't find the container with id 3b2a8c86b380f6617b434b66ea3415693da4798b85c0662ed77c51d2fada9d22 Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.910337 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt"] Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.913730 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d67db987b-kdls5"] Mar 20 07:05:52 crc kubenswrapper[4971]: W0320 07:05:52.914952 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81500b95_85cb_41cc_bee8_d3d56d47ff4d.slice/crio-a24619479781f59671f1ad08f13281bc499f8d3d5f96568b5fd2030b69b9ecec WatchSource:0}: Error finding container a24619479781f59671f1ad08f13281bc499f8d3d5f96568b5fd2030b69b9ecec: Status 404 returned error can't find the container with id a24619479781f59671f1ad08f13281bc499f8d3d5f96568b5fd2030b69b9ecec Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.983905 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" event={"ID":"81500b95-85cb-41cc-bee8-d3d56d47ff4d","Type":"ContainerStarted","Data":"a24619479781f59671f1ad08f13281bc499f8d3d5f96568b5fd2030b69b9ecec"} 
Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.985152 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-smvq8" event={"ID":"b8bca7dc-fb19-45b9-8bad-b2ee90232d46","Type":"ContainerStarted","Data":"17a3cc5625ae8a20ed00c42a68deaf08bf73000288fd3d894b64ecf047da418b"} Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.987036 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" event={"ID":"ae395f28-6667-46bf-81c3-af54d7c1c743","Type":"ContainerStarted","Data":"3b2a8c86b380f6617b434b66ea3415693da4798b85c0662ed77c51d2fada9d22"} Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.988299 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d67db987b-kdls5" event={"ID":"7f8eae5f-60d8-4190-a798-b0a53b2a2be1","Type":"ContainerStarted","Data":"89e1fa6551e88aa43593d7cf79c2dfe8eaa549681e0acb86d1cf4324bdc0dc44"} Mar 20 07:05:52 crc kubenswrapper[4971]: I0320 07:05:52.989492 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" event={"ID":"0fc8c692-e8ea-4d20-91a2-41c340d9cb66","Type":"ContainerStarted","Data":"02620a343104095dadb8dc3da6dcc4467aee53a4d2d3312f58d85bc22db9f276"} Mar 20 07:05:53 crc kubenswrapper[4971]: I0320 07:05:53.997743 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d67db987b-kdls5" event={"ID":"7f8eae5f-60d8-4190-a798-b0a53b2a2be1","Type":"ContainerStarted","Data":"2fd01c66c0ce88f3350d8cd51d8fddfe39d8fee70338d33060253a58d5c0b7f5"} Mar 20 07:05:54 crc kubenswrapper[4971]: I0320 07:05:54.038056 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d67db987b-kdls5" podStartSLOduration=2.038036563 podStartE2EDuration="2.038036563s" podCreationTimestamp="2026-03-20 07:05:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:05:54.03792058 +0000 UTC m=+976.017794728" watchObservedRunningTime="2026-03-20 07:05:54.038036563 +0000 UTC m=+976.017910711" Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.014906 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" event={"ID":"81500b95-85cb-41cc-bee8-d3d56d47ff4d","Type":"ContainerStarted","Data":"638c26f9b958b48d64c29e949e4415ab39783feef5e0614a01179942399e76fa"} Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.015711 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.016591 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-smvq8" event={"ID":"b8bca7dc-fb19-45b9-8bad-b2ee90232d46","Type":"ContainerStarted","Data":"09c23c578760472d824eedf4a124bef5f6ec47443bd49990a5ddafe63ad9b2bf"} Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.016750 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.018719 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" event={"ID":"ae395f28-6667-46bf-81c3-af54d7c1c743","Type":"ContainerStarted","Data":"89abafcdccf2429284560f32ff9970b0d7a767aeaee37466fe90fa25b09e7e13"} Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.021215 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" event={"ID":"0fc8c692-e8ea-4d20-91a2-41c340d9cb66","Type":"ContainerStarted","Data":"3a8afe107c912dd176eeb228ba57ed9f40308fef05e7754842ac3f1246e9aa39"} Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.039752 4971 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" podStartSLOduration=1.444450775 podStartE2EDuration="4.039725543s" podCreationTimestamp="2026-03-20 07:05:52 +0000 UTC" firstStartedPulling="2026-03-20 07:05:52.916634257 +0000 UTC m=+974.896508395" lastFinishedPulling="2026-03-20 07:05:55.511908985 +0000 UTC m=+977.491783163" observedRunningTime="2026-03-20 07:05:56.035136556 +0000 UTC m=+978.015010714" watchObservedRunningTime="2026-03-20 07:05:56.039725543 +0000 UTC m=+978.019599721" Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.056189 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-smvq8" podStartSLOduration=0.972590553 podStartE2EDuration="4.056161545s" podCreationTimestamp="2026-03-20 07:05:52 +0000 UTC" firstStartedPulling="2026-03-20 07:05:52.458116287 +0000 UTC m=+974.437990425" lastFinishedPulling="2026-03-20 07:05:55.541687239 +0000 UTC m=+977.521561417" observedRunningTime="2026-03-20 07:05:56.052248905 +0000 UTC m=+978.032123053" watchObservedRunningTime="2026-03-20 07:05:56.056161545 +0000 UTC m=+978.036035723" Mar 20 07:05:56 crc kubenswrapper[4971]: I0320 07:05:56.081626 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-zl9t9" podStartSLOduration=1.333090467 podStartE2EDuration="4.081579608s" podCreationTimestamp="2026-03-20 07:05:52 +0000 UTC" firstStartedPulling="2026-03-20 07:05:52.744761425 +0000 UTC m=+974.724635563" lastFinishedPulling="2026-03-20 07:05:55.493250526 +0000 UTC m=+977.473124704" observedRunningTime="2026-03-20 07:05:56.074033564 +0000 UTC m=+978.053907782" watchObservedRunningTime="2026-03-20 07:05:56.081579608 +0000 UTC m=+978.061453756" Mar 20 07:05:59 crc kubenswrapper[4971]: I0320 07:05:59.039344 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" 
event={"ID":"ae395f28-6667-46bf-81c3-af54d7c1c743","Type":"ContainerStarted","Data":"6cf05c8df169336c82a7ed29fbae5dfe5ff31d5e30b0d0ca5bee0666f7a95f07"} Mar 20 07:05:59 crc kubenswrapper[4971]: I0320 07:05:59.058317 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8zms" podStartSLOduration=1.940655791 podStartE2EDuration="7.058298687s" podCreationTimestamp="2026-03-20 07:05:52 +0000 UTC" firstStartedPulling="2026-03-20 07:05:52.860334101 +0000 UTC m=+974.840208239" lastFinishedPulling="2026-03-20 07:05:57.977976997 +0000 UTC m=+979.957851135" observedRunningTime="2026-03-20 07:05:59.055894825 +0000 UTC m=+981.035768963" watchObservedRunningTime="2026-03-20 07:05:59.058298687 +0000 UTC m=+981.038172845" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.145746 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566506-wm6kx"] Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.147249 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.151289 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.151473 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.152313 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.155726 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566506-wm6kx"] Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.229806 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh266\" (UniqueName: \"kubernetes.io/projected/794b92ff-cff7-4246-8b6f-fbda2dd717c0-kube-api-access-dh266\") pod \"auto-csr-approver-29566506-wm6kx\" (UID: \"794b92ff-cff7-4246-8b6f-fbda2dd717c0\") " pod="openshift-infra/auto-csr-approver-29566506-wm6kx" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.330924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh266\" (UniqueName: \"kubernetes.io/projected/794b92ff-cff7-4246-8b6f-fbda2dd717c0-kube-api-access-dh266\") pod \"auto-csr-approver-29566506-wm6kx\" (UID: \"794b92ff-cff7-4246-8b6f-fbda2dd717c0\") " pod="openshift-infra/auto-csr-approver-29566506-wm6kx" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.354375 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh266\" (UniqueName: \"kubernetes.io/projected/794b92ff-cff7-4246-8b6f-fbda2dd717c0-kube-api-access-dh266\") pod \"auto-csr-approver-29566506-wm6kx\" (UID: \"794b92ff-cff7-4246-8b6f-fbda2dd717c0\") " 
pod="openshift-infra/auto-csr-approver-29566506-wm6kx" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.437774 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.478905 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.512902 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.671164 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-flhqw"] Mar 20 07:06:00 crc kubenswrapper[4971]: I0320 07:06:00.714357 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566506-wm6kx"] Mar 20 07:06:00 crc kubenswrapper[4971]: W0320 07:06:00.722432 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod794b92ff_cff7_4246_8b6f_fbda2dd717c0.slice/crio-55c587882f15edf636d66fb5380f940b8df69a59f6732791e1c21c1f20776d3b WatchSource:0}: Error finding container 55c587882f15edf636d66fb5380f940b8df69a59f6732791e1c21c1f20776d3b: Status 404 returned error can't find the container with id 55c587882f15edf636d66fb5380f940b8df69a59f6732791e1c21c1f20776d3b Mar 20 07:06:01 crc kubenswrapper[4971]: I0320 07:06:01.053634 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" event={"ID":"794b92ff-cff7-4246-8b6f-fbda2dd717c0","Type":"ContainerStarted","Data":"55c587882f15edf636d66fb5380f940b8df69a59f6732791e1c21c1f20776d3b"} Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.066850 4971 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-flhqw" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="registry-server" containerID="cri-o://23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f" gracePeriod=2 Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.067897 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" event={"ID":"794b92ff-cff7-4246-8b6f-fbda2dd717c0","Type":"ContainerStarted","Data":"924c1f9323defcb56cdff8e32cbb9cfd2255fb2410476e1891409701b4e0869e"} Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.093771 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" podStartSLOduration=1.223214348 podStartE2EDuration="2.093750514s" podCreationTimestamp="2026-03-20 07:06:00 +0000 UTC" firstStartedPulling="2026-03-20 07:06:00.726866018 +0000 UTC m=+982.706740156" lastFinishedPulling="2026-03-20 07:06:01.597402174 +0000 UTC m=+983.577276322" observedRunningTime="2026-03-20 07:06:02.089089014 +0000 UTC m=+984.068963192" watchObservedRunningTime="2026-03-20 07:06:02.093750514 +0000 UTC m=+984.073624652" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.414703 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-smvq8" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.454379 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.555701 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-catalog-content\") pod \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.555867 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974sk\" (UniqueName: \"kubernetes.io/projected/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-kube-api-access-974sk\") pod \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.555955 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-utilities\") pod \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\" (UID: \"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909\") " Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.556708 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-utilities" (OuterVolumeSpecName: "utilities") pod "4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" (UID: "4c8d0bd3-0e1a-4cd7-884d-edb3f5992909"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.557257 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.562257 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-kube-api-access-974sk" (OuterVolumeSpecName: "kube-api-access-974sk") pod "4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" (UID: "4c8d0bd3-0e1a-4cd7-884d-edb3f5992909"). InnerVolumeSpecName "kube-api-access-974sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.658826 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-974sk\" (UniqueName: \"kubernetes.io/projected/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-kube-api-access-974sk\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.708704 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" (UID: "4c8d0bd3-0e1a-4cd7-884d-edb3f5992909"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.753216 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.753279 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.759498 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:02 crc kubenswrapper[4971]: I0320 07:06:02.761072 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.073806 4971 generic.go:334] "Generic (PLEG): container finished" podID="794b92ff-cff7-4246-8b6f-fbda2dd717c0" containerID="924c1f9323defcb56cdff8e32cbb9cfd2255fb2410476e1891409701b4e0869e" exitCode=0 Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.073890 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" event={"ID":"794b92ff-cff7-4246-8b6f-fbda2dd717c0","Type":"ContainerDied","Data":"924c1f9323defcb56cdff8e32cbb9cfd2255fb2410476e1891409701b4e0869e"} Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.076673 4971 generic.go:334] "Generic (PLEG): container finished" podID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerID="23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f" exitCode=0 Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.076745 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-flhqw" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.076764 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flhqw" event={"ID":"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909","Type":"ContainerDied","Data":"23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f"} Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.076812 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flhqw" event={"ID":"4c8d0bd3-0e1a-4cd7-884d-edb3f5992909","Type":"ContainerDied","Data":"99b28f947fc39bba7677836dbf8e9bd7b8c87141447c9870672503318c580a4c"} Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.076841 4971 scope.go:117] "RemoveContainer" containerID="23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.085155 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d67db987b-kdls5" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.101945 4971 scope.go:117] "RemoveContainer" containerID="a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.112578 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-flhqw"] Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.117835 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-flhqw"] Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.119387 4971 scope.go:117] "RemoveContainer" containerID="cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.165505 4971 scope.go:117] "RemoveContainer" containerID="23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f" Mar 20 07:06:03 crc kubenswrapper[4971]: 
E0320 07:06:03.177079 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f\": container with ID starting with 23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f not found: ID does not exist" containerID="23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.177173 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f"} err="failed to get container status \"23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f\": rpc error: code = NotFound desc = could not find container \"23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f\": container with ID starting with 23a0a846983d5ecf1b4814b682cdbdd0bd2a631ab99b8ed023a4ac521abfb09f not found: ID does not exist" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.177269 4971 scope.go:117] "RemoveContainer" containerID="a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18" Mar 20 07:06:03 crc kubenswrapper[4971]: E0320 07:06:03.177638 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18\": container with ID starting with a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18 not found: ID does not exist" containerID="a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.177662 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18"} err="failed to get container status \"a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18\": 
rpc error: code = NotFound desc = could not find container \"a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18\": container with ID starting with a1babeba443674806ece35fb99324dae405a2a88c0d50b287bccc07c8fefba18 not found: ID does not exist" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.177676 4971 scope.go:117] "RemoveContainer" containerID="cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837" Mar 20 07:06:03 crc kubenswrapper[4971]: E0320 07:06:03.178248 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837\": container with ID starting with cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837 not found: ID does not exist" containerID="cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.178292 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837"} err="failed to get container status \"cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837\": rpc error: code = NotFound desc = could not find container \"cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837\": container with ID starting with cbaf1cde93f08414461efca864faab8bbfff318dfea9dc0a47afb3adc8d16837 not found: ID does not exist" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.179360 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4l9qf"] Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.481266 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w46rg"] Mar 20 07:06:03 crc kubenswrapper[4971]: E0320 07:06:03.481640 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="extract-content" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.481665 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="extract-content" Mar 20 07:06:03 crc kubenswrapper[4971]: E0320 07:06:03.481688 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="extract-utilities" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.481701 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="extract-utilities" Mar 20 07:06:03 crc kubenswrapper[4971]: E0320 07:06:03.481724 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="registry-server" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.481738 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="registry-server" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.481929 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" containerName="registry-server" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.483528 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.493800 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w46rg"] Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.570262 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc7w4\" (UniqueName: \"kubernetes.io/projected/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-kube-api-access-gc7w4\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.570347 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-catalog-content\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.570389 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-utilities\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.671237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc7w4\" (UniqueName: \"kubernetes.io/projected/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-kube-api-access-gc7w4\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.671352 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-catalog-content\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.671419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-utilities\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.671829 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-catalog-content\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.672007 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-utilities\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.697784 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc7w4\" (UniqueName: \"kubernetes.io/projected/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-kube-api-access-gc7w4\") pod \"community-operators-w46rg\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:03 crc kubenswrapper[4971]: I0320 07:06:03.811814 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:04 crc kubenswrapper[4971]: I0320 07:06:04.072277 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w46rg"] Mar 20 07:06:04 crc kubenswrapper[4971]: I0320 07:06:04.359416 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" Mar 20 07:06:04 crc kubenswrapper[4971]: I0320 07:06:04.481141 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh266\" (UniqueName: \"kubernetes.io/projected/794b92ff-cff7-4246-8b6f-fbda2dd717c0-kube-api-access-dh266\") pod \"794b92ff-cff7-4246-8b6f-fbda2dd717c0\" (UID: \"794b92ff-cff7-4246-8b6f-fbda2dd717c0\") " Mar 20 07:06:04 crc kubenswrapper[4971]: I0320 07:06:04.491167 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794b92ff-cff7-4246-8b6f-fbda2dd717c0-kube-api-access-dh266" (OuterVolumeSpecName: "kube-api-access-dh266") pod "794b92ff-cff7-4246-8b6f-fbda2dd717c0" (UID: "794b92ff-cff7-4246-8b6f-fbda2dd717c0"). InnerVolumeSpecName "kube-api-access-dh266". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:06:04 crc kubenswrapper[4971]: I0320 07:06:04.582737 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh266\" (UniqueName: \"kubernetes.io/projected/794b92ff-cff7-4246-8b6f-fbda2dd717c0-kube-api-access-dh266\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:04 crc kubenswrapper[4971]: I0320 07:06:04.773110 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c8d0bd3-0e1a-4cd7-884d-edb3f5992909" path="/var/lib/kubelet/pods/4c8d0bd3-0e1a-4cd7-884d-edb3f5992909/volumes" Mar 20 07:06:05 crc kubenswrapper[4971]: I0320 07:06:05.094863 4971 generic.go:334] "Generic (PLEG): container finished" podID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerID="52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8" exitCode=0 Mar 20 07:06:05 crc kubenswrapper[4971]: I0320 07:06:05.094912 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w46rg" event={"ID":"5c7e0a32-dff9-48c9-950d-09acdcd0e07f","Type":"ContainerDied","Data":"52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8"} Mar 20 07:06:05 crc kubenswrapper[4971]: I0320 07:06:05.094965 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w46rg" event={"ID":"5c7e0a32-dff9-48c9-950d-09acdcd0e07f","Type":"ContainerStarted","Data":"3a10bddc5857bec5e04d015ad73c5de5dcd6c517adc60366dd8c4072ed48b1c7"} Mar 20 07:06:05 crc kubenswrapper[4971]: I0320 07:06:05.096676 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" event={"ID":"794b92ff-cff7-4246-8b6f-fbda2dd717c0","Type":"ContainerDied","Data":"55c587882f15edf636d66fb5380f940b8df69a59f6732791e1c21c1f20776d3b"} Mar 20 07:06:05 crc kubenswrapper[4971]: I0320 07:06:05.096733 4971 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="55c587882f15edf636d66fb5380f940b8df69a59f6732791e1c21c1f20776d3b" Mar 20 07:06:05 crc kubenswrapper[4971]: I0320 07:06:05.096744 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566506-wm6kx" Mar 20 07:06:05 crc kubenswrapper[4971]: I0320 07:06:05.402040 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566500-qwvlj"] Mar 20 07:06:05 crc kubenswrapper[4971]: I0320 07:06:05.406500 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566500-qwvlj"] Mar 20 07:06:06 crc kubenswrapper[4971]: I0320 07:06:06.106632 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w46rg" event={"ID":"5c7e0a32-dff9-48c9-950d-09acdcd0e07f","Type":"ContainerStarted","Data":"500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025"} Mar 20 07:06:06 crc kubenswrapper[4971]: I0320 07:06:06.739119 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f1a8924-50a9-4115-85ac-e24d81d9693e" path="/var/lib/kubelet/pods/6f1a8924-50a9-4115-85ac-e24d81d9693e/volumes" Mar 20 07:06:07 crc kubenswrapper[4971]: I0320 07:06:07.116495 4971 generic.go:334] "Generic (PLEG): container finished" podID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerID="500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025" exitCode=0 Mar 20 07:06:07 crc kubenswrapper[4971]: I0320 07:06:07.116558 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w46rg" event={"ID":"5c7e0a32-dff9-48c9-950d-09acdcd0e07f","Type":"ContainerDied","Data":"500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025"} Mar 20 07:06:08 crc kubenswrapper[4971]: I0320 07:06:08.128454 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w46rg" 
event={"ID":"5c7e0a32-dff9-48c9-950d-09acdcd0e07f","Type":"ContainerStarted","Data":"76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435"} Mar 20 07:06:08 crc kubenswrapper[4971]: I0320 07:06:08.161365 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w46rg" podStartSLOduration=2.609725617 podStartE2EDuration="5.161328216s" podCreationTimestamp="2026-03-20 07:06:03 +0000 UTC" firstStartedPulling="2026-03-20 07:06:05.09680897 +0000 UTC m=+987.076683098" lastFinishedPulling="2026-03-20 07:06:07.648411559 +0000 UTC m=+989.628285697" observedRunningTime="2026-03-20 07:06:08.155388813 +0000 UTC m=+990.135262961" watchObservedRunningTime="2026-03-20 07:06:08.161328216 +0000 UTC m=+990.141202394" Mar 20 07:06:12 crc kubenswrapper[4971]: I0320 07:06:12.389183 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rtfrt" Mar 20 07:06:13 crc kubenswrapper[4971]: I0320 07:06:13.812355 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:13 crc kubenswrapper[4971]: I0320 07:06:13.812696 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:13 crc kubenswrapper[4971]: I0320 07:06:13.870458 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:14 crc kubenswrapper[4971]: I0320 07:06:14.237594 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:14 crc kubenswrapper[4971]: I0320 07:06:14.304833 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w46rg"] Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.188068 4971 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w46rg" podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerName="registry-server" containerID="cri-o://76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435" gracePeriod=2 Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.589570 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.643827 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc7w4\" (UniqueName: \"kubernetes.io/projected/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-kube-api-access-gc7w4\") pod \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.643913 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-utilities\") pod \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.644006 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-catalog-content\") pod \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\" (UID: \"5c7e0a32-dff9-48c9-950d-09acdcd0e07f\") " Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.645431 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-utilities" (OuterVolumeSpecName: "utilities") pod "5c7e0a32-dff9-48c9-950d-09acdcd0e07f" (UID: "5c7e0a32-dff9-48c9-950d-09acdcd0e07f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.648903 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-kube-api-access-gc7w4" (OuterVolumeSpecName: "kube-api-access-gc7w4") pod "5c7e0a32-dff9-48c9-950d-09acdcd0e07f" (UID: "5c7e0a32-dff9-48c9-950d-09acdcd0e07f"). InnerVolumeSpecName "kube-api-access-gc7w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.745623 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc7w4\" (UniqueName: \"kubernetes.io/projected/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-kube-api-access-gc7w4\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:16 crc kubenswrapper[4971]: I0320 07:06:16.746036 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.194712 4971 generic.go:334] "Generic (PLEG): container finished" podID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerID="76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435" exitCode=0 Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.194768 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w46rg" event={"ID":"5c7e0a32-dff9-48c9-950d-09acdcd0e07f","Type":"ContainerDied","Data":"76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435"} Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.194803 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w46rg" event={"ID":"5c7e0a32-dff9-48c9-950d-09acdcd0e07f","Type":"ContainerDied","Data":"3a10bddc5857bec5e04d015ad73c5de5dcd6c517adc60366dd8c4072ed48b1c7"} Mar 20 07:06:17 crc kubenswrapper[4971]: 
I0320 07:06:17.194832 4971 scope.go:117] "RemoveContainer" containerID="76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.194951 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w46rg" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.217357 4971 scope.go:117] "RemoveContainer" containerID="500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.240598 4971 scope.go:117] "RemoveContainer" containerID="52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.265037 4971 scope.go:117] "RemoveContainer" containerID="76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435" Mar 20 07:06:17 crc kubenswrapper[4971]: E0320 07:06:17.265454 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435\": container with ID starting with 76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435 not found: ID does not exist" containerID="76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.265507 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435"} err="failed to get container status \"76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435\": rpc error: code = NotFound desc = could not find container \"76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435\": container with ID starting with 76e890d66ff28017dfc7730aa491212501f2da50a4bf471c3e6b059a1f902435 not found: ID does not exist" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.265537 4971 
scope.go:117] "RemoveContainer" containerID="500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025" Mar 20 07:06:17 crc kubenswrapper[4971]: E0320 07:06:17.265974 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025\": container with ID starting with 500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025 not found: ID does not exist" containerID="500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.266030 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025"} err="failed to get container status \"500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025\": rpc error: code = NotFound desc = could not find container \"500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025\": container with ID starting with 500cb764e0318ef7ea803923351d6881445a90848210e048dce29894c6a60025 not found: ID does not exist" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.266053 4971 scope.go:117] "RemoveContainer" containerID="52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8" Mar 20 07:06:17 crc kubenswrapper[4971]: E0320 07:06:17.266387 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8\": container with ID starting with 52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8 not found: ID does not exist" containerID="52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.266446 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8"} err="failed to get container status \"52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8\": rpc error: code = NotFound desc = could not find container \"52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8\": container with ID starting with 52f5612ca4e8caab56967c77416d214b06d9ccc175613a49ae00b063d098a7f8 not found: ID does not exist" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.376494 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c7e0a32-dff9-48c9-950d-09acdcd0e07f" (UID: "5c7e0a32-dff9-48c9-950d-09acdcd0e07f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.451899 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7e0a32-dff9-48c9-950d-09acdcd0e07f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.539525 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w46rg"] Mar 20 07:06:17 crc kubenswrapper[4971]: I0320 07:06:17.546105 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w46rg"] Mar 20 07:06:18 crc kubenswrapper[4971]: I0320 07:06:18.742961 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" path="/var/lib/kubelet/pods/5c7e0a32-dff9-48c9-950d-09acdcd0e07f/volumes" Mar 20 07:06:20 crc kubenswrapper[4971]: I0320 07:06:20.162134 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:06:20 crc kubenswrapper[4971]: I0320 07:06:20.162731 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.224038 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4l9qf" podUID="9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" containerName="console" containerID="cri-o://b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7" gracePeriod=15 Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.359205 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r"] Mar 20 07:06:28 crc kubenswrapper[4971]: E0320 07:06:28.359517 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerName="registry-server" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.359544 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerName="registry-server" Mar 20 07:06:28 crc kubenswrapper[4971]: E0320 07:06:28.359570 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794b92ff-cff7-4246-8b6f-fbda2dd717c0" containerName="oc" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.359582 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="794b92ff-cff7-4246-8b6f-fbda2dd717c0" containerName="oc" Mar 20 07:06:28 crc kubenswrapper[4971]: E0320 07:06:28.359600 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerName="extract-utilities" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.359691 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerName="extract-utilities" Mar 20 07:06:28 crc kubenswrapper[4971]: E0320 07:06:28.359710 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerName="extract-content" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.359722 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerName="extract-content" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.361036 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="794b92ff-cff7-4246-8b6f-fbda2dd717c0" containerName="oc" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.361320 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7e0a32-dff9-48c9-950d-09acdcd0e07f" containerName="registry-server" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.362605 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.364997 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.370469 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r"] Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.511777 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clcp7\" (UniqueName: \"kubernetes.io/projected/32deec4a-495c-4499-b249-5affc8063c20-kube-api-access-clcp7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.511860 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.511887 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: 
I0320 07:06:28.587452 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4l9qf_9f6352b6-0b3b-43d0-8d2f-681f5aec50c2/console/0.log" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.587713 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.612980 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clcp7\" (UniqueName: \"kubernetes.io/projected/32deec4a-495c-4499-b249-5affc8063c20-kube-api-access-clcp7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.613054 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.613071 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.613503 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-util\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.613927 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.637392 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clcp7\" (UniqueName: \"kubernetes.io/projected/32deec4a-495c-4499-b249-5affc8063c20-kube-api-access-clcp7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.692391 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.714100 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-service-ca\") pod \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.714174 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-trusted-ca-bundle\") pod \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.714265 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-config\") pod \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.714298 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-oauth-serving-cert\") pod \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.714577 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntg4h\" (UniqueName: \"kubernetes.io/projected/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-kube-api-access-ntg4h\") pod \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.714632 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-oauth-config\") pod \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.714657 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-serving-cert\") pod \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\" (UID: \"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2\") " Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.715387 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-service-ca" (OuterVolumeSpecName: "service-ca") pod "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" (UID: "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.715895 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" (UID: "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.716384 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" (UID: "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.716808 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-config" (OuterVolumeSpecName: "console-config") pod "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" (UID: "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.720525 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" (UID: "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.721526 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" (UID: "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.720469 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-kube-api-access-ntg4h" (OuterVolumeSpecName: "kube-api-access-ntg4h") pod "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" (UID: "9f6352b6-0b3b-43d0-8d2f-681f5aec50c2"). InnerVolumeSpecName "kube-api-access-ntg4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.817844 4971 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.817909 4971 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.817983 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntg4h\" (UniqueName: \"kubernetes.io/projected/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-kube-api-access-ntg4h\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.818038 4971 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.818072 4971 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.818134 4971 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:28 crc kubenswrapper[4971]: I0320 07:06:28.818586 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:28 crc 
kubenswrapper[4971]: I0320 07:06:28.925019 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r"] Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.275655 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4l9qf_9f6352b6-0b3b-43d0-8d2f-681f5aec50c2/console/0.log" Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.275875 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" containerID="b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7" exitCode=2 Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.275954 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4l9qf" Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.276440 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4l9qf" event={"ID":"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2","Type":"ContainerDied","Data":"b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7"} Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.276461 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4l9qf" event={"ID":"9f6352b6-0b3b-43d0-8d2f-681f5aec50c2","Type":"ContainerDied","Data":"e3c1562e1fc8e758500a9be02e76daf7fe1fb45d8b51fafbe7b87419e2d229d4"} Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.276476 4971 scope.go:117] "RemoveContainer" containerID="b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7" Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.280746 4971 generic.go:334] "Generic (PLEG): container finished" podID="32deec4a-495c-4499-b249-5affc8063c20" containerID="481c4b635170875f9ab2a24350aa2e91f644527638de3a65effe7950c852ae9a" exitCode=0 Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.280896 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" event={"ID":"32deec4a-495c-4499-b249-5affc8063c20","Type":"ContainerDied","Data":"481c4b635170875f9ab2a24350aa2e91f644527638de3a65effe7950c852ae9a"} Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.281179 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" event={"ID":"32deec4a-495c-4499-b249-5affc8063c20","Type":"ContainerStarted","Data":"264a62451f52f66a96718c3795b025b1b8384ca025cb978feb57d6f5cfda830f"} Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.302015 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4l9qf"] Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.305522 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4l9qf"] Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.307626 4971 scope.go:117] "RemoveContainer" containerID="b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7" Mar 20 07:06:29 crc kubenswrapper[4971]: E0320 07:06:29.309122 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7\": container with ID starting with b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7 not found: ID does not exist" containerID="b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7" Mar 20 07:06:29 crc kubenswrapper[4971]: I0320 07:06:29.309189 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7"} err="failed to get container status \"b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7\": rpc error: code = NotFound desc 
= could not find container \"b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7\": container with ID starting with b3550967a7937f0922fb88dc84a6fa0a07a39a519d13bcc9e66a9bbd4f177ce7 not found: ID does not exist" Mar 20 07:06:30 crc kubenswrapper[4971]: I0320 07:06:30.740298 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" path="/var/lib/kubelet/pods/9f6352b6-0b3b-43d0-8d2f-681f5aec50c2/volumes" Mar 20 07:06:31 crc kubenswrapper[4971]: I0320 07:06:31.298898 4971 generic.go:334] "Generic (PLEG): container finished" podID="32deec4a-495c-4499-b249-5affc8063c20" containerID="c128c9b672095559c36bdec9fd67147162823ee28663503ca97f7abb7891ca20" exitCode=0 Mar 20 07:06:31 crc kubenswrapper[4971]: I0320 07:06:31.298958 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" event={"ID":"32deec4a-495c-4499-b249-5affc8063c20","Type":"ContainerDied","Data":"c128c9b672095559c36bdec9fd67147162823ee28663503ca97f7abb7891ca20"} Mar 20 07:06:32 crc kubenswrapper[4971]: I0320 07:06:32.311557 4971 generic.go:334] "Generic (PLEG): container finished" podID="32deec4a-495c-4499-b249-5affc8063c20" containerID="563d0925ba16ac46bf448d3e8a7220daa1b1b8082b955172e2dcafb344ab8142" exitCode=0 Mar 20 07:06:32 crc kubenswrapper[4971]: I0320 07:06:32.311932 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" event={"ID":"32deec4a-495c-4499-b249-5affc8063c20","Type":"ContainerDied","Data":"563d0925ba16ac46bf448d3e8a7220daa1b1b8082b955172e2dcafb344ab8142"} Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.598062 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.793088 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clcp7\" (UniqueName: \"kubernetes.io/projected/32deec4a-495c-4499-b249-5affc8063c20-kube-api-access-clcp7\") pod \"32deec4a-495c-4499-b249-5affc8063c20\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.793203 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-bundle\") pod \"32deec4a-495c-4499-b249-5affc8063c20\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.793238 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-util\") pod \"32deec4a-495c-4499-b249-5affc8063c20\" (UID: \"32deec4a-495c-4499-b249-5affc8063c20\") " Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.795130 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-bundle" (OuterVolumeSpecName: "bundle") pod "32deec4a-495c-4499-b249-5affc8063c20" (UID: "32deec4a-495c-4499-b249-5affc8063c20"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.803954 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32deec4a-495c-4499-b249-5affc8063c20-kube-api-access-clcp7" (OuterVolumeSpecName: "kube-api-access-clcp7") pod "32deec4a-495c-4499-b249-5affc8063c20" (UID: "32deec4a-495c-4499-b249-5affc8063c20"). InnerVolumeSpecName "kube-api-access-clcp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.823982 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-util" (OuterVolumeSpecName: "util") pod "32deec4a-495c-4499-b249-5affc8063c20" (UID: "32deec4a-495c-4499-b249-5affc8063c20"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.895546 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clcp7\" (UniqueName: \"kubernetes.io/projected/32deec4a-495c-4499-b249-5affc8063c20-kube-api-access-clcp7\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.895632 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:33 crc kubenswrapper[4971]: I0320 07:06:33.895653 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32deec4a-495c-4499-b249-5affc8063c20-util\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:34 crc kubenswrapper[4971]: I0320 07:06:34.330520 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" event={"ID":"32deec4a-495c-4499-b249-5affc8063c20","Type":"ContainerDied","Data":"264a62451f52f66a96718c3795b025b1b8384ca025cb978feb57d6f5cfda830f"} Mar 20 07:06:34 crc kubenswrapper[4971]: I0320 07:06:34.330567 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="264a62451f52f66a96718c3795b025b1b8384ca025cb978feb57d6f5cfda830f" Mar 20 07:06:34 crc kubenswrapper[4971]: I0320 07:06:34.330657 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r" Mar 20 07:06:39 crc kubenswrapper[4971]: I0320 07:06:39.757428 4971 scope.go:117] "RemoveContainer" containerID="d761685f3e8e47bc450dba900f3ddc295cc90799b8183ad527be9d582da75158" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.505128 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p6tk2"] Mar 20 07:06:40 crc kubenswrapper[4971]: E0320 07:06:40.505878 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32deec4a-495c-4499-b249-5affc8063c20" containerName="pull" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.505914 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="32deec4a-495c-4499-b249-5affc8063c20" containerName="pull" Mar 20 07:06:40 crc kubenswrapper[4971]: E0320 07:06:40.505942 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32deec4a-495c-4499-b249-5affc8063c20" containerName="util" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.505954 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="32deec4a-495c-4499-b249-5affc8063c20" containerName="util" Mar 20 07:06:40 crc kubenswrapper[4971]: E0320 07:06:40.505982 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32deec4a-495c-4499-b249-5affc8063c20" containerName="extract" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.505997 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="32deec4a-495c-4499-b249-5affc8063c20" containerName="extract" Mar 20 07:06:40 crc kubenswrapper[4971]: E0320 07:06:40.506022 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" containerName="console" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.506035 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" containerName="console" Mar 20 07:06:40 crc 
kubenswrapper[4971]: I0320 07:06:40.506227 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="32deec4a-495c-4499-b249-5affc8063c20" containerName="extract" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.506269 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6352b6-0b3b-43d0-8d2f-681f5aec50c2" containerName="console" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.508266 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.525001 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6tk2"] Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.582684 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gjdv\" (UniqueName: \"kubernetes.io/projected/2fdf14a1-cba9-4770-bf41-60b64875c276-kube-api-access-2gjdv\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.582768 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-catalog-content\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.582843 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-utilities\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc 
kubenswrapper[4971]: I0320 07:06:40.683776 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gjdv\" (UniqueName: \"kubernetes.io/projected/2fdf14a1-cba9-4770-bf41-60b64875c276-kube-api-access-2gjdv\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.683896 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-catalog-content\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.683952 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-utilities\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.684548 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-catalog-content\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.684926 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-utilities\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.724462 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gjdv\" (UniqueName: \"kubernetes.io/projected/2fdf14a1-cba9-4770-bf41-60b64875c276-kube-api-access-2gjdv\") pod \"redhat-marketplace-p6tk2\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:40 crc kubenswrapper[4971]: I0320 07:06:40.824645 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.222954 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6tk2"] Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.388727 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6tk2" event={"ID":"2fdf14a1-cba9-4770-bf41-60b64875c276","Type":"ContainerStarted","Data":"f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df"} Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.388950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6tk2" event={"ID":"2fdf14a1-cba9-4770-bf41-60b64875c276","Type":"ContainerStarted","Data":"0dd5aa961864cc519ce2604712487569f15fa0878bb6c55fc6b276aec0646e3e"} Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.722572 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm"] Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.723398 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.725137 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.725245 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.725315 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.725672 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.726478 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-sm2d8" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.745668 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm"] Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.755457 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hggxv\" (UniqueName: \"kubernetes.io/projected/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-kube-api-access-hggxv\") pod \"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.755512 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-apiservice-cert\") pod 
\"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.755632 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-webhook-cert\") pod \"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.856522 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hggxv\" (UniqueName: \"kubernetes.io/projected/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-kube-api-access-hggxv\") pod \"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.856575 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-apiservice-cert\") pod \"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.856653 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-webhook-cert\") pod \"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: 
I0320 07:06:42.861681 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-apiservice-cert\") pod \"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.863138 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-webhook-cert\") pod \"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:42 crc kubenswrapper[4971]: I0320 07:06:42.874025 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hggxv\" (UniqueName: \"kubernetes.io/projected/19ab5c82-2f79-47be-a7ba-68cdbe00f4b7-kube-api-access-hggxv\") pod \"metallb-operator-controller-manager-fc6897bfd-j6pdm\" (UID: \"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7\") " pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.042395 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.085223 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg"] Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.086106 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.088711 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.089043 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.089295 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xfstf" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.122258 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg"] Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.159501 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe139df5-ee59-46ea-a07d-6a995dccdc8e-webhook-cert\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.159568 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe139df5-ee59-46ea-a07d-6a995dccdc8e-apiservice-cert\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.159619 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v97js\" (UniqueName: 
\"kubernetes.io/projected/fe139df5-ee59-46ea-a07d-6a995dccdc8e-kube-api-access-v97js\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.261477 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe139df5-ee59-46ea-a07d-6a995dccdc8e-webhook-cert\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.261542 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe139df5-ee59-46ea-a07d-6a995dccdc8e-apiservice-cert\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.261576 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v97js\" (UniqueName: \"kubernetes.io/projected/fe139df5-ee59-46ea-a07d-6a995dccdc8e-kube-api-access-v97js\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.265854 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe139df5-ee59-46ea-a07d-6a995dccdc8e-apiservice-cert\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 
07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.267152 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe139df5-ee59-46ea-a07d-6a995dccdc8e-webhook-cert\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.291284 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v97js\" (UniqueName: \"kubernetes.io/projected/fe139df5-ee59-46ea-a07d-6a995dccdc8e-kube-api-access-v97js\") pod \"metallb-operator-webhook-server-b54b5b68f-hq9kg\" (UID: \"fe139df5-ee59-46ea-a07d-6a995dccdc8e\") " pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.395070 4971 generic.go:334] "Generic (PLEG): container finished" podID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerID="f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df" exitCode=0 Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.395106 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6tk2" event={"ID":"2fdf14a1-cba9-4770-bf41-60b64875c276","Type":"ContainerDied","Data":"f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df"} Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.411522 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.438882 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm"] Mar 20 07:06:43 crc kubenswrapper[4971]: W0320 07:06:43.450513 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19ab5c82_2f79_47be_a7ba_68cdbe00f4b7.slice/crio-884d15c07225fa901e69681b52f0b8a5e8fbf4298ee71e81d74969df6e9714c1 WatchSource:0}: Error finding container 884d15c07225fa901e69681b52f0b8a5e8fbf4298ee71e81d74969df6e9714c1: Status 404 returned error can't find the container with id 884d15c07225fa901e69681b52f0b8a5e8fbf4298ee71e81d74969df6e9714c1 Mar 20 07:06:43 crc kubenswrapper[4971]: I0320 07:06:43.617858 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg"] Mar 20 07:06:43 crc kubenswrapper[4971]: W0320 07:06:43.621955 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe139df5_ee59_46ea_a07d_6a995dccdc8e.slice/crio-4fe3f442d928293bbf5dadb91159060610c28c9d78b2197eabd8f14da7c3ac25 WatchSource:0}: Error finding container 4fe3f442d928293bbf5dadb91159060610c28c9d78b2197eabd8f14da7c3ac25: Status 404 returned error can't find the container with id 4fe3f442d928293bbf5dadb91159060610c28c9d78b2197eabd8f14da7c3ac25 Mar 20 07:06:44 crc kubenswrapper[4971]: I0320 07:06:44.404229 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" event={"ID":"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7","Type":"ContainerStarted","Data":"884d15c07225fa901e69681b52f0b8a5e8fbf4298ee71e81d74969df6e9714c1"} Mar 20 07:06:44 crc kubenswrapper[4971]: I0320 07:06:44.407648 4971 generic.go:334] "Generic (PLEG): 
container finished" podID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerID="b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd" exitCode=0 Mar 20 07:06:44 crc kubenswrapper[4971]: I0320 07:06:44.407738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6tk2" event={"ID":"2fdf14a1-cba9-4770-bf41-60b64875c276","Type":"ContainerDied","Data":"b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd"} Mar 20 07:06:44 crc kubenswrapper[4971]: I0320 07:06:44.410014 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" event={"ID":"fe139df5-ee59-46ea-a07d-6a995dccdc8e","Type":"ContainerStarted","Data":"4fe3f442d928293bbf5dadb91159060610c28c9d78b2197eabd8f14da7c3ac25"} Mar 20 07:06:45 crc kubenswrapper[4971]: I0320 07:06:45.417636 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6tk2" event={"ID":"2fdf14a1-cba9-4770-bf41-60b64875c276","Type":"ContainerStarted","Data":"25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154"} Mar 20 07:06:45 crc kubenswrapper[4971]: I0320 07:06:45.433453 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p6tk2" podStartSLOduration=3.017580351 podStartE2EDuration="5.433437655s" podCreationTimestamp="2026-03-20 07:06:40 +0000 UTC" firstStartedPulling="2026-03-20 07:06:42.390377952 +0000 UTC m=+1024.370252090" lastFinishedPulling="2026-03-20 07:06:44.806235266 +0000 UTC m=+1026.786109394" observedRunningTime="2026-03-20 07:06:45.43170593 +0000 UTC m=+1027.411580068" watchObservedRunningTime="2026-03-20 07:06:45.433437655 +0000 UTC m=+1027.413311793" Mar 20 07:06:49 crc kubenswrapper[4971]: I0320 07:06:49.451016 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" 
event={"ID":"fe139df5-ee59-46ea-a07d-6a995dccdc8e","Type":"ContainerStarted","Data":"2d6fe69604a1acfe461c4fa4f641659619124c5d1c9446d58b96a1eef7ab1988"} Mar 20 07:06:49 crc kubenswrapper[4971]: I0320 07:06:49.451578 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:06:49 crc kubenswrapper[4971]: I0320 07:06:49.453194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" event={"ID":"19ab5c82-2f79-47be-a7ba-68cdbe00f4b7","Type":"ContainerStarted","Data":"84fbdcbd92e3b38f71ca47b3e0dcc31f9794827bccfa46f22625537af6efc549"} Mar 20 07:06:49 crc kubenswrapper[4971]: I0320 07:06:49.453381 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:06:49 crc kubenswrapper[4971]: I0320 07:06:49.475597 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" podStartSLOduration=1.381094082 podStartE2EDuration="6.475574823s" podCreationTimestamp="2026-03-20 07:06:43 +0000 UTC" firstStartedPulling="2026-03-20 07:06:43.625829795 +0000 UTC m=+1025.605703933" lastFinishedPulling="2026-03-20 07:06:48.720310526 +0000 UTC m=+1030.700184674" observedRunningTime="2026-03-20 07:06:49.471199801 +0000 UTC m=+1031.451073949" watchObservedRunningTime="2026-03-20 07:06:49.475574823 +0000 UTC m=+1031.455448991" Mar 20 07:06:49 crc kubenswrapper[4971]: I0320 07:06:49.498229 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" podStartSLOduration=2.262761565 podStartE2EDuration="7.498200614s" podCreationTimestamp="2026-03-20 07:06:42 +0000 UTC" firstStartedPulling="2026-03-20 07:06:43.454255761 +0000 UTC m=+1025.434129899" lastFinishedPulling="2026-03-20 
07:06:48.6896948 +0000 UTC m=+1030.669568948" observedRunningTime="2026-03-20 07:06:49.493029061 +0000 UTC m=+1031.472903199" watchObservedRunningTime="2026-03-20 07:06:49.498200614 +0000 UTC m=+1031.478074772" Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.162249 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.162361 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.162464 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.163521 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3da7217c29a31428564d7212c3a93202314dd80c28b100287c20a984795e9b1"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.163698 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://b3da7217c29a31428564d7212c3a93202314dd80c28b100287c20a984795e9b1" gracePeriod=600 Mar 20 
07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.461436 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="b3da7217c29a31428564d7212c3a93202314dd80c28b100287c20a984795e9b1" exitCode=0 Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.461533 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"b3da7217c29a31428564d7212c3a93202314dd80c28b100287c20a984795e9b1"} Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.461598 4971 scope.go:117] "RemoveContainer" containerID="d08c37f4d49289256792c5435adc7c946fcecf7a8b553aaa8277a80f90bfc57b" Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.825120 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.825579 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:50 crc kubenswrapper[4971]: I0320 07:06:50.895809 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:51 crc kubenswrapper[4971]: I0320 07:06:51.469404 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"13acc3b11ca206dfa51085fca8aab838b094201afdcac5315a5bb8076cf03b1f"} Mar 20 07:06:51 crc kubenswrapper[4971]: I0320 07:06:51.528078 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:53 crc kubenswrapper[4971]: I0320 07:06:53.293885 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-p6tk2"] Mar 20 07:06:53 crc kubenswrapper[4971]: I0320 07:06:53.483588 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p6tk2" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerName="registry-server" containerID="cri-o://25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154" gracePeriod=2 Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.044594 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.230141 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-utilities\") pod \"2fdf14a1-cba9-4770-bf41-60b64875c276\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.230328 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gjdv\" (UniqueName: \"kubernetes.io/projected/2fdf14a1-cba9-4770-bf41-60b64875c276-kube-api-access-2gjdv\") pod \"2fdf14a1-cba9-4770-bf41-60b64875c276\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.230399 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-catalog-content\") pod \"2fdf14a1-cba9-4770-bf41-60b64875c276\" (UID: \"2fdf14a1-cba9-4770-bf41-60b64875c276\") " Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.231363 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-utilities" (OuterVolumeSpecName: "utilities") pod "2fdf14a1-cba9-4770-bf41-60b64875c276" (UID: 
"2fdf14a1-cba9-4770-bf41-60b64875c276"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.237024 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fdf14a1-cba9-4770-bf41-60b64875c276-kube-api-access-2gjdv" (OuterVolumeSpecName: "kube-api-access-2gjdv") pod "2fdf14a1-cba9-4770-bf41-60b64875c276" (UID: "2fdf14a1-cba9-4770-bf41-60b64875c276"). InnerVolumeSpecName "kube-api-access-2gjdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.331744 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gjdv\" (UniqueName: \"kubernetes.io/projected/2fdf14a1-cba9-4770-bf41-60b64875c276-kube-api-access-2gjdv\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.331784 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.493098 4971 generic.go:334] "Generic (PLEG): container finished" podID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerID="25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154" exitCode=0 Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.493159 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6tk2" event={"ID":"2fdf14a1-cba9-4770-bf41-60b64875c276","Type":"ContainerDied","Data":"25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154"} Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.493199 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6tk2" 
event={"ID":"2fdf14a1-cba9-4770-bf41-60b64875c276","Type":"ContainerDied","Data":"0dd5aa961864cc519ce2604712487569f15fa0878bb6c55fc6b276aec0646e3e"} Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.493194 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6tk2" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.493221 4971 scope.go:117] "RemoveContainer" containerID="25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.517930 4971 scope.go:117] "RemoveContainer" containerID="b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.542033 4971 scope.go:117] "RemoveContainer" containerID="f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.572502 4971 scope.go:117] "RemoveContainer" containerID="25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154" Mar 20 07:06:54 crc kubenswrapper[4971]: E0320 07:06:54.573644 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154\": container with ID starting with 25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154 not found: ID does not exist" containerID="25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.573712 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154"} err="failed to get container status \"25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154\": rpc error: code = NotFound desc = could not find container \"25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154\": 
container with ID starting with 25586d1f655ec7a985ecd62f3f697c1f1db4e1860e74e7ab2dd94e7e54f26154 not found: ID does not exist" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.573756 4971 scope.go:117] "RemoveContainer" containerID="b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd" Mar 20 07:06:54 crc kubenswrapper[4971]: E0320 07:06:54.577057 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd\": container with ID starting with b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd not found: ID does not exist" containerID="b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.577129 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd"} err="failed to get container status \"b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd\": rpc error: code = NotFound desc = could not find container \"b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd\": container with ID starting with b2bf73bdc7ee353c72f623da26043d2a7ed3e779eb9b10e9c3829b0872fe3fcd not found: ID does not exist" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.577172 4971 scope.go:117] "RemoveContainer" containerID="f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df" Mar 20 07:06:54 crc kubenswrapper[4971]: E0320 07:06:54.579513 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df\": container with ID starting with f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df not found: ID does not exist" 
containerID="f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.579591 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df"} err="failed to get container status \"f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df\": rpc error: code = NotFound desc = could not find container \"f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df\": container with ID starting with f0229de970426d49b590ca043de0ef944a0657cf79974e94ff999ce6337511df not found: ID does not exist" Mar 20 07:06:54 crc kubenswrapper[4971]: I0320 07:06:54.988922 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fdf14a1-cba9-4770-bf41-60b64875c276" (UID: "2fdf14a1-cba9-4770-bf41-60b64875c276"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:06:55 crc kubenswrapper[4971]: I0320 07:06:55.038894 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdf14a1-cba9-4770-bf41-60b64875c276-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:06:55 crc kubenswrapper[4971]: I0320 07:06:55.133354 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6tk2"] Mar 20 07:06:55 crc kubenswrapper[4971]: I0320 07:06:55.141453 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6tk2"] Mar 20 07:06:56 crc kubenswrapper[4971]: I0320 07:06:56.748046 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" path="/var/lib/kubelet/pods/2fdf14a1-cba9-4770-bf41-60b64875c276/volumes" Mar 20 07:07:03 crc kubenswrapper[4971]: I0320 07:07:03.417595 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.046237 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-fc6897bfd-j6pdm" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.793859 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg"] Mar 20 07:07:23 crc kubenswrapper[4971]: E0320 07:07:23.794118 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerName="extract-content" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.794137 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerName="extract-content" Mar 20 07:07:23 crc kubenswrapper[4971]: E0320 07:07:23.794151 4971 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerName="extract-utilities" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.794160 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerName="extract-utilities" Mar 20 07:07:23 crc kubenswrapper[4971]: E0320 07:07:23.794171 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerName="registry-server" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.794180 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerName="registry-server" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.794297 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fdf14a1-cba9-4770-bf41-60b64875c276" containerName="registry-server" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.794770 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.798205 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-hntss" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.798632 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.809727 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zlrz9"] Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.811812 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.813457 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.813531 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.828952 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.828998 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp6dx\" (UniqueName: \"kubernetes.io/projected/0be71de9-783d-4467-bb0c-d8cb89f6bf38-kube-api-access-xp6dx\") pod \"frr-k8s-webhook-server-bcc4b6f68-47mxg\" (UID: \"0be71de9-783d-4467-bb0c-d8cb89f6bf38\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.829023 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0be71de9-783d-4467-bb0c-d8cb89f6bf38-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-47mxg\" (UID: \"0be71de9-783d-4467-bb0c-d8cb89f6bf38\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.829150 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-sockets\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" 
Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.829181 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-startup\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.829231 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics-certs\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.829276 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-reloader\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.829303 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-conf\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.829334 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvbcn\" (UniqueName: \"kubernetes.io/projected/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-kube-api-access-xvbcn\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.851686 4971 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg"] Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.905716 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-x8lwk"] Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.906687 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-x8lwk" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.910429 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.910468 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.910733 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.910907 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-shpnq" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.916215 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-l7zzc"] Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.917346 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.922672 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-l7zzc"] Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.926588 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.929893 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics-certs\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.929950 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-reloader\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.929982 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-conf\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: E0320 07:07:23.930068 4971 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930199 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvbcn\" (UniqueName: \"kubernetes.io/projected/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-kube-api-access-xvbcn\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") 
" pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: E0320 07:07:23.930276 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics-certs podName:2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40 nodeName:}" failed. No retries permitted until 2026-03-20 07:07:24.430251236 +0000 UTC m=+1066.410125474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics-certs") pod "frr-k8s-zlrz9" (UID: "2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40") : secret "frr-k8s-certs-secret" not found Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930306 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp6dx\" (UniqueName: \"kubernetes.io/projected/0be71de9-783d-4467-bb0c-d8cb89f6bf38-kube-api-access-xp6dx\") pod \"frr-k8s-webhook-server-bcc4b6f68-47mxg\" (UID: \"0be71de9-783d-4467-bb0c-d8cb89f6bf38\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930343 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0be71de9-783d-4467-bb0c-d8cb89f6bf38-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-47mxg\" (UID: \"0be71de9-783d-4467-bb0c-d8cb89f6bf38\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930351 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" 
(UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-conf\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930405 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-reloader\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930423 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-sockets\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930462 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-startup\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930544 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: E0320 07:07:23.930577 4971 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 07:07:23 crc kubenswrapper[4971]: E0320 07:07:23.930635 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0be71de9-783d-4467-bb0c-d8cb89f6bf38-cert 
podName:0be71de9-783d-4467-bb0c-d8cb89f6bf38 nodeName:}" failed. No retries permitted until 2026-03-20 07:07:24.430622485 +0000 UTC m=+1066.410496713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0be71de9-783d-4467-bb0c-d8cb89f6bf38-cert") pod "frr-k8s-webhook-server-bcc4b6f68-47mxg" (UID: "0be71de9-783d-4467-bb0c-d8cb89f6bf38") : secret "frr-k8s-webhook-server-cert" not found Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.930736 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-sockets\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.931306 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-frr-startup\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.962752 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp6dx\" (UniqueName: \"kubernetes.io/projected/0be71de9-783d-4467-bb0c-d8cb89f6bf38-kube-api-access-xp6dx\") pod \"frr-k8s-webhook-server-bcc4b6f68-47mxg\" (UID: \"0be71de9-783d-4467-bb0c-d8cb89f6bf38\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:23 crc kubenswrapper[4971]: I0320 07:07:23.976235 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvbcn\" (UniqueName: \"kubernetes.io/projected/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-kube-api-access-xvbcn\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 
07:07:24.038274 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-metrics-certs\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.038352 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-metallb-excludel2\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.038372 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4stll\" (UniqueName: \"kubernetes.io/projected/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-kube-api-access-4stll\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.038395 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-metrics-certs\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.038422 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krpcl\" (UniqueName: \"kubernetes.io/projected/30af6211-fbfb-4319-8354-b2e08e781f2c-kube-api-access-krpcl\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 
07:07:24.038442 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.038461 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-cert\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.139184 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krpcl\" (UniqueName: \"kubernetes.io/projected/30af6211-fbfb-4319-8354-b2e08e781f2c-kube-api-access-krpcl\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.139227 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.139246 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-cert\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.139293 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-metrics-certs\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.139332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-metallb-excludel2\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.139350 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4stll\" (UniqueName: \"kubernetes.io/projected/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-kube-api-access-4stll\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.139369 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-metrics-certs\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: E0320 07:07:24.139397 4971 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 07:07:24 crc kubenswrapper[4971]: E0320 07:07:24.139447 4971 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 20 07:07:24 crc kubenswrapper[4971]: E0320 07:07:24.139467 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist podName:62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a nodeName:}" failed. 
No retries permitted until 2026-03-20 07:07:24.639452063 +0000 UTC m=+1066.619326201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist") pod "speaker-x8lwk" (UID: "62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a") : secret "metallb-memberlist" not found Mar 20 07:07:24 crc kubenswrapper[4971]: E0320 07:07:24.139487 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-metrics-certs podName:30af6211-fbfb-4319-8354-b2e08e781f2c nodeName:}" failed. No retries permitted until 2026-03-20 07:07:24.639474103 +0000 UTC m=+1066.619348241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-metrics-certs") pod "controller-7bb4cc7c98-l7zzc" (UID: "30af6211-fbfb-4319-8354-b2e08e781f2c") : secret "controller-certs-secret" not found Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.140092 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-metallb-excludel2\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.142214 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-cert\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.142917 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-metrics-certs\") pod \"speaker-x8lwk\" 
(UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.169734 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krpcl\" (UniqueName: \"kubernetes.io/projected/30af6211-fbfb-4319-8354-b2e08e781f2c-kube-api-access-krpcl\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.170152 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4stll\" (UniqueName: \"kubernetes.io/projected/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-kube-api-access-4stll\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.443556 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics-certs\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.443651 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0be71de9-783d-4467-bb0c-d8cb89f6bf38-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-47mxg\" (UID: \"0be71de9-783d-4467-bb0c-d8cb89f6bf38\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.446598 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40-metrics-certs\") pod \"frr-k8s-zlrz9\" (UID: \"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40\") " pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:24 crc kubenswrapper[4971]: 
I0320 07:07:24.447959 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0be71de9-783d-4467-bb0c-d8cb89f6bf38-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-47mxg\" (UID: \"0be71de9-783d-4467-bb0c-d8cb89f6bf38\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.645883 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.646084 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-metrics-certs\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: E0320 07:07:24.646142 4971 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 07:07:24 crc kubenswrapper[4971]: E0320 07:07:24.646274 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist podName:62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a nodeName:}" failed. No retries permitted until 2026-03-20 07:07:25.646244124 +0000 UTC m=+1067.626118292 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist") pod "speaker-x8lwk" (UID: "62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a") : secret "metallb-memberlist" not found Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.649937 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30af6211-fbfb-4319-8354-b2e08e781f2c-metrics-certs\") pod \"controller-7bb4cc7c98-l7zzc\" (UID: \"30af6211-fbfb-4319-8354-b2e08e781f2c\") " pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.709160 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.724767 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:24 crc kubenswrapper[4971]: I0320 07:07:24.828234 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.252898 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg"] Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.304183 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-l7zzc"] Mar 20 07:07:25 crc kubenswrapper[4971]: W0320 07:07:25.308765 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30af6211_fbfb_4319_8354_b2e08e781f2c.slice/crio-be0cd5ffccfc3d5379732538e590da9a2b4ad04c517894228ed0fe98194443f4 WatchSource:0}: Error finding container be0cd5ffccfc3d5379732538e590da9a2b4ad04c517894228ed0fe98194443f4: Status 404 returned error can't find the container with id be0cd5ffccfc3d5379732538e590da9a2b4ad04c517894228ed0fe98194443f4 Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.660952 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.666066 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a-memberlist\") pod \"speaker-x8lwk\" (UID: \"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a\") " pod="metallb-system/speaker-x8lwk" Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.712911 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" event={"ID":"0be71de9-783d-4467-bb0c-d8cb89f6bf38","Type":"ContainerStarted","Data":"d486d6b30212483ea60cb4b12f018f2b900a569693efd0c32c46cc5667ce9d7d"} Mar 20 
07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.714090 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerStarted","Data":"0709b7a9eff640172caa4cf86173f936105e3fd29974f8f7c75f3fc65d3260c7"} Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.715940 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-l7zzc" event={"ID":"30af6211-fbfb-4319-8354-b2e08e781f2c","Type":"ContainerStarted","Data":"6a8d4006dffcde35edd5dec9f29fb057cb20ef53b06a6881f4059b4d87035636"} Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.715972 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-l7zzc" event={"ID":"30af6211-fbfb-4319-8354-b2e08e781f2c","Type":"ContainerStarted","Data":"8a8e25765a27c992c4430961906a95e75f6b2aa4db50b68f892d93b4c6d08891"} Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.715988 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-l7zzc" event={"ID":"30af6211-fbfb-4319-8354-b2e08e781f2c","Type":"ContainerStarted","Data":"be0cd5ffccfc3d5379732538e590da9a2b4ad04c517894228ed0fe98194443f4"} Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.716097 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.718739 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-x8lwk" Mar 20 07:07:25 crc kubenswrapper[4971]: I0320 07:07:25.736416 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-l7zzc" podStartSLOduration=2.736396256 podStartE2EDuration="2.736396256s" podCreationTimestamp="2026-03-20 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:07:25.731052355 +0000 UTC m=+1067.710926493" watchObservedRunningTime="2026-03-20 07:07:25.736396256 +0000 UTC m=+1067.716270394" Mar 20 07:07:25 crc kubenswrapper[4971]: W0320 07:07:25.743103 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62dce9cb_516c_4d4b_b6cc_7d19b8cf5b7a.slice/crio-e20c81973878cd330147307ddb775bdd4423284de491422d8fba43830b9d111a WatchSource:0}: Error finding container e20c81973878cd330147307ddb775bdd4423284de491422d8fba43830b9d111a: Status 404 returned error can't find the container with id e20c81973878cd330147307ddb775bdd4423284de491422d8fba43830b9d111a Mar 20 07:07:26 crc kubenswrapper[4971]: I0320 07:07:26.722962 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x8lwk" event={"ID":"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a","Type":"ContainerStarted","Data":"660b1cdd0398fc5a8601ad14406dca3df5e094c02ab57bd7a869c0b18e7ba7f6"} Mar 20 07:07:26 crc kubenswrapper[4971]: I0320 07:07:26.723005 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x8lwk" event={"ID":"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a","Type":"ContainerStarted","Data":"280e066a6e2a850fb5f870f5ce606c5fdcbbaa53d61389c2b136812335449546"} Mar 20 07:07:26 crc kubenswrapper[4971]: I0320 07:07:26.723019 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x8lwk" 
event={"ID":"62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a","Type":"ContainerStarted","Data":"e20c81973878cd330147307ddb775bdd4423284de491422d8fba43830b9d111a"} Mar 20 07:07:26 crc kubenswrapper[4971]: I0320 07:07:26.723407 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-x8lwk" Mar 20 07:07:26 crc kubenswrapper[4971]: I0320 07:07:26.755582 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-x8lwk" podStartSLOduration=3.7555618490000002 podStartE2EDuration="3.755561849s" podCreationTimestamp="2026-03-20 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:07:26.747992533 +0000 UTC m=+1068.727866691" watchObservedRunningTime="2026-03-20 07:07:26.755561849 +0000 UTC m=+1068.735435997" Mar 20 07:07:32 crc kubenswrapper[4971]: I0320 07:07:32.769056 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" event={"ID":"0be71de9-783d-4467-bb0c-d8cb89f6bf38","Type":"ContainerStarted","Data":"59ade26d8c4c7b019dc5f1f6d143071c1f2248fb8305275226eb899650173714"} Mar 20 07:07:32 crc kubenswrapper[4971]: I0320 07:07:32.770506 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:32 crc kubenswrapper[4971]: I0320 07:07:32.770942 4971 generic.go:334] "Generic (PLEG): container finished" podID="2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40" containerID="bed804b2ce1fb55de20a11cc7a08a196eaaa0b6b179af64df03b427eb408c038" exitCode=0 Mar 20 07:07:32 crc kubenswrapper[4971]: I0320 07:07:32.771010 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerDied","Data":"bed804b2ce1fb55de20a11cc7a08a196eaaa0b6b179af64df03b427eb408c038"} Mar 20 07:07:32 crc 
kubenswrapper[4971]: I0320 07:07:32.788896 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" podStartSLOduration=3.393592457 podStartE2EDuration="9.788873811s" podCreationTimestamp="2026-03-20 07:07:23 +0000 UTC" firstStartedPulling="2026-03-20 07:07:25.260248979 +0000 UTC m=+1067.240123147" lastFinishedPulling="2026-03-20 07:07:31.655530363 +0000 UTC m=+1073.635404501" observedRunningTime="2026-03-20 07:07:32.784994565 +0000 UTC m=+1074.764868713" watchObservedRunningTime="2026-03-20 07:07:32.788873811 +0000 UTC m=+1074.768747949" Mar 20 07:07:33 crc kubenswrapper[4971]: I0320 07:07:33.779067 4971 generic.go:334] "Generic (PLEG): container finished" podID="2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40" containerID="89e470e0f5f5876ccaf1afba8ccb127eb7cd2d5cf929626e184ba8420d87a565" exitCode=0 Mar 20 07:07:33 crc kubenswrapper[4971]: I0320 07:07:33.779151 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerDied","Data":"89e470e0f5f5876ccaf1afba8ccb127eb7cd2d5cf929626e184ba8420d87a565"} Mar 20 07:07:34 crc kubenswrapper[4971]: I0320 07:07:34.790691 4971 generic.go:334] "Generic (PLEG): container finished" podID="2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40" containerID="d860437c70f3ef83008118f7db80b475e894cfbe5c3420e42c161a7ad3e3b52d" exitCode=0 Mar 20 07:07:34 crc kubenswrapper[4971]: I0320 07:07:34.790791 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerDied","Data":"d860437c70f3ef83008118f7db80b475e894cfbe5c3420e42c161a7ad3e3b52d"} Mar 20 07:07:35 crc kubenswrapper[4971]: I0320 07:07:35.722456 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-x8lwk" Mar 20 07:07:35 crc kubenswrapper[4971]: I0320 07:07:35.802991 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerStarted","Data":"2379538a06021472e88dcf0e1588155502c964dbffcd0c5366dc4f177003c8c9"} Mar 20 07:07:35 crc kubenswrapper[4971]: I0320 07:07:35.803042 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerStarted","Data":"4d38a61cbe7e6e459049507480cd40435f1f16635b7fc317a68ad7e9be783449"} Mar 20 07:07:35 crc kubenswrapper[4971]: I0320 07:07:35.803059 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerStarted","Data":"0012e6a3616eca92338fd71a7459479c38048836182b700945f6ec37a870c5cd"} Mar 20 07:07:35 crc kubenswrapper[4971]: I0320 07:07:35.803072 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerStarted","Data":"e42b5582aadb17c36f097e4b67df712677f8c1142788c5061635e0dafa9dbeaa"} Mar 20 07:07:35 crc kubenswrapper[4971]: I0320 07:07:35.803082 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerStarted","Data":"e3a7033422b35c18a8520e8237a5cd89902c8472c2c218ccfa0f1d86996077d4"} Mar 20 07:07:36 crc kubenswrapper[4971]: I0320 07:07:36.815808 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zlrz9" event={"ID":"2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40","Type":"ContainerStarted","Data":"57667da9c8019bc4fb7bd2e3e74b438e095a13869467c378038df4c19ccbf906"} Mar 20 07:07:36 crc kubenswrapper[4971]: I0320 07:07:36.816932 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:36 crc kubenswrapper[4971]: I0320 07:07:36.841428 4971 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zlrz9" podStartSLOduration=7.084736265 podStartE2EDuration="13.841409287s" podCreationTimestamp="2026-03-20 07:07:23 +0000 UTC" firstStartedPulling="2026-03-20 07:07:24.936743925 +0000 UTC m=+1066.916618073" lastFinishedPulling="2026-03-20 07:07:31.693416917 +0000 UTC m=+1073.673291095" observedRunningTime="2026-03-20 07:07:36.838320711 +0000 UTC m=+1078.818194869" watchObservedRunningTime="2026-03-20 07:07:36.841409287 +0000 UTC m=+1078.821283425" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.638142 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc"] Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.639760 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.641815 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.651036 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc"] Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.741735 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.741841 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gk896\" (UniqueName: \"kubernetes.io/projected/9f7b38cc-03d4-40e5-9678-f8e7a589392f-kube-api-access-gk896\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.741895 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.843433 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.843592 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk896\" (UniqueName: \"kubernetes.io/projected/9f7b38cc-03d4-40e5-9678-f8e7a589392f-kube-api-access-gk896\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.843716 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.844452 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.845237 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.870058 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk896\" (UniqueName: \"kubernetes.io/projected/9f7b38cc-03d4-40e5-9678-f8e7a589392f-kube-api-access-gk896\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:37 crc kubenswrapper[4971]: I0320 07:07:37.962888 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:38 crc kubenswrapper[4971]: I0320 07:07:38.377488 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc"] Mar 20 07:07:38 crc kubenswrapper[4971]: W0320 07:07:38.388981 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f7b38cc_03d4_40e5_9678_f8e7a589392f.slice/crio-c95ebccfc11fb30a0c00d6ebd4a095061660945bcf9204a422f75c28c5b5b8e7 WatchSource:0}: Error finding container c95ebccfc11fb30a0c00d6ebd4a095061660945bcf9204a422f75c28c5b5b8e7: Status 404 returned error can't find the container with id c95ebccfc11fb30a0c00d6ebd4a095061660945bcf9204a422f75c28c5b5b8e7 Mar 20 07:07:38 crc kubenswrapper[4971]: I0320 07:07:38.828854 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerID="6b8300323e03da11476d16ab78d7e801f8891a2d6e7de6b26cc14d3ac227d701" exitCode=0 Mar 20 07:07:38 crc kubenswrapper[4971]: I0320 07:07:38.830368 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" event={"ID":"9f7b38cc-03d4-40e5-9678-f8e7a589392f","Type":"ContainerDied","Data":"6b8300323e03da11476d16ab78d7e801f8891a2d6e7de6b26cc14d3ac227d701"} Mar 20 07:07:38 crc kubenswrapper[4971]: I0320 07:07:38.830399 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" event={"ID":"9f7b38cc-03d4-40e5-9678-f8e7a589392f","Type":"ContainerStarted","Data":"c95ebccfc11fb30a0c00d6ebd4a095061660945bcf9204a422f75c28c5b5b8e7"} Mar 20 07:07:39 crc kubenswrapper[4971]: I0320 07:07:39.725494 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:39 crc kubenswrapper[4971]: I0320 07:07:39.785789 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:44 crc kubenswrapper[4971]: I0320 07:07:44.716149 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-47mxg" Mar 20 07:07:44 crc kubenswrapper[4971]: I0320 07:07:44.727830 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zlrz9" Mar 20 07:07:44 crc kubenswrapper[4971]: I0320 07:07:44.836776 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-l7zzc" Mar 20 07:07:44 crc kubenswrapper[4971]: I0320 07:07:44.877196 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerID="4810055ebe0e20a0ed592c6c4eda2bfa57de628a3b2d05c0e3e61983d4820b4c" exitCode=0 Mar 20 07:07:44 crc kubenswrapper[4971]: I0320 07:07:44.877280 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" event={"ID":"9f7b38cc-03d4-40e5-9678-f8e7a589392f","Type":"ContainerDied","Data":"4810055ebe0e20a0ed592c6c4eda2bfa57de628a3b2d05c0e3e61983d4820b4c"} Mar 20 07:07:45 crc kubenswrapper[4971]: I0320 07:07:45.888076 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerID="b1ae1451f96785cd973f7caee1702bec3e2bc6cedf89d6f7e3b9b04d90544634" exitCode=0 Mar 20 07:07:45 crc kubenswrapper[4971]: I0320 07:07:45.888147 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" event={"ID":"9f7b38cc-03d4-40e5-9678-f8e7a589392f","Type":"ContainerDied","Data":"b1ae1451f96785cd973f7caee1702bec3e2bc6cedf89d6f7e3b9b04d90544634"} Mar 20 
07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.172391 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.293969 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk896\" (UniqueName: \"kubernetes.io/projected/9f7b38cc-03d4-40e5-9678-f8e7a589392f-kube-api-access-gk896\") pod \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.294058 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-util\") pod \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.294130 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-bundle\") pod \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\" (UID: \"9f7b38cc-03d4-40e5-9678-f8e7a589392f\") " Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.295058 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-bundle" (OuterVolumeSpecName: "bundle") pod "9f7b38cc-03d4-40e5-9678-f8e7a589392f" (UID: "9f7b38cc-03d4-40e5-9678-f8e7a589392f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.300359 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f7b38cc-03d4-40e5-9678-f8e7a589392f-kube-api-access-gk896" (OuterVolumeSpecName: "kube-api-access-gk896") pod "9f7b38cc-03d4-40e5-9678-f8e7a589392f" (UID: "9f7b38cc-03d4-40e5-9678-f8e7a589392f"). InnerVolumeSpecName "kube-api-access-gk896". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.306923 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-util" (OuterVolumeSpecName: "util") pod "9f7b38cc-03d4-40e5-9678-f8e7a589392f" (UID: "9f7b38cc-03d4-40e5-9678-f8e7a589392f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.396091 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.396129 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk896\" (UniqueName: \"kubernetes.io/projected/9f7b38cc-03d4-40e5-9678-f8e7a589392f-kube-api-access-gk896\") on node \"crc\" DevicePath \"\"" Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.396141 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f7b38cc-03d4-40e5-9678-f8e7a589392f-util\") on node \"crc\" DevicePath \"\"" Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.905828 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" 
event={"ID":"9f7b38cc-03d4-40e5-9678-f8e7a589392f","Type":"ContainerDied","Data":"c95ebccfc11fb30a0c00d6ebd4a095061660945bcf9204a422f75c28c5b5b8e7"} Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.906079 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c95ebccfc11fb30a0c00d6ebd4a095061660945bcf9204a422f75c28c5b5b8e7" Mar 20 07:07:47 crc kubenswrapper[4971]: I0320 07:07:47.905924 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.622766 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q"] Mar 20 07:07:55 crc kubenswrapper[4971]: E0320 07:07:55.623865 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerName="pull" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.623893 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerName="pull" Mar 20 07:07:55 crc kubenswrapper[4971]: E0320 07:07:55.623920 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerName="extract" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.623933 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerName="extract" Mar 20 07:07:55 crc kubenswrapper[4971]: E0320 07:07:55.623953 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerName="util" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.623968 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerName="util" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.624181 
4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f7b38cc-03d4-40e5-9678-f8e7a589392f" containerName="extract" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.624898 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.627330 4971 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-2qdpl" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.629055 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.633717 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.645357 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q"] Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.706933 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj88s\" (UniqueName: \"kubernetes.io/projected/0c8a7ee5-0ec6-4b29-9061-f23731026324-kube-api-access-vj88s\") pod \"cert-manager-operator-controller-manager-66c8bdd694-s4t7q\" (UID: \"0c8a7ee5-0ec6-4b29-9061-f23731026324\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.707071 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0c8a7ee5-0ec6-4b29-9061-f23731026324-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-s4t7q\" (UID: \"0c8a7ee5-0ec6-4b29-9061-f23731026324\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.808763 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0c8a7ee5-0ec6-4b29-9061-f23731026324-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-s4t7q\" (UID: \"0c8a7ee5-0ec6-4b29-9061-f23731026324\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.808844 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj88s\" (UniqueName: \"kubernetes.io/projected/0c8a7ee5-0ec6-4b29-9061-f23731026324-kube-api-access-vj88s\") pod \"cert-manager-operator-controller-manager-66c8bdd694-s4t7q\" (UID: \"0c8a7ee5-0ec6-4b29-9061-f23731026324\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.809331 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0c8a7ee5-0ec6-4b29-9061-f23731026324-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-s4t7q\" (UID: \"0c8a7ee5-0ec6-4b29-9061-f23731026324\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.839961 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj88s\" (UniqueName: \"kubernetes.io/projected/0c8a7ee5-0ec6-4b29-9061-f23731026324-kube-api-access-vj88s\") pod \"cert-manager-operator-controller-manager-66c8bdd694-s4t7q\" (UID: \"0c8a7ee5-0ec6-4b29-9061-f23731026324\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" Mar 20 07:07:55 crc kubenswrapper[4971]: I0320 07:07:55.951491 4971 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" Mar 20 07:07:56 crc kubenswrapper[4971]: I0320 07:07:56.442798 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q"] Mar 20 07:07:56 crc kubenswrapper[4971]: W0320 07:07:56.453347 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c8a7ee5_0ec6_4b29_9061_f23731026324.slice/crio-bf66dc1a3237b212c036bc148c79dc7cf00e4708e77f5ace64a3cb10d0b18b73 WatchSource:0}: Error finding container bf66dc1a3237b212c036bc148c79dc7cf00e4708e77f5ace64a3cb10d0b18b73: Status 404 returned error can't find the container with id bf66dc1a3237b212c036bc148c79dc7cf00e4708e77f5ace64a3cb10d0b18b73 Mar 20 07:07:56 crc kubenswrapper[4971]: I0320 07:07:56.962955 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" event={"ID":"0c8a7ee5-0ec6-4b29-9061-f23731026324","Type":"ContainerStarted","Data":"bf66dc1a3237b212c036bc148c79dc7cf00e4708e77f5ace64a3cb10d0b18b73"} Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.137243 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566508-k9psq"] Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.138465 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566508-k9psq" Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.140338 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.141555 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.144363 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.171452 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566508-k9psq"] Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.173307 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czzzl\" (UniqueName: \"kubernetes.io/projected/4fef1c76-0cdd-413f-b290-dabfa3b59259-kube-api-access-czzzl\") pod \"auto-csr-approver-29566508-k9psq\" (UID: \"4fef1c76-0cdd-413f-b290-dabfa3b59259\") " pod="openshift-infra/auto-csr-approver-29566508-k9psq" Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.274211 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czzzl\" (UniqueName: \"kubernetes.io/projected/4fef1c76-0cdd-413f-b290-dabfa3b59259-kube-api-access-czzzl\") pod \"auto-csr-approver-29566508-k9psq\" (UID: \"4fef1c76-0cdd-413f-b290-dabfa3b59259\") " pod="openshift-infra/auto-csr-approver-29566508-k9psq" Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.305957 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czzzl\" (UniqueName: \"kubernetes.io/projected/4fef1c76-0cdd-413f-b290-dabfa3b59259-kube-api-access-czzzl\") pod \"auto-csr-approver-29566508-k9psq\" (UID: \"4fef1c76-0cdd-413f-b290-dabfa3b59259\") " 
pod="openshift-infra/auto-csr-approver-29566508-k9psq" Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.455940 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566508-k9psq" Mar 20 07:08:00 crc kubenswrapper[4971]: I0320 07:08:00.711071 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566508-k9psq"] Mar 20 07:08:01 crc kubenswrapper[4971]: I0320 07:08:01.000795 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566508-k9psq" event={"ID":"4fef1c76-0cdd-413f-b290-dabfa3b59259","Type":"ContainerStarted","Data":"83c9edc423c5ae902a02256cf45e3986d2444344343816fc0884e41cd2566094"} Mar 20 07:08:01 crc kubenswrapper[4971]: I0320 07:08:01.003533 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" event={"ID":"0c8a7ee5-0ec6-4b29-9061-f23731026324","Type":"ContainerStarted","Data":"0512a23602d797a23a9aee5f2f3c372ad5c8248e5bffacff3cab157bca1c8c49"} Mar 20 07:08:01 crc kubenswrapper[4971]: I0320 07:08:01.035200 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-s4t7q" podStartSLOduration=2.529497762 podStartE2EDuration="6.035175708s" podCreationTimestamp="2026-03-20 07:07:55 +0000 UTC" firstStartedPulling="2026-03-20 07:07:56.45595626 +0000 UTC m=+1098.435830438" lastFinishedPulling="2026-03-20 07:07:59.961634246 +0000 UTC m=+1101.941508384" observedRunningTime="2026-03-20 07:08:01.028911024 +0000 UTC m=+1103.008785202" watchObservedRunningTime="2026-03-20 07:08:01.035175708 +0000 UTC m=+1103.015049856" Mar 20 07:08:02 crc kubenswrapper[4971]: I0320 07:08:02.010561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566508-k9psq" 
event={"ID":"4fef1c76-0cdd-413f-b290-dabfa3b59259","Type":"ContainerStarted","Data":"ae8d76fe41c2f044d059190b95de529e5d48b241720e74b694699ce489d97a88"} Mar 20 07:08:02 crc kubenswrapper[4971]: I0320 07:08:02.029429 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566508-k9psq" podStartSLOduration=1.060954554 podStartE2EDuration="2.029409397s" podCreationTimestamp="2026-03-20 07:08:00 +0000 UTC" firstStartedPulling="2026-03-20 07:08:00.732320303 +0000 UTC m=+1102.712194451" lastFinishedPulling="2026-03-20 07:08:01.700775156 +0000 UTC m=+1103.680649294" observedRunningTime="2026-03-20 07:08:02.023702916 +0000 UTC m=+1104.003577064" watchObservedRunningTime="2026-03-20 07:08:02.029409397 +0000 UTC m=+1104.009283545" Mar 20 07:08:03 crc kubenswrapper[4971]: I0320 07:08:03.017007 4971 generic.go:334] "Generic (PLEG): container finished" podID="4fef1c76-0cdd-413f-b290-dabfa3b59259" containerID="ae8d76fe41c2f044d059190b95de529e5d48b241720e74b694699ce489d97a88" exitCode=0 Mar 20 07:08:03 crc kubenswrapper[4971]: I0320 07:08:03.017073 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566508-k9psq" event={"ID":"4fef1c76-0cdd-413f-b290-dabfa3b59259","Type":"ContainerDied","Data":"ae8d76fe41c2f044d059190b95de529e5d48b241720e74b694699ce489d97a88"} Mar 20 07:08:04 crc kubenswrapper[4971]: I0320 07:08:04.322834 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566508-k9psq" Mar 20 07:08:04 crc kubenswrapper[4971]: I0320 07:08:04.432172 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czzzl\" (UniqueName: \"kubernetes.io/projected/4fef1c76-0cdd-413f-b290-dabfa3b59259-kube-api-access-czzzl\") pod \"4fef1c76-0cdd-413f-b290-dabfa3b59259\" (UID: \"4fef1c76-0cdd-413f-b290-dabfa3b59259\") " Mar 20 07:08:04 crc kubenswrapper[4971]: I0320 07:08:04.457852 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fef1c76-0cdd-413f-b290-dabfa3b59259-kube-api-access-czzzl" (OuterVolumeSpecName: "kube-api-access-czzzl") pod "4fef1c76-0cdd-413f-b290-dabfa3b59259" (UID: "4fef1c76-0cdd-413f-b290-dabfa3b59259"). InnerVolumeSpecName "kube-api-access-czzzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:08:04 crc kubenswrapper[4971]: I0320 07:08:04.533376 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czzzl\" (UniqueName: \"kubernetes.io/projected/4fef1c76-0cdd-413f-b290-dabfa3b59259-kube-api-access-czzzl\") on node \"crc\" DevicePath \"\"" Mar 20 07:08:05 crc kubenswrapper[4971]: I0320 07:08:05.040029 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566508-k9psq" event={"ID":"4fef1c76-0cdd-413f-b290-dabfa3b59259","Type":"ContainerDied","Data":"83c9edc423c5ae902a02256cf45e3986d2444344343816fc0884e41cd2566094"} Mar 20 07:08:05 crc kubenswrapper[4971]: I0320 07:08:05.040096 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c9edc423c5ae902a02256cf45e3986d2444344343816fc0884e41cd2566094" Mar 20 07:08:05 crc kubenswrapper[4971]: I0320 07:08:05.040190 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566508-k9psq" Mar 20 07:08:05 crc kubenswrapper[4971]: I0320 07:08:05.112480 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566502-k7wqs"] Mar 20 07:08:05 crc kubenswrapper[4971]: I0320 07:08:05.115427 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566502-k7wqs"] Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.285574 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-6zp78"] Mar 20 07:08:06 crc kubenswrapper[4971]: E0320 07:08:06.286127 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fef1c76-0cdd-413f-b290-dabfa3b59259" containerName="oc" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.286140 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fef1c76-0cdd-413f-b290-dabfa3b59259" containerName="oc" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.286247 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fef1c76-0cdd-413f-b290-dabfa3b59259" containerName="oc" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.286654 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.289424 4971 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-sv79t" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.289847 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.290742 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.309935 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-6zp78"] Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.356312 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f7jm\" (UniqueName: \"kubernetes.io/projected/7e167de7-5293-49db-818d-813af89b693e-kube-api-access-2f7jm\") pod \"cert-manager-webhook-6888856db4-6zp78\" (UID: \"7e167de7-5293-49db-818d-813af89b693e\") " pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.356377 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e167de7-5293-49db-818d-813af89b693e-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6zp78\" (UID: \"7e167de7-5293-49db-818d-813af89b693e\") " pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.457736 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f7jm\" (UniqueName: \"kubernetes.io/projected/7e167de7-5293-49db-818d-813af89b693e-kube-api-access-2f7jm\") pod \"cert-manager-webhook-6888856db4-6zp78\" (UID: 
\"7e167de7-5293-49db-818d-813af89b693e\") " pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.457828 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e167de7-5293-49db-818d-813af89b693e-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6zp78\" (UID: \"7e167de7-5293-49db-818d-813af89b693e\") " pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.478384 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f7jm\" (UniqueName: \"kubernetes.io/projected/7e167de7-5293-49db-818d-813af89b693e-kube-api-access-2f7jm\") pod \"cert-manager-webhook-6888856db4-6zp78\" (UID: \"7e167de7-5293-49db-818d-813af89b693e\") " pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.483244 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e167de7-5293-49db-818d-813af89b693e-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6zp78\" (UID: \"7e167de7-5293-49db-818d-813af89b693e\") " pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.611690 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.642861 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-5hknf"] Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.643708 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.646003 4971 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4l2sh" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.661475 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-5hknf"] Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.745333 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183ddb6f-4d37-4c8b-bbc8-3d152c9f968c" path="/var/lib/kubelet/pods/183ddb6f-4d37-4c8b-bbc8-3d152c9f968c/volumes" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.761402 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fabc83f0-ade8-49c7-a868-3de1566e8280-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-5hknf\" (UID: \"fabc83f0-ade8-49c7-a868-3de1566e8280\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.761475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvf8p\" (UniqueName: \"kubernetes.io/projected/fabc83f0-ade8-49c7-a868-3de1566e8280-kube-api-access-qvf8p\") pod \"cert-manager-cainjector-5545bd876-5hknf\" (UID: \"fabc83f0-ade8-49c7-a868-3de1566e8280\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.863088 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fabc83f0-ade8-49c7-a868-3de1566e8280-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-5hknf\" (UID: \"fabc83f0-ade8-49c7-a868-3de1566e8280\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" Mar 20 
07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.863441 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvf8p\" (UniqueName: \"kubernetes.io/projected/fabc83f0-ade8-49c7-a868-3de1566e8280-kube-api-access-qvf8p\") pod \"cert-manager-cainjector-5545bd876-5hknf\" (UID: \"fabc83f0-ade8-49c7-a868-3de1566e8280\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.882768 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-6zp78"] Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.892788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvf8p\" (UniqueName: \"kubernetes.io/projected/fabc83f0-ade8-49c7-a868-3de1566e8280-kube-api-access-qvf8p\") pod \"cert-manager-cainjector-5545bd876-5hknf\" (UID: \"fabc83f0-ade8-49c7-a868-3de1566e8280\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.893006 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fabc83f0-ade8-49c7-a868-3de1566e8280-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-5hknf\" (UID: \"fabc83f0-ade8-49c7-a868-3de1566e8280\") " pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" Mar 20 07:08:06 crc kubenswrapper[4971]: I0320 07:08:06.999742 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" Mar 20 07:08:07 crc kubenswrapper[4971]: I0320 07:08:07.054901 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" event={"ID":"7e167de7-5293-49db-818d-813af89b693e","Type":"ContainerStarted","Data":"722a1558a838808c834fe27384985e7322895f6e33c476707eeb2b00a23307cb"} Mar 20 07:08:07 crc kubenswrapper[4971]: I0320 07:08:07.205935 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-5hknf"] Mar 20 07:08:07 crc kubenswrapper[4971]: W0320 07:08:07.215868 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfabc83f0_ade8_49c7_a868_3de1566e8280.slice/crio-0193b83a9fce11d0b1cbbbc7e6695863489b43914758e9a2ab78848cc0b51588 WatchSource:0}: Error finding container 0193b83a9fce11d0b1cbbbc7e6695863489b43914758e9a2ab78848cc0b51588: Status 404 returned error can't find the container with id 0193b83a9fce11d0b1cbbbc7e6695863489b43914758e9a2ab78848cc0b51588 Mar 20 07:08:08 crc kubenswrapper[4971]: I0320 07:08:08.062634 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" event={"ID":"fabc83f0-ade8-49c7-a868-3de1566e8280","Type":"ContainerStarted","Data":"0193b83a9fce11d0b1cbbbc7e6695863489b43914758e9a2ab78848cc0b51588"} Mar 20 07:08:12 crc kubenswrapper[4971]: I0320 07:08:12.097159 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" event={"ID":"7e167de7-5293-49db-818d-813af89b693e","Type":"ContainerStarted","Data":"26404a4969e0ca46f6eb8ea624e17177a84a3a6cde278fa3c9416106c5a91fcf"} Mar 20 07:08:12 crc kubenswrapper[4971]: I0320 07:08:12.097662 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:12 crc 
kubenswrapper[4971]: I0320 07:08:12.099071 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" event={"ID":"fabc83f0-ade8-49c7-a868-3de1566e8280","Type":"ContainerStarted","Data":"df1ff6f21e6fba0fd272f6e06056d98bed1018570f8c300726357b1afc15bfff"} Mar 20 07:08:12 crc kubenswrapper[4971]: I0320 07:08:12.113369 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" podStartSLOduration=1.797888371 podStartE2EDuration="6.113354517s" podCreationTimestamp="2026-03-20 07:08:06 +0000 UTC" firstStartedPulling="2026-03-20 07:08:06.891815306 +0000 UTC m=+1108.871689454" lastFinishedPulling="2026-03-20 07:08:11.207281412 +0000 UTC m=+1113.187155600" observedRunningTime="2026-03-20 07:08:12.110083146 +0000 UTC m=+1114.089957284" watchObservedRunningTime="2026-03-20 07:08:12.113354517 +0000 UTC m=+1114.093228655" Mar 20 07:08:12 crc kubenswrapper[4971]: I0320 07:08:12.126246 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-5hknf" podStartSLOduration=2.147704224 podStartE2EDuration="6.126232735s" podCreationTimestamp="2026-03-20 07:08:06 +0000 UTC" firstStartedPulling="2026-03-20 07:08:07.219396431 +0000 UTC m=+1109.199270579" lastFinishedPulling="2026-03-20 07:08:11.197924952 +0000 UTC m=+1113.177799090" observedRunningTime="2026-03-20 07:08:12.12441105 +0000 UTC m=+1114.104285198" watchObservedRunningTime="2026-03-20 07:08:12.126232735 +0000 UTC m=+1114.106106873" Mar 20 07:08:16 crc kubenswrapper[4971]: I0320 07:08:16.614935 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-6zp78" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.034284 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-kmsrl"] Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 
07:08:23.036514 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-kmsrl" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.038775 4971 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-fd5ff" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.055117 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-kmsrl"] Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.131914 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwqlc\" (UniqueName: \"kubernetes.io/projected/fd1800e7-08ec-4ea1-80b2-2958aef7113b-kube-api-access-kwqlc\") pod \"cert-manager-545d4d4674-kmsrl\" (UID: \"fd1800e7-08ec-4ea1-80b2-2958aef7113b\") " pod="cert-manager/cert-manager-545d4d4674-kmsrl" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.132091 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd1800e7-08ec-4ea1-80b2-2958aef7113b-bound-sa-token\") pod \"cert-manager-545d4d4674-kmsrl\" (UID: \"fd1800e7-08ec-4ea1-80b2-2958aef7113b\") " pod="cert-manager/cert-manager-545d4d4674-kmsrl" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.233641 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd1800e7-08ec-4ea1-80b2-2958aef7113b-bound-sa-token\") pod \"cert-manager-545d4d4674-kmsrl\" (UID: \"fd1800e7-08ec-4ea1-80b2-2958aef7113b\") " pod="cert-manager/cert-manager-545d4d4674-kmsrl" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.233793 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwqlc\" (UniqueName: \"kubernetes.io/projected/fd1800e7-08ec-4ea1-80b2-2958aef7113b-kube-api-access-kwqlc\") pod 
\"cert-manager-545d4d4674-kmsrl\" (UID: \"fd1800e7-08ec-4ea1-80b2-2958aef7113b\") " pod="cert-manager/cert-manager-545d4d4674-kmsrl" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.253994 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd1800e7-08ec-4ea1-80b2-2958aef7113b-bound-sa-token\") pod \"cert-manager-545d4d4674-kmsrl\" (UID: \"fd1800e7-08ec-4ea1-80b2-2958aef7113b\") " pod="cert-manager/cert-manager-545d4d4674-kmsrl" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.257298 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwqlc\" (UniqueName: \"kubernetes.io/projected/fd1800e7-08ec-4ea1-80b2-2958aef7113b-kube-api-access-kwqlc\") pod \"cert-manager-545d4d4674-kmsrl\" (UID: \"fd1800e7-08ec-4ea1-80b2-2958aef7113b\") " pod="cert-manager/cert-manager-545d4d4674-kmsrl" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.365147 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-kmsrl" Mar 20 07:08:23 crc kubenswrapper[4971]: I0320 07:08:23.787375 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-kmsrl"] Mar 20 07:08:23 crc kubenswrapper[4971]: W0320 07:08:23.803873 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd1800e7_08ec_4ea1_80b2_2958aef7113b.slice/crio-967501344ec79ab1259e4df14bc72920094bb4203b1c106482b6ed087c14dee8 WatchSource:0}: Error finding container 967501344ec79ab1259e4df14bc72920094bb4203b1c106482b6ed087c14dee8: Status 404 returned error can't find the container with id 967501344ec79ab1259e4df14bc72920094bb4203b1c106482b6ed087c14dee8 Mar 20 07:08:24 crc kubenswrapper[4971]: I0320 07:08:24.176352 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-kmsrl" event={"ID":"fd1800e7-08ec-4ea1-80b2-2958aef7113b","Type":"ContainerStarted","Data":"3d7e30efad77be2083ec4d8aa9986116b8c05264ff2dccc7406163a7e654aa68"} Mar 20 07:08:24 crc kubenswrapper[4971]: I0320 07:08:24.176395 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-kmsrl" event={"ID":"fd1800e7-08ec-4ea1-80b2-2958aef7113b","Type":"ContainerStarted","Data":"967501344ec79ab1259e4df14bc72920094bb4203b1c106482b6ed087c14dee8"} Mar 20 07:08:24 crc kubenswrapper[4971]: I0320 07:08:24.193073 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-kmsrl" podStartSLOduration=1.193053754 podStartE2EDuration="1.193053754s" podCreationTimestamp="2026-03-20 07:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:08:24.19001863 +0000 UTC m=+1126.169892768" watchObservedRunningTime="2026-03-20 07:08:24.193053754 +0000 UTC m=+1126.172927892" Mar 
20 07:08:29 crc kubenswrapper[4971]: I0320 07:08:29.805170 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f5h6r"] Mar 20 07:08:29 crc kubenswrapper[4971]: I0320 07:08:29.808311 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f5h6r" Mar 20 07:08:29 crc kubenswrapper[4971]: I0320 07:08:29.820377 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-n6pmh" Mar 20 07:08:29 crc kubenswrapper[4971]: I0320 07:08:29.820814 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 07:08:29 crc kubenswrapper[4971]: I0320 07:08:29.821070 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 07:08:29 crc kubenswrapper[4971]: I0320 07:08:29.855722 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f5h6r"] Mar 20 07:08:29 crc kubenswrapper[4971]: I0320 07:08:29.919389 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbptt\" (UniqueName: \"kubernetes.io/projected/e45e48a5-c0a2-45e0-8299-8149c031263d-kube-api-access-hbptt\") pod \"openstack-operator-index-f5h6r\" (UID: \"e45e48a5-c0a2-45e0-8299-8149c031263d\") " pod="openstack-operators/openstack-operator-index-f5h6r" Mar 20 07:08:30 crc kubenswrapper[4971]: I0320 07:08:30.020768 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbptt\" (UniqueName: \"kubernetes.io/projected/e45e48a5-c0a2-45e0-8299-8149c031263d-kube-api-access-hbptt\") pod \"openstack-operator-index-f5h6r\" (UID: \"e45e48a5-c0a2-45e0-8299-8149c031263d\") " pod="openstack-operators/openstack-operator-index-f5h6r" Mar 20 07:08:30 crc kubenswrapper[4971]: I0320 
07:08:30.038325 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbptt\" (UniqueName: \"kubernetes.io/projected/e45e48a5-c0a2-45e0-8299-8149c031263d-kube-api-access-hbptt\") pod \"openstack-operator-index-f5h6r\" (UID: \"e45e48a5-c0a2-45e0-8299-8149c031263d\") " pod="openstack-operators/openstack-operator-index-f5h6r" Mar 20 07:08:30 crc kubenswrapper[4971]: I0320 07:08:30.180386 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f5h6r" Mar 20 07:08:30 crc kubenswrapper[4971]: I0320 07:08:30.575515 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f5h6r"] Mar 20 07:08:30 crc kubenswrapper[4971]: W0320 07:08:30.592090 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45e48a5_c0a2_45e0_8299_8149c031263d.slice/crio-edda9c69d47b017e864a1f733ad37178158811c89fb4f4ebc5712c65287daf6f WatchSource:0}: Error finding container edda9c69d47b017e864a1f733ad37178158811c89fb4f4ebc5712c65287daf6f: Status 404 returned error can't find the container with id edda9c69d47b017e864a1f733ad37178158811c89fb4f4ebc5712c65287daf6f Mar 20 07:08:31 crc kubenswrapper[4971]: I0320 07:08:31.231838 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f5h6r" event={"ID":"e45e48a5-c0a2-45e0-8299-8149c031263d","Type":"ContainerStarted","Data":"edda9c69d47b017e864a1f733ad37178158811c89fb4f4ebc5712c65287daf6f"} Mar 20 07:08:32 crc kubenswrapper[4971]: I0320 07:08:32.244032 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f5h6r" event={"ID":"e45e48a5-c0a2-45e0-8299-8149c031263d","Type":"ContainerStarted","Data":"97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1"} Mar 20 07:08:32 crc kubenswrapper[4971]: I0320 07:08:32.282477 4971 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f5h6r" podStartSLOduration=2.5956274390000003 podStartE2EDuration="3.28244711s" podCreationTimestamp="2026-03-20 07:08:29 +0000 UTC" firstStartedPulling="2026-03-20 07:08:30.597386132 +0000 UTC m=+1132.577260310" lastFinishedPulling="2026-03-20 07:08:31.284205843 +0000 UTC m=+1133.264079981" observedRunningTime="2026-03-20 07:08:32.269706946 +0000 UTC m=+1134.249581144" watchObservedRunningTime="2026-03-20 07:08:32.28244711 +0000 UTC m=+1134.262321278" Mar 20 07:08:33 crc kubenswrapper[4971]: I0320 07:08:33.379655 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-f5h6r"] Mar 20 07:08:33 crc kubenswrapper[4971]: I0320 07:08:33.986147 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-k8spn"] Mar 20 07:08:33 crc kubenswrapper[4971]: I0320 07:08:33.987596 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 07:08:33 crc kubenswrapper[4971]: I0320 07:08:33.994515 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k8spn"] Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.089572 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2cx6\" (UniqueName: \"kubernetes.io/projected/956174ac-8d4d-4dcf-90ad-79ee8322ebdb-kube-api-access-q2cx6\") pod \"openstack-operator-index-k8spn\" (UID: \"956174ac-8d4d-4dcf-90ad-79ee8322ebdb\") " pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.193228 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2cx6\" (UniqueName: \"kubernetes.io/projected/956174ac-8d4d-4dcf-90ad-79ee8322ebdb-kube-api-access-q2cx6\") pod \"openstack-operator-index-k8spn\" (UID: \"956174ac-8d4d-4dcf-90ad-79ee8322ebdb\") " pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.224633 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2cx6\" (UniqueName: \"kubernetes.io/projected/956174ac-8d4d-4dcf-90ad-79ee8322ebdb-kube-api-access-q2cx6\") pod \"openstack-operator-index-k8spn\" (UID: \"956174ac-8d4d-4dcf-90ad-79ee8322ebdb\") " pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.259324 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-f5h6r" podUID="e45e48a5-c0a2-45e0-8299-8149c031263d" containerName="registry-server" containerID="cri-o://97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1" gracePeriod=2 Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.317865 4971 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 07:08:34 crc kubenswrapper[4971]: E0320 07:08:34.381013 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45e48a5_c0a2_45e0_8299_8149c031263d.slice/crio-97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1.scope\": RecentStats: unable to find data in memory cache]" Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.723317 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f5h6r" Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.801480 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbptt\" (UniqueName: \"kubernetes.io/projected/e45e48a5-c0a2-45e0-8299-8149c031263d-kube-api-access-hbptt\") pod \"e45e48a5-c0a2-45e0-8299-8149c031263d\" (UID: \"e45e48a5-c0a2-45e0-8299-8149c031263d\") " Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.810261 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45e48a5-c0a2-45e0-8299-8149c031263d-kube-api-access-hbptt" (OuterVolumeSpecName: "kube-api-access-hbptt") pod "e45e48a5-c0a2-45e0-8299-8149c031263d" (UID: "e45e48a5-c0a2-45e0-8299-8149c031263d"). InnerVolumeSpecName "kube-api-access-hbptt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.812864 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k8spn"] Mar 20 07:08:34 crc kubenswrapper[4971]: I0320 07:08:34.903600 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbptt\" (UniqueName: \"kubernetes.io/projected/e45e48a5-c0a2-45e0-8299-8149c031263d-kube-api-access-hbptt\") on node \"crc\" DevicePath \"\"" Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.272283 4971 generic.go:334] "Generic (PLEG): container finished" podID="e45e48a5-c0a2-45e0-8299-8149c031263d" containerID="97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1" exitCode=0 Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.272344 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f5h6r" Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.272407 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f5h6r" event={"ID":"e45e48a5-c0a2-45e0-8299-8149c031263d","Type":"ContainerDied","Data":"97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1"} Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.272455 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f5h6r" event={"ID":"e45e48a5-c0a2-45e0-8299-8149c031263d","Type":"ContainerDied","Data":"edda9c69d47b017e864a1f733ad37178158811c89fb4f4ebc5712c65287daf6f"} Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.272484 4971 scope.go:117] "RemoveContainer" containerID="97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1" Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.274983 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k8spn" 
event={"ID":"956174ac-8d4d-4dcf-90ad-79ee8322ebdb","Type":"ContainerStarted","Data":"2a59d94403343dd99e74e55a50459b93735b0242e4972325577c91e42017f4f4"} Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.311032 4971 scope.go:117] "RemoveContainer" containerID="97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1" Mar 20 07:08:35 crc kubenswrapper[4971]: E0320 07:08:35.312670 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1\": container with ID starting with 97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1 not found: ID does not exist" containerID="97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1" Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.312846 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1"} err="failed to get container status \"97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1\": rpc error: code = NotFound desc = could not find container \"97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1\": container with ID starting with 97f0a6c7b61657a444043891e84dad227e2f9bac8e9c34dd4f798fef2d0a90c1 not found: ID does not exist" Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.313854 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-f5h6r"] Mar 20 07:08:35 crc kubenswrapper[4971]: I0320 07:08:35.319829 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-f5h6r"] Mar 20 07:08:36 crc kubenswrapper[4971]: I0320 07:08:36.289950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k8spn" 
event={"ID":"956174ac-8d4d-4dcf-90ad-79ee8322ebdb","Type":"ContainerStarted","Data":"bc34d30daa51777df4ced669a069dc1ad68d5ef07c60d7bb6b787838e521d307"} Mar 20 07:08:36 crc kubenswrapper[4971]: I0320 07:08:36.316104 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-k8spn" podStartSLOduration=2.778855196 podStartE2EDuration="3.316074559s" podCreationTimestamp="2026-03-20 07:08:33 +0000 UTC" firstStartedPulling="2026-03-20 07:08:34.829424803 +0000 UTC m=+1136.809298981" lastFinishedPulling="2026-03-20 07:08:35.366644196 +0000 UTC m=+1137.346518344" observedRunningTime="2026-03-20 07:08:36.314257484 +0000 UTC m=+1138.294131662" watchObservedRunningTime="2026-03-20 07:08:36.316074559 +0000 UTC m=+1138.295948737" Mar 20 07:08:36 crc kubenswrapper[4971]: I0320 07:08:36.746827 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45e48a5-c0a2-45e0-8299-8149c031263d" path="/var/lib/kubelet/pods/e45e48a5-c0a2-45e0-8299-8149c031263d/volumes" Mar 20 07:08:39 crc kubenswrapper[4971]: I0320 07:08:39.887984 4971 scope.go:117] "RemoveContainer" containerID="2e2c9ebd82302778b695f45805ace259226f22720b36a474e1b17992e988632e" Mar 20 07:08:44 crc kubenswrapper[4971]: I0320 07:08:44.318210 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 07:08:44 crc kubenswrapper[4971]: I0320 07:08:44.318710 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 07:08:44 crc kubenswrapper[4971]: I0320 07:08:44.372585 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 07:08:44 crc kubenswrapper[4971]: I0320 07:08:44.417321 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-k8spn" Mar 20 
07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.624520 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl"] Mar 20 07:08:45 crc kubenswrapper[4971]: E0320 07:08:45.625126 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45e48a5-c0a2-45e0-8299-8149c031263d" containerName="registry-server" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.625147 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45e48a5-c0a2-45e0-8299-8149c031263d" containerName="registry-server" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.625345 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45e48a5-c0a2-45e0-8299-8149c031263d" containerName="registry-server" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.626649 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.629455 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-42kp2" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.640969 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl"] Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.674907 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.675000 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cv7h\" (UniqueName: \"kubernetes.io/projected/8be29c57-53f2-4dcd-955c-9597cda034bd-kube-api-access-2cv7h\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.675043 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.775884 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.775956 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cv7h\" (UniqueName: \"kubernetes.io/projected/8be29c57-53f2-4dcd-955c-9597cda034bd-kube-api-access-2cv7h\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.775992 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.776385 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.776424 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.805631 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cv7h\" (UniqueName: \"kubernetes.io/projected/8be29c57-53f2-4dcd-955c-9597cda034bd-kube-api-access-2cv7h\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:45 crc kubenswrapper[4971]: I0320 07:08:45.954422 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:46 crc kubenswrapper[4971]: I0320 07:08:46.490450 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl"] Mar 20 07:08:46 crc kubenswrapper[4971]: W0320 07:08:46.499888 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8be29c57_53f2_4dcd_955c_9597cda034bd.slice/crio-c99a7faa7019137116b7f7abfd6fe2ab5a350b2615ebecad3788c5cf7eb7750c WatchSource:0}: Error finding container c99a7faa7019137116b7f7abfd6fe2ab5a350b2615ebecad3788c5cf7eb7750c: Status 404 returned error can't find the container with id c99a7faa7019137116b7f7abfd6fe2ab5a350b2615ebecad3788c5cf7eb7750c Mar 20 07:08:47 crc kubenswrapper[4971]: I0320 07:08:47.395708 4971 generic.go:334] "Generic (PLEG): container finished" podID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerID="c542634ab8b4dacfc78ee0f77df87f592ba480f5d2226c7eeb816599fa296f9a" exitCode=0 Mar 20 07:08:47 crc kubenswrapper[4971]: I0320 07:08:47.395800 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" event={"ID":"8be29c57-53f2-4dcd-955c-9597cda034bd","Type":"ContainerDied","Data":"c542634ab8b4dacfc78ee0f77df87f592ba480f5d2226c7eeb816599fa296f9a"} Mar 20 07:08:47 crc kubenswrapper[4971]: I0320 07:08:47.396171 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" event={"ID":"8be29c57-53f2-4dcd-955c-9597cda034bd","Type":"ContainerStarted","Data":"c99a7faa7019137116b7f7abfd6fe2ab5a350b2615ebecad3788c5cf7eb7750c"} Mar 20 07:08:48 crc kubenswrapper[4971]: I0320 07:08:48.407592 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" event={"ID":"8be29c57-53f2-4dcd-955c-9597cda034bd","Type":"ContainerStarted","Data":"8b65a73abfc1ae62a6cefe0f6d923cc5aa6bea370c4d4f846d615cecef600c65"} Mar 20 07:08:49 crc kubenswrapper[4971]: I0320 07:08:49.417511 4971 generic.go:334] "Generic (PLEG): container finished" podID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerID="8b65a73abfc1ae62a6cefe0f6d923cc5aa6bea370c4d4f846d615cecef600c65" exitCode=0 Mar 20 07:08:49 crc kubenswrapper[4971]: I0320 07:08:49.417575 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" event={"ID":"8be29c57-53f2-4dcd-955c-9597cda034bd","Type":"ContainerDied","Data":"8b65a73abfc1ae62a6cefe0f6d923cc5aa6bea370c4d4f846d615cecef600c65"} Mar 20 07:08:50 crc kubenswrapper[4971]: I0320 07:08:50.162800 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:08:50 crc kubenswrapper[4971]: I0320 07:08:50.163162 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:08:50 crc kubenswrapper[4971]: I0320 07:08:50.428047 4971 generic.go:334] "Generic (PLEG): container finished" podID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerID="fefe33f01e8e30c2be16a04521ec5f7e539a13e96e6ca64e357ee129e6c905d0" exitCode=0 Mar 20 07:08:50 crc kubenswrapper[4971]: I0320 07:08:50.428106 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" event={"ID":"8be29c57-53f2-4dcd-955c-9597cda034bd","Type":"ContainerDied","Data":"fefe33f01e8e30c2be16a04521ec5f7e539a13e96e6ca64e357ee129e6c905d0"} Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.647897 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.793715 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-util\") pod \"8be29c57-53f2-4dcd-955c-9597cda034bd\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.793839 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-bundle\") pod \"8be29c57-53f2-4dcd-955c-9597cda034bd\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.793875 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cv7h\" (UniqueName: \"kubernetes.io/projected/8be29c57-53f2-4dcd-955c-9597cda034bd-kube-api-access-2cv7h\") pod \"8be29c57-53f2-4dcd-955c-9597cda034bd\" (UID: \"8be29c57-53f2-4dcd-955c-9597cda034bd\") " Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.794481 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-bundle" (OuterVolumeSpecName: "bundle") pod "8be29c57-53f2-4dcd-955c-9597cda034bd" (UID: "8be29c57-53f2-4dcd-955c-9597cda034bd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.799828 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be29c57-53f2-4dcd-955c-9597cda034bd-kube-api-access-2cv7h" (OuterVolumeSpecName: "kube-api-access-2cv7h") pod "8be29c57-53f2-4dcd-955c-9597cda034bd" (UID: "8be29c57-53f2-4dcd-955c-9597cda034bd"). InnerVolumeSpecName "kube-api-access-2cv7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.825836 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-util" (OuterVolumeSpecName: "util") pod "8be29c57-53f2-4dcd-955c-9597cda034bd" (UID: "8be29c57-53f2-4dcd-955c-9597cda034bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.895033 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-util\") on node \"crc\" DevicePath \"\"" Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.895064 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8be29c57-53f2-4dcd-955c-9597cda034bd-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:08:51 crc kubenswrapper[4971]: I0320 07:08:51.895073 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cv7h\" (UniqueName: \"kubernetes.io/projected/8be29c57-53f2-4dcd-955c-9597cda034bd-kube-api-access-2cv7h\") on node \"crc\" DevicePath \"\"" Mar 20 07:08:52 crc kubenswrapper[4971]: I0320 07:08:52.442440 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" 
event={"ID":"8be29c57-53f2-4dcd-955c-9597cda034bd","Type":"ContainerDied","Data":"c99a7faa7019137116b7f7abfd6fe2ab5a350b2615ebecad3788c5cf7eb7750c"} Mar 20 07:08:52 crc kubenswrapper[4971]: I0320 07:08:52.442499 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c99a7faa7019137116b7f7abfd6fe2ab5a350b2615ebecad3788c5cf7eb7750c" Mar 20 07:08:52 crc kubenswrapper[4971]: I0320 07:08:52.442593 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl" Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.832674 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg"] Mar 20 07:08:54 crc kubenswrapper[4971]: E0320 07:08:54.834628 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerName="util" Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.834751 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerName="util" Mar 20 07:08:54 crc kubenswrapper[4971]: E0320 07:08:54.834898 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerName="pull" Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.835050 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerName="pull" Mar 20 07:08:54 crc kubenswrapper[4971]: E0320 07:08:54.835170 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerName="extract" Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.835294 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerName="extract" Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.835528 4971 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8be29c57-53f2-4dcd-955c-9597cda034bd" containerName="extract" Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.836168 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.839769 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-sc59v" Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.868048 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg"] Mar 20 07:08:54 crc kubenswrapper[4971]: I0320 07:08:54.944688 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55f2q\" (UniqueName: \"kubernetes.io/projected/e879631c-0237-44c7-8682-b615034de536-kube-api-access-55f2q\") pod \"openstack-operator-controller-init-b85c4d696-9jzcg\" (UID: \"e879631c-0237-44c7-8682-b615034de536\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" Mar 20 07:08:55 crc kubenswrapper[4971]: I0320 07:08:55.046176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55f2q\" (UniqueName: \"kubernetes.io/projected/e879631c-0237-44c7-8682-b615034de536-kube-api-access-55f2q\") pod \"openstack-operator-controller-init-b85c4d696-9jzcg\" (UID: \"e879631c-0237-44c7-8682-b615034de536\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" Mar 20 07:08:55 crc kubenswrapper[4971]: I0320 07:08:55.072278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55f2q\" (UniqueName: \"kubernetes.io/projected/e879631c-0237-44c7-8682-b615034de536-kube-api-access-55f2q\") pod \"openstack-operator-controller-init-b85c4d696-9jzcg\" 
(UID: \"e879631c-0237-44c7-8682-b615034de536\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" Mar 20 07:08:55 crc kubenswrapper[4971]: I0320 07:08:55.155484 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" Mar 20 07:08:55 crc kubenswrapper[4971]: I0320 07:08:55.615746 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg"] Mar 20 07:08:55 crc kubenswrapper[4971]: W0320 07:08:55.619256 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode879631c_0237_44c7_8682_b615034de536.slice/crio-82b8e42221c7311b2ca761edb4f8d6b99ecef0ca3407b68a1b405a981cf8ea8d WatchSource:0}: Error finding container 82b8e42221c7311b2ca761edb4f8d6b99ecef0ca3407b68a1b405a981cf8ea8d: Status 404 returned error can't find the container with id 82b8e42221c7311b2ca761edb4f8d6b99ecef0ca3407b68a1b405a981cf8ea8d Mar 20 07:08:56 crc kubenswrapper[4971]: I0320 07:08:56.471291 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" event={"ID":"e879631c-0237-44c7-8682-b615034de536","Type":"ContainerStarted","Data":"82b8e42221c7311b2ca761edb4f8d6b99ecef0ca3407b68a1b405a981cf8ea8d"} Mar 20 07:09:00 crc kubenswrapper[4971]: I0320 07:09:00.513444 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" event={"ID":"e879631c-0237-44c7-8682-b615034de536","Type":"ContainerStarted","Data":"d103ed9ac421ad940392f20ae9eeb1a937bdc74633de8c49230c0ee7e412a689"} Mar 20 07:09:00 crc kubenswrapper[4971]: I0320 07:09:00.514273 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" Mar 20 07:09:00 
crc kubenswrapper[4971]: I0320 07:09:00.551324 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" podStartSLOduration=2.4803971320000002 podStartE2EDuration="6.55129344s" podCreationTimestamp="2026-03-20 07:08:54 +0000 UTC" firstStartedPulling="2026-03-20 07:08:55.622367561 +0000 UTC m=+1157.602241699" lastFinishedPulling="2026-03-20 07:08:59.693263869 +0000 UTC m=+1161.673138007" observedRunningTime="2026-03-20 07:09:00.542984365 +0000 UTC m=+1162.522858543" watchObservedRunningTime="2026-03-20 07:09:00.55129344 +0000 UTC m=+1162.531167618" Mar 20 07:09:05 crc kubenswrapper[4971]: I0320 07:09:05.159056 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9jzcg" Mar 20 07:09:20 crc kubenswrapper[4971]: I0320 07:09:20.162454 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:09:20 crc kubenswrapper[4971]: I0320 07:09:20.162913 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.185285 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.186766 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.189041 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.189792 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.190463 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jnt6d" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.204542 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.205345 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.207051 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-k54v8" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.211394 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-p94nb" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.213219 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.221280 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.249673 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.250410 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.253778 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.255917 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nhxm6" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.262880 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.263675 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.264929 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2hbxp" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.271221 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl9rc\" (UniqueName: \"kubernetes.io/projected/1725120f-c2b6-4438-8dc0-0758d75b0ead-kube-api-access-gl9rc\") pod \"cinder-operator-controller-manager-8d58dc466-2g4lc\" (UID: \"1725120f-c2b6-4438-8dc0-0758d75b0ead\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.271277 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-854jh\" (UniqueName: \"kubernetes.io/projected/f478a4fa-f1af-4829-84e7-d42e3518b311-kube-api-access-854jh\") pod \"barbican-operator-controller-manager-59bc569d95-tzs7r\" (UID: \"f478a4fa-f1af-4829-84e7-d42e3518b311\") " 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.271297 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs7hc\" (UniqueName: \"kubernetes.io/projected/f03668cf-d39b-496d-84cf-2f1c80162661-kube-api-access-cs7hc\") pod \"designate-operator-controller-manager-588d4d986b-z4rvv\" (UID: \"f03668cf-d39b-496d-84cf-2f1c80162661\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.271387 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.292291 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.317720 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.318532 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.319939 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vn788" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.320758 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.321853 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.322627 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.332017 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.333845 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xp6s8" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.348047 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.349355 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.354578 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.362590 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.363969 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-pqmv9" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.372309 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9rc\" (UniqueName: \"kubernetes.io/projected/1725120f-c2b6-4438-8dc0-0758d75b0ead-kube-api-access-gl9rc\") pod \"cinder-operator-controller-manager-8d58dc466-2g4lc\" (UID: \"1725120f-c2b6-4438-8dc0-0758d75b0ead\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.372378 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-854jh\" (UniqueName: \"kubernetes.io/projected/f478a4fa-f1af-4829-84e7-d42e3518b311-kube-api-access-854jh\") pod \"barbican-operator-controller-manager-59bc569d95-tzs7r\" (UID: \"f478a4fa-f1af-4829-84e7-d42e3518b311\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.372408 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs7hc\" (UniqueName: \"kubernetes.io/projected/f03668cf-d39b-496d-84cf-2f1c80162661-kube-api-access-cs7hc\") pod \"designate-operator-controller-manager-588d4d986b-z4rvv\" (UID: 
\"f03668cf-d39b-496d-84cf-2f1c80162661\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.372439 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cd9k\" (UniqueName: \"kubernetes.io/projected/23b236ba-b8a6-4025-a093-cd3e7ccce1a2-kube-api-access-5cd9k\") pod \"glance-operator-controller-manager-79df6bcc97-6z7mk\" (UID: \"23b236ba-b8a6-4025-a093-cd3e7ccce1a2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.372486 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.372525 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlssq\" (UniqueName: \"kubernetes.io/projected/f899ae6a-129e-4909-83b2-64f4a270d3aa-kube-api-access-xlssq\") pod \"heat-operator-controller-manager-67dd5f86f5-lsfqr\" (UID: \"f899ae6a-129e-4909-83b2-64f4a270d3aa\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.372559 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nzd4\" (UniqueName: \"kubernetes.io/projected/42b39042-7dfe-4946-bcfa-7bd424edb636-kube-api-access-6nzd4\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 
07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.372595 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9tvf\" (UniqueName: \"kubernetes.io/projected/8d8e9e17-63d7-4de7-83d4-114caa076970-kube-api-access-c9tvf\") pod \"horizon-operator-controller-manager-8464cc45fb-zx4c5\" (UID: \"8d8e9e17-63d7-4de7-83d4-114caa076970\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.375878 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.377654 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.379995 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-h8bj5" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.383726 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-hm96v"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.384845 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.391964 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.398273 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-kbtwm" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.400325 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-hm96v"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.422696 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-854jh\" (UniqueName: \"kubernetes.io/projected/f478a4fa-f1af-4829-84e7-d42e3518b311-kube-api-access-854jh\") pod \"barbican-operator-controller-manager-59bc569d95-tzs7r\" (UID: \"f478a4fa-f1af-4829-84e7-d42e3518b311\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.424161 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs7hc\" (UniqueName: \"kubernetes.io/projected/f03668cf-d39b-496d-84cf-2f1c80162661-kube-api-access-cs7hc\") pod \"designate-operator-controller-manager-588d4d986b-z4rvv\" (UID: \"f03668cf-d39b-496d-84cf-2f1c80162661\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.428667 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.429534 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.438479 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-btkq6" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.438665 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-25vg8"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.439386 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.441577 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl9rc\" (UniqueName: \"kubernetes.io/projected/1725120f-c2b6-4438-8dc0-0758d75b0ead-kube-api-access-gl9rc\") pod \"cinder-operator-controller-manager-8d58dc466-2g4lc\" (UID: \"1725120f-c2b6-4438-8dc0-0758d75b0ead\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.444250 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r"] Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.445110 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.449080 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-g7nkx"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.449311 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-htr55"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.467958 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.494916 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4gns\" (UniqueName: \"kubernetes.io/projected/524c42fa-40bb-435d-a5ef-b1cd805b5fdb-kube-api-access-b4gns\") pod \"keystone-operator-controller-manager-768b96df4c-6xt47\" (UID: \"524c42fa-40bb-435d-a5ef-b1cd805b5fdb\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495029 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cd9k\" (UniqueName: \"kubernetes.io/projected/23b236ba-b8a6-4025-a093-cd3e7ccce1a2-kube-api-access-5cd9k\") pod \"glance-operator-controller-manager-79df6bcc97-6z7mk\" (UID: \"23b236ba-b8a6-4025-a093-cd3e7ccce1a2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495062 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7n2g\" (UniqueName: \"kubernetes.io/projected/14a1436b-84d2-4739-b16f-1f0cd580a0dd-kube-api-access-m7n2g\") pod \"manila-operator-controller-manager-55f864c847-hm96v\" (UID: \"14a1436b-84d2-4739-b16f-1f0cd580a0dd\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495108 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5gx\" (UniqueName: \"kubernetes.io/projected/b3e37501-4241-4d3b-bd19-9764455a723c-kube-api-access-fq5gx\") pod \"neutron-operator-controller-manager-767865f676-25vg8\" (UID: \"b3e37501-4241-4d3b-bd19-9764455a723c\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495147 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxzhm\" (UniqueName: \"kubernetes.io/projected/804e22b3-ff42-4280-b6b2-fb431b9311b0-kube-api-access-fxzhm\") pod \"mariadb-operator-controller-manager-67ccfc9778-wnjvf\" (UID: \"804e22b3-ff42-4280-b6b2-fb431b9311b0\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495177 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495221 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlssq\" (UniqueName: \"kubernetes.io/projected/f899ae6a-129e-4909-83b2-64f4a270d3aa-kube-api-access-xlssq\") pod \"heat-operator-controller-manager-67dd5f86f5-lsfqr\" (UID: \"f899ae6a-129e-4909-83b2-64f4a270d3aa\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495249 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28qp9\" (UniqueName: \"kubernetes.io/projected/161e42ba-c8e5-40a0-be42-8fba7ea0428c-kube-api-access-28qp9\") pod \"ironic-operator-controller-manager-6f787dddc9-fkl5f\" (UID: \"161e42ba-c8e5-40a0-be42-8fba7ea0428c\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495285 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nzd4\" (UniqueName: \"kubernetes.io/projected/42b39042-7dfe-4946-bcfa-7bd424edb636-kube-api-access-6nzd4\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495340 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9tvf\" (UniqueName: \"kubernetes.io/projected/8d8e9e17-63d7-4de7-83d4-114caa076970-kube-api-access-c9tvf\") pod \"horizon-operator-controller-manager-8464cc45fb-zx4c5\" (UID: \"8d8e9e17-63d7-4de7-83d4-114caa076970\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.495368 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddms\" (UniqueName: \"kubernetes.io/projected/c2cf6754-c345-4bd4-81de-d652e506f0cf-kube-api-access-xddms\") pod \"nova-operator-controller-manager-5d488d59fb-6jc7r\" (UID: \"c2cf6754-c345-4bd4-81de-d652e506f0cf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r"
Mar 20 07:09:33 crc kubenswrapper[4971]: E0320 07:09:33.495884 4971 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 07:09:33 crc kubenswrapper[4971]: E0320 07:09:33.495940 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert podName:42b39042-7dfe-4946-bcfa-7bd424edb636 nodeName:}" failed. No retries permitted until 2026-03-20 07:09:33.995917834 +0000 UTC m=+1195.975791972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert") pod "infra-operator-controller-manager-7b9c774f96-srgtw" (UID: "42b39042-7dfe-4946-bcfa-7bd424edb636") : secret "infra-operator-webhook-server-cert" not found
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.497254 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-25vg8"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.497546 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.509047 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.545517 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.554022 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.555545 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlssq\" (UniqueName: \"kubernetes.io/projected/f899ae6a-129e-4909-83b2-64f4a270d3aa-kube-api-access-xlssq\") pod \"heat-operator-controller-manager-67dd5f86f5-lsfqr\" (UID: \"f899ae6a-129e-4909-83b2-64f4a270d3aa\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.562709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9tvf\" (UniqueName: \"kubernetes.io/projected/8d8e9e17-63d7-4de7-83d4-114caa076970-kube-api-access-c9tvf\") pod \"horizon-operator-controller-manager-8464cc45fb-zx4c5\" (UID: \"8d8e9e17-63d7-4de7-83d4-114caa076970\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.565932 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-d29r5"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.566638 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nzd4\" (UniqueName: \"kubernetes.io/projected/42b39042-7dfe-4946-bcfa-7bd424edb636-kube-api-access-6nzd4\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.567224 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cd9k\" (UniqueName: \"kubernetes.io/projected/23b236ba-b8a6-4025-a093-cd3e7ccce1a2-kube-api-access-5cd9k\") pod \"glance-operator-controller-manager-79df6bcc97-6z7mk\" (UID: \"23b236ba-b8a6-4025-a093-cd3e7ccce1a2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.576252 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.585294 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.600092 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddms\" (UniqueName: \"kubernetes.io/projected/c2cf6754-c345-4bd4-81de-d652e506f0cf-kube-api-access-xddms\") pod \"nova-operator-controller-manager-5d488d59fb-6jc7r\" (UID: \"c2cf6754-c345-4bd4-81de-d652e506f0cf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.600629 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4gns\" (UniqueName: \"kubernetes.io/projected/524c42fa-40bb-435d-a5ef-b1cd805b5fdb-kube-api-access-b4gns\") pod \"keystone-operator-controller-manager-768b96df4c-6xt47\" (UID: \"524c42fa-40bb-435d-a5ef-b1cd805b5fdb\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.600677 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7n2g\" (UniqueName: \"kubernetes.io/projected/14a1436b-84d2-4739-b16f-1f0cd580a0dd-kube-api-access-m7n2g\") pod \"manila-operator-controller-manager-55f864c847-hm96v\" (UID: \"14a1436b-84d2-4739-b16f-1f0cd580a0dd\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.600706 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5gx\" (UniqueName: \"kubernetes.io/projected/b3e37501-4241-4d3b-bd19-9764455a723c-kube-api-access-fq5gx\") pod \"neutron-operator-controller-manager-767865f676-25vg8\" (UID: \"b3e37501-4241-4d3b-bd19-9764455a723c\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.600734 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxzhm\" (UniqueName: \"kubernetes.io/projected/804e22b3-ff42-4280-b6b2-fb431b9311b0-kube-api-access-fxzhm\") pod \"mariadb-operator-controller-manager-67ccfc9778-wnjvf\" (UID: \"804e22b3-ff42-4280-b6b2-fb431b9311b0\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.600787 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28qp9\" (UniqueName: \"kubernetes.io/projected/161e42ba-c8e5-40a0-be42-8fba7ea0428c-kube-api-access-28qp9\") pod \"ironic-operator-controller-manager-6f787dddc9-fkl5f\" (UID: \"161e42ba-c8e5-40a0-be42-8fba7ea0428c\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.600845 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2krx8\" (UniqueName: \"kubernetes.io/projected/0fab6976-2703-4123-9188-3cb727ffe701-kube-api-access-2krx8\") pod \"octavia-operator-controller-manager-5b9f45d989-j26ds\" (UID: \"0fab6976-2703-4123-9188-3cb727ffe701\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.650302 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4gns\" (UniqueName: \"kubernetes.io/projected/524c42fa-40bb-435d-a5ef-b1cd805b5fdb-kube-api-access-b4gns\") pod \"keystone-operator-controller-manager-768b96df4c-6xt47\" (UID: \"524c42fa-40bb-435d-a5ef-b1cd805b5fdb\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.654005 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddms\" (UniqueName: \"kubernetes.io/projected/c2cf6754-c345-4bd4-81de-d652e506f0cf-kube-api-access-xddms\") pod \"nova-operator-controller-manager-5d488d59fb-6jc7r\" (UID: \"c2cf6754-c345-4bd4-81de-d652e506f0cf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.661068 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxzhm\" (UniqueName: \"kubernetes.io/projected/804e22b3-ff42-4280-b6b2-fb431b9311b0-kube-api-access-fxzhm\") pod \"mariadb-operator-controller-manager-67ccfc9778-wnjvf\" (UID: \"804e22b3-ff42-4280-b6b2-fb431b9311b0\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.669457 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5gx\" (UniqueName: \"kubernetes.io/projected/b3e37501-4241-4d3b-bd19-9764455a723c-kube-api-access-fq5gx\") pod \"neutron-operator-controller-manager-767865f676-25vg8\" (UID: \"b3e37501-4241-4d3b-bd19-9764455a723c\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.669545 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.675314 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.675354 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7n2g\" (UniqueName: \"kubernetes.io/projected/14a1436b-84d2-4739-b16f-1f0cd580a0dd-kube-api-access-m7n2g\") pod \"manila-operator-controller-manager-55f864c847-hm96v\" (UID: \"14a1436b-84d2-4739-b16f-1f0cd580a0dd\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.676086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28qp9\" (UniqueName: \"kubernetes.io/projected/161e42ba-c8e5-40a0-be42-8fba7ea0428c-kube-api-access-28qp9\") pod \"ironic-operator-controller-manager-6f787dddc9-fkl5f\" (UID: \"161e42ba-c8e5-40a0-be42-8fba7ea0428c\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.676390 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.686625 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.687844 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.699064 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.699278 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jhc9d"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.702818 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2krx8\" (UniqueName: \"kubernetes.io/projected/0fab6976-2703-4123-9188-3cb727ffe701-kube-api-access-2krx8\") pod \"octavia-operator-controller-manager-5b9f45d989-j26ds\" (UID: \"0fab6976-2703-4123-9188-3cb727ffe701\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.707698 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.713440 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.714223 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.716922 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.720259 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-rgvlm"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.728592 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.730467 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.736664 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.743338 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.743532 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5tfvm"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.746770 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2krx8\" (UniqueName: \"kubernetes.io/projected/0fab6976-2703-4123-9188-3cb727ffe701-kube-api-access-2krx8\") pod \"octavia-operator-controller-manager-5b9f45d989-j26ds\" (UID: \"0fab6976-2703-4123-9188-3cb727ffe701\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.748803 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-nnng5"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.749735 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.753686 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5ths5"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.763788 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.764497 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.766129 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-t99lw"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.771874 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.772529 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-nnng5"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.786665 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.804750 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l698m\" (UniqueName: \"kubernetes.io/projected/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-kube-api-access-l698m\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.804821 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.832588 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.834390 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.838007 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4pjg6"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.843710 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.847623 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.879882 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.882496 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.883407 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.888531 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.889234 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-j9vdn"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.908411 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.909006 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv547\" (UniqueName: \"kubernetes.io/projected/d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b-kube-api-access-gv547\") pod \"swift-operator-controller-manager-c674c5965-nnng5\" (UID: \"d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.909055 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqwnp\" (UniqueName: \"kubernetes.io/projected/a7c0469d-7b51-403e-b190-f42fd2668900-kube-api-access-pqwnp\") pod \"telemetry-operator-controller-manager-d6b694c5-8k5hb\" (UID: \"a7c0469d-7b51-403e-b190-f42fd2668900\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.909103 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mtrt\" (UniqueName: \"kubernetes.io/projected/b9bc9bf8-93d8-4b02-b563-ed18ef89944d-kube-api-access-6mtrt\") pod \"ovn-operator-controller-manager-884679f54-d9nz2\" (UID: \"b9bc9bf8-93d8-4b02-b563-ed18ef89944d\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.909162 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwc8m\" (UniqueName: \"kubernetes.io/projected/1fc90bc6-1bb1-4357-a357-45bd8795ad02-kube-api-access-fwc8m\") pod \"placement-operator-controller-manager-5784578c99-lfdpc\" (UID: \"1fc90bc6-1bb1-4357-a357-45bd8795ad02\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.909199 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l698m\" (UniqueName: \"kubernetes.io/projected/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-kube-api-access-l698m\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"
Mar 20 07:09:33 crc kubenswrapper[4971]: E0320 07:09:33.909961 4971 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 07:09:33 crc kubenswrapper[4971]: E0320 07:09:33.914582 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert podName:13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a nodeName:}" failed. No retries permitted until 2026-03-20 07:09:34.414545442 +0000 UTC m=+1196.394419620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899pz5xg" (UID: "13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.924689 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.929676 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l698m\" (UniqueName: \"kubernetes.io/projected/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-kube-api-access-l698m\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.951445 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.959464 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.960435 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.963916 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vx5bl"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.964106 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.964218 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.969831 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f"
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.988985 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.996413 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6"]
Mar 20 07:09:33 crc kubenswrapper[4971]: I0320 07:09:33.997420 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.000119 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013639 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzq4c\" (UniqueName: \"kubernetes.io/projected/f4b444d6-b24e-4055-b020-c367f5c5a144-kube-api-access-tzq4c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ssscj\" (UID: \"f4b444d6-b24e-4055-b020-c367f5c5a144\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013683 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013711 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv547\" (UniqueName: \"kubernetes.io/projected/d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b-kube-api-access-gv547\") pod \"swift-operator-controller-manager-c674c5965-nnng5\" (UID: \"d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013731 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013748 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfjx\" (UniqueName: \"kubernetes.io/projected/1be5b0dc-cddc-44fb-bb47-d92e3295312c-kube-api-access-xwfjx\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqwnp\" (UniqueName: \"kubernetes.io/projected/a7c0469d-7b51-403e-b190-f42fd2668900-kube-api-access-pqwnp\") pod \"telemetry-operator-controller-manager-d6b694c5-8k5hb\" (UID: \"a7c0469d-7b51-403e-b190-f42fd2668900\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013794 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4hr9\" (UniqueName: \"kubernetes.io/projected/10892b2b-3d7a-4994-8779-8a22ac48ca70-kube-api-access-x4hr9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n2tv6\" (UID: \"10892b2b-3d7a-4994-8779-8a22ac48ca70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mtrt\" (UniqueName: \"kubernetes.io/projected/b9bc9bf8-93d8-4b02-b563-ed18ef89944d-kube-api-access-6mtrt\") pod \"ovn-operator-controller-manager-884679f54-d9nz2\" (UID: \"b9bc9bf8-93d8-4b02-b563-ed18ef89944d\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013841 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013859 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwc8m\" (UniqueName: \"kubernetes.io/projected/1fc90bc6-1bb1-4357-a357-45bd8795ad02-kube-api-access-fwc8m\") pod \"placement-operator-controller-manager-5784578c99-lfdpc\" (UID: \"1fc90bc6-1bb1-4357-a357-45bd8795ad02\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc"
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.013894 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbf8t\" (UniqueName: \"kubernetes.io/projected/d05cc1d7-7da4-4950-93c0-84fd9ea90403-kube-api-access-kbf8t\") pod \"test-operator-controller-manager-5c5cb9c4d7-pjg97\" (UID: \"d05cc1d7-7da4-4950-93c0-84fd9ea90403\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97"
Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.014045 4971 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.014092 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert podName:42b39042-7dfe-4946-bcfa-7bd424edb636 nodeName:}" failed. No retries permitted until 2026-03-20 07:09:35.014075511 +0000 UTC m=+1196.993949649 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert") pod "infra-operator-controller-manager-7b9c774f96-srgtw" (UID: "42b39042-7dfe-4946-bcfa-7bd424edb636") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.014863 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gt7vg" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.064172 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.069499 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwc8m\" (UniqueName: \"kubernetes.io/projected/1fc90bc6-1bb1-4357-a357-45bd8795ad02-kube-api-access-fwc8m\") pod \"placement-operator-controller-manager-5784578c99-lfdpc\" (UID: \"1fc90bc6-1bb1-4357-a357-45bd8795ad02\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.069557 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mtrt\" (UniqueName: \"kubernetes.io/projected/b9bc9bf8-93d8-4b02-b563-ed18ef89944d-kube-api-access-6mtrt\") pod \"ovn-operator-controller-manager-884679f54-d9nz2\" (UID: \"b9bc9bf8-93d8-4b02-b563-ed18ef89944d\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.069710 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv547\" (UniqueName: \"kubernetes.io/projected/d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b-kube-api-access-gv547\") pod \"swift-operator-controller-manager-c674c5965-nnng5\" (UID: \"d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b\") " 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.069933 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqwnp\" (UniqueName: \"kubernetes.io/projected/a7c0469d-7b51-403e-b190-f42fd2668900-kube-api-access-pqwnp\") pod \"telemetry-operator-controller-manager-d6b694c5-8k5hb\" (UID: \"a7c0469d-7b51-403e-b190-f42fd2668900\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.078597 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.101320 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.114044 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.115859 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzq4c\" (UniqueName: \"kubernetes.io/projected/f4b444d6-b24e-4055-b020-c367f5c5a144-kube-api-access-tzq4c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ssscj\" (UID: \"f4b444d6-b24e-4055-b020-c367f5c5a144\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.115910 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.115935 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfjx\" (UniqueName: \"kubernetes.io/projected/1be5b0dc-cddc-44fb-bb47-d92e3295312c-kube-api-access-xwfjx\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.115967 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4hr9\" (UniqueName: \"kubernetes.io/projected/10892b2b-3d7a-4994-8779-8a22ac48ca70-kube-api-access-x4hr9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n2tv6\" (UID: \"10892b2b-3d7a-4994-8779-8a22ac48ca70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.116003 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.116039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbf8t\" (UniqueName: \"kubernetes.io/projected/d05cc1d7-7da4-4950-93c0-84fd9ea90403-kube-api-access-kbf8t\") pod \"test-operator-controller-manager-5c5cb9c4d7-pjg97\" (UID: \"d05cc1d7-7da4-4950-93c0-84fd9ea90403\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.116691 4971 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.116839 4971 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.116901 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:34.616762094 +0000 UTC m=+1196.596636232 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "metrics-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.116919 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:34.616911248 +0000 UTC m=+1196.596785376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "webhook-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.128959 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.133326 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfjx\" (UniqueName: \"kubernetes.io/projected/1be5b0dc-cddc-44fb-bb47-d92e3295312c-kube-api-access-xwfjx\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.133458 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbf8t\" (UniqueName: \"kubernetes.io/projected/d05cc1d7-7da4-4950-93c0-84fd9ea90403-kube-api-access-kbf8t\") pod \"test-operator-controller-manager-5c5cb9c4d7-pjg97\" (UID: \"d05cc1d7-7da4-4950-93c0-84fd9ea90403\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.138347 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4hr9\" (UniqueName: \"kubernetes.io/projected/10892b2b-3d7a-4994-8779-8a22ac48ca70-kube-api-access-x4hr9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n2tv6\" (UID: \"10892b2b-3d7a-4994-8779-8a22ac48ca70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.141693 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzq4c\" (UniqueName: \"kubernetes.io/projected/f4b444d6-b24e-4055-b020-c367f5c5a144-kube-api-access-tzq4c\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ssscj\" (UID: \"f4b444d6-b24e-4055-b020-c367f5c5a144\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.187841 4971 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.229611 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.312385 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.335159 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r"] Mar 20 07:09:34 crc kubenswrapper[4971]: W0320 07:09:34.402018 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf478a4fa_f1af_4829_84e7_d42e3518b311.slice/crio-09581e9d3a2eb5cb1c46df50dcb0fdbf39f3fc02f4c2e46098df4a3459419422 WatchSource:0}: Error finding container 09581e9d3a2eb5cb1c46df50dcb0fdbf39f3fc02f4c2e46098df4a3459419422: Status 404 returned error can't find the container with id 09581e9d3a2eb5cb1c46df50dcb0fdbf39f3fc02f4c2e46098df4a3459419422 Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.413544 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.432367 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.432555 4971 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.432618 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert podName:13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a nodeName:}" failed. No retries permitted until 2026-03-20 07:09:35.432587946 +0000 UTC m=+1197.412462084 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899pz5xg" (UID: "13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.495026 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.516555 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.528432 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.634889 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.634977 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.635714 4971 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:09:34 
crc kubenswrapper[4971]: E0320 07:09:34.635773 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:35.635755753 +0000 UTC m=+1197.615629891 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "webhook-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.635842 4971 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.635914 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:35.635895656 +0000 UTC m=+1197.615769784 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "metrics-server-cert" not found Mar 20 07:09:34 crc kubenswrapper[4971]: W0320 07:09:34.750322 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b236ba_b8a6_4025_a093_cd3e7ccce1a2.slice/crio-5bbf1f4174bdeccf4319c49e134a3ca5ee2054caf347a46e06ffe6c0e696bb35 WatchSource:0}: Error finding container 5bbf1f4174bdeccf4319c49e134a3ca5ee2054caf347a46e06ffe6c0e696bb35: Status 404 returned error can't find the container with id 5bbf1f4174bdeccf4319c49e134a3ca5ee2054caf347a46e06ffe6c0e696bb35 Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.760971 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.761005 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk"] Mar 20 07:09:34 crc kubenswrapper[4971]: W0320 07:09:34.761004 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf899ae6a_129e_4909_83b2_64f4a270d3aa.slice/crio-095984cdd1ab42255d7cb007bedf43a300211983fb20107fbbba310ba7d8f576 WatchSource:0}: Error finding container 095984cdd1ab42255d7cb007bedf43a300211983fb20107fbbba310ba7d8f576: Status 404 returned error can't find the container with id 095984cdd1ab42255d7cb007bedf43a300211983fb20107fbbba310ba7d8f576 Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.774484 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr"] Mar 20 07:09:34 crc 
kubenswrapper[4971]: I0320 07:09:34.779056 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-hm96v"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.790827 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-25vg8"] Mar 20 07:09:34 crc kubenswrapper[4971]: W0320 07:09:34.791781 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e37501_4241_4d3b_bd19_9764455a723c.slice/crio-dac4d861a27a66578f12d3f2353bc59ecce5a3708e8baf402c282d62227e220b WatchSource:0}: Error finding container dac4d861a27a66578f12d3f2353bc59ecce5a3708e8baf402c282d62227e220b: Status 404 returned error can't find the container with id dac4d861a27a66578f12d3f2353bc59ecce5a3708e8baf402c282d62227e220b Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.797509 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf"] Mar 20 07:09:34 crc kubenswrapper[4971]: W0320 07:09:34.797580 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804e22b3_ff42_4280_b6b2_fb431b9311b0.slice/crio-9dc44b4cdd25fbe1228adefa7ba0d9ab75f7ae97ef296ca664eabdb9c6e5275f WatchSource:0}: Error finding container 9dc44b4cdd25fbe1228adefa7ba0d9ab75f7ae97ef296ca664eabdb9c6e5275f: Status 404 returned error can't find the container with id 9dc44b4cdd25fbe1228adefa7ba0d9ab75f7ae97ef296ca664eabdb9c6e5275f Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.820724 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5" event={"ID":"8d8e9e17-63d7-4de7-83d4-114caa076970","Type":"ContainerStarted","Data":"2507a788a8b62534b054c765328664ba04ee348b07ab85062358667eb2cbfd86"} 
Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.821922 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" event={"ID":"f478a4fa-f1af-4829-84e7-d42e3518b311","Type":"ContainerStarted","Data":"09581e9d3a2eb5cb1c46df50dcb0fdbf39f3fc02f4c2e46098df4a3459419422"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.824064 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" event={"ID":"1725120f-c2b6-4438-8dc0-0758d75b0ead","Type":"ContainerStarted","Data":"4a1890e334695acda25a3eb46f710bc683552da094e9880395d5ca806508fa27"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.825313 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf" event={"ID":"804e22b3-ff42-4280-b6b2-fb431b9311b0","Type":"ContainerStarted","Data":"9dc44b4cdd25fbe1228adefa7ba0d9ab75f7ae97ef296ca664eabdb9c6e5275f"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.826713 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r" event={"ID":"c2cf6754-c345-4bd4-81de-d652e506f0cf","Type":"ContainerStarted","Data":"98f2ec3b56d2ba4bb513d116bac011e5fce13afe1741d84e8efa15e310546fac"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.828055 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v" event={"ID":"14a1436b-84d2-4739-b16f-1f0cd580a0dd","Type":"ContainerStarted","Data":"3cde79d051fc0fccc52ffa76bf6c159a9601a0b79688d7bb6dd78b0bd26bf786"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.829141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8" 
event={"ID":"b3e37501-4241-4d3b-bd19-9764455a723c","Type":"ContainerStarted","Data":"dac4d861a27a66578f12d3f2353bc59ecce5a3708e8baf402c282d62227e220b"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.830013 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr" event={"ID":"f899ae6a-129e-4909-83b2-64f4a270d3aa","Type":"ContainerStarted","Data":"095984cdd1ab42255d7cb007bedf43a300211983fb20107fbbba310ba7d8f576"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.831252 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47" event={"ID":"524c42fa-40bb-435d-a5ef-b1cd805b5fdb","Type":"ContainerStarted","Data":"c4af1ddbbe707e4576a0ca4be0c1f08a03f35648b2e50f91d067cb7a3b623c16"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.832226 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" event={"ID":"f03668cf-d39b-496d-84cf-2f1c80162661","Type":"ContainerStarted","Data":"7e344628224b32cb913635724d2b50e697305f2a8b4dfa495412eee84cf5f337"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.833179 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk" event={"ID":"23b236ba-b8a6-4025-a093-cd3e7ccce1a2","Type":"ContainerStarted","Data":"5bbf1f4174bdeccf4319c49e134a3ca5ee2054caf347a46e06ffe6c0e696bb35"} Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.907581 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f"] Mar 20 07:09:34 crc kubenswrapper[4971]: W0320 07:09:34.909642 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod161e42ba_c8e5_40a0_be42_8fba7ea0428c.slice/crio-b58522955de976b230fa204adb78301e83c0116b96461fd4295e2af0f8031c94 WatchSource:0}: Error finding container b58522955de976b230fa204adb78301e83c0116b96461fd4295e2af0f8031c94: Status 404 returned error can't find the container with id b58522955de976b230fa204adb78301e83c0116b96461fd4295e2af0f8031c94 Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.967436 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.975709 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-nnng5"] Mar 20 07:09:34 crc kubenswrapper[4971]: I0320 07:09:34.981368 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc"] Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.990375 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pqwnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-8k5hb_openstack-operators(a7c0469d-7b51-403e-b190-f42fd2668900): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:09:34 crc kubenswrapper[4971]: E0320 07:09:34.991625 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" podUID="a7c0469d-7b51-403e-b190-f42fd2668900" Mar 20 07:09:35 crc 
kubenswrapper[4971]: W0320 07:09:35.007281 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0bcdd3a_90b0_42ce_aaa3_c42379e95e2b.slice/crio-a4734a3671ef2837b5e45db6d01f00a1538908d33b32d5c85b00bbda7d4a501e WatchSource:0}: Error finding container a4734a3671ef2837b5e45db6d01f00a1538908d33b32d5c85b00bbda7d4a501e: Status 404 returned error can't find the container with id a4734a3671ef2837b5e45db6d01f00a1538908d33b32d5c85b00bbda7d4a501e Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.009216 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6"] Mar 20 07:09:35 crc kubenswrapper[4971]: W0320 07:09:35.015204 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fc90bc6_1bb1_4357_a357_45bd8795ad02.slice/crio-3ff7b3fb944e5ec047e3d371a5e3be43f52e99ad5cfdc788c5021f9e08f574df WatchSource:0}: Error finding container 3ff7b3fb944e5ec047e3d371a5e3be43f52e99ad5cfdc788c5021f9e08f574df: Status 404 returned error can't find the container with id 3ff7b3fb944e5ec047e3d371a5e3be43f52e99ad5cfdc788c5021f9e08f574df Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.015273 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97"] Mar 20 07:09:35 crc kubenswrapper[4971]: W0320 07:09:35.019877 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10892b2b_3d7a_4994_8779_8a22ac48ca70.slice/crio-85dc0433e1c8d7c08075be4550e522285af8d5579f0c4293e6cfdd6b25377e3c WatchSource:0}: Error finding container 85dc0433e1c8d7c08075be4550e522285af8d5579f0c4293e6cfdd6b25377e3c: Status 404 returned error can't find the container with id 85dc0433e1c8d7c08075be4550e522285af8d5579f0c4293e6cfdd6b25377e3c Mar 20 
07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.020015 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2"] Mar 20 07:09:35 crc kubenswrapper[4971]: W0320 07:09:35.022302 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fab6976_2703_4123_9188_3cb727ffe701.slice/crio-bdbddb0a5bb4edb4f564ba9d7c06d04672d9b450f0f7efc1fe165c57961a8897 WatchSource:0}: Error finding container bdbddb0a5bb4edb4f564ba9d7c06d04672d9b450f0f7efc1fe165c57961a8897: Status 404 returned error can't find the container with id bdbddb0a5bb4edb4f564ba9d7c06d04672d9b450f0f7efc1fe165c57961a8897 Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.022682 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x4hr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-n2tv6_openstack-operators(10892b2b-3d7a-4994-8779-8a22ac48ca70): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.023841 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" podUID="10892b2b-3d7a-4994-8779-8a22ac48ca70" Mar 20 07:09:35 crc kubenswrapper[4971]: W0320 07:09:35.023938 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b444d6_b24e_4055_b020_c367f5c5a144.slice/crio-923756149c1c92af996fa605b196cba97c61c96437b953f9ccb79ec17f170e79 WatchSource:0}: Error finding container 923756149c1c92af996fa605b196cba97c61c96437b953f9ccb79ec17f170e79: Status 404 returned error can't find the container with id 923756149c1c92af996fa605b196cba97c61c96437b953f9ccb79ec17f170e79 Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.024709 
4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2krx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-j26ds_openstack-operators(0fab6976-2703-4123-9188-3cb727ffe701): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.024797 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds"] Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.025685 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fwc8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-lfdpc_openstack-operators(1fc90bc6-1bb1-4357-a357-45bd8795ad02): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.026047 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds" podUID="0fab6976-2703-4123-9188-3cb727ffe701" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.026809 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" podUID="1fc90bc6-1bb1-4357-a357-45bd8795ad02" Mar 20 07:09:35 crc kubenswrapper[4971]: W0320 07:09:35.027751 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd05cc1d7_7da4_4950_93c0_84fd9ea90403.slice/crio-dd87518a782cdc6732cd4b7e97f63dbc26d16418b010c610902cc8a4ecb1cb1f WatchSource:0}: Error finding container dd87518a782cdc6732cd4b7e97f63dbc26d16418b010c610902cc8a4ecb1cb1f: Status 404 returned error can't find the container with id dd87518a782cdc6732cd4b7e97f63dbc26d16418b010c610902cc8a4ecb1cb1f Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.028713 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj"] Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.028848 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tzq4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-ssscj_openstack-operators(f4b444d6-b24e-4055-b020-c367f5c5a144): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.032687 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" podUID="f4b444d6-b24e-4055-b020-c367f5c5a144" Mar 20 07:09:35 crc kubenswrapper[4971]: W0320 07:09:35.033473 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bc9bf8_93d8_4b02_b563_ed18ef89944d.slice/crio-54ec9294b2849341a265c82f9e5882522e21ed88d762583677a6187dc964d661 WatchSource:0}: Error finding container 54ec9294b2849341a265c82f9e5882522e21ed88d762583677a6187dc964d661: Status 404 returned error can't find the container with id 54ec9294b2849341a265c82f9e5882522e21ed88d762583677a6187dc964d661 Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.035206 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6mtrt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-d9nz2_openstack-operators(b9bc9bf8-93d8-4b02-b563-ed18ef89944d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.035264 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbf8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-pjg97_openstack-operators(d05cc1d7-7da4-4950-93c0-84fd9ea90403): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.036406 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" podUID="b9bc9bf8-93d8-4b02-b563-ed18ef89944d" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.036434 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" podUID="d05cc1d7-7da4-4950-93c0-84fd9ea90403" Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.045945 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.046142 4971 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.046214 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert podName:42b39042-7dfe-4946-bcfa-7bd424edb636 nodeName:}" failed. No retries permitted until 2026-03-20 07:09:37.046200555 +0000 UTC m=+1199.026074693 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert") pod "infra-operator-controller-manager-7b9c774f96-srgtw" (UID: "42b39042-7dfe-4946-bcfa-7bd424edb636") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.452985 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.453224 4971 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.453468 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert podName:13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a nodeName:}" failed. No retries permitted until 2026-03-20 07:09:37.453445883 +0000 UTC m=+1199.433320021 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899pz5xg" (UID: "13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.656701 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.656829 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.656976 4971 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.657039 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:37.65701892 +0000 UTC m=+1199.636893058 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "webhook-server-cert" not found Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.657534 4971 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.657582 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:37.657570695 +0000 UTC m=+1199.637444833 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "metrics-server-cert" not found Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.841101 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" event={"ID":"1fc90bc6-1bb1-4357-a357-45bd8795ad02","Type":"ContainerStarted","Data":"3ff7b3fb944e5ec047e3d371a5e3be43f52e99ad5cfdc788c5021f9e08f574df"} Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.842437 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" event={"ID":"d05cc1d7-7da4-4950-93c0-84fd9ea90403","Type":"ContainerStarted","Data":"dd87518a782cdc6732cd4b7e97f63dbc26d16418b010c610902cc8a4ecb1cb1f"} Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.844165 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" podUID="d05cc1d7-7da4-4950-93c0-84fd9ea90403" Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.845105 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" event={"ID":"a7c0469d-7b51-403e-b190-f42fd2668900","Type":"ContainerStarted","Data":"2c307788d523f0aa936eb906b430c03ff2f2d2c9c133a06b75ba9921a7143776"} Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.846137 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" podUID="a7c0469d-7b51-403e-b190-f42fd2668900" Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.846800 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" event={"ID":"b9bc9bf8-93d8-4b02-b563-ed18ef89944d","Type":"ContainerStarted","Data":"54ec9294b2849341a265c82f9e5882522e21ed88d762583677a6187dc964d661"} Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.847993 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" podUID="b9bc9bf8-93d8-4b02-b563-ed18ef89944d" Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 
07:09:35.849221 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f" event={"ID":"161e42ba-c8e5-40a0-be42-8fba7ea0428c","Type":"ContainerStarted","Data":"b58522955de976b230fa204adb78301e83c0116b96461fd4295e2af0f8031c94"} Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.850408 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" event={"ID":"10892b2b-3d7a-4994-8779-8a22ac48ca70","Type":"ContainerStarted","Data":"85dc0433e1c8d7c08075be4550e522285af8d5579f0c4293e6cfdd6b25377e3c"} Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.852204 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" podUID="10892b2b-3d7a-4994-8779-8a22ac48ca70" Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.852761 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5" event={"ID":"d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b","Type":"ContainerStarted","Data":"a4734a3671ef2837b5e45db6d01f00a1538908d33b32d5c85b00bbda7d4a501e"} Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.859464 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" event={"ID":"f4b444d6-b24e-4055-b020-c367f5c5a144","Type":"ContainerStarted","Data":"923756149c1c92af996fa605b196cba97c61c96437b953f9ccb79ec17f170e79"} Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.865179 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" podUID="f4b444d6-b24e-4055-b020-c367f5c5a144" Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.866414 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" podUID="1fc90bc6-1bb1-4357-a357-45bd8795ad02" Mar 20 07:09:35 crc kubenswrapper[4971]: I0320 07:09:35.874395 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds" event={"ID":"0fab6976-2703-4123-9188-3cb727ffe701","Type":"ContainerStarted","Data":"bdbddb0a5bb4edb4f564ba9d7c06d04672d9b450f0f7efc1fe165c57961a8897"} Mar 20 07:09:35 crc kubenswrapper[4971]: E0320 07:09:35.877964 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds" podUID="0fab6976-2703-4123-9188-3cb727ffe701" Mar 20 07:09:36 crc kubenswrapper[4971]: E0320 07:09:36.892311 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds" podUID="0fab6976-2703-4123-9188-3cb727ffe701" Mar 20 07:09:36 crc kubenswrapper[4971]: E0320 07:09:36.893193 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" podUID="d05cc1d7-7da4-4950-93c0-84fd9ea90403" Mar 20 07:09:36 crc kubenswrapper[4971]: E0320 07:09:36.893234 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" podUID="a7c0469d-7b51-403e-b190-f42fd2668900" Mar 20 07:09:36 crc kubenswrapper[4971]: E0320 07:09:36.893267 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" podUID="1fc90bc6-1bb1-4357-a357-45bd8795ad02" Mar 20 07:09:36 crc kubenswrapper[4971]: E0320 07:09:36.893300 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" 
podUID="10892b2b-3d7a-4994-8779-8a22ac48ca70" Mar 20 07:09:36 crc kubenswrapper[4971]: E0320 07:09:36.893522 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" podUID="b9bc9bf8-93d8-4b02-b563-ed18ef89944d" Mar 20 07:09:36 crc kubenswrapper[4971]: E0320 07:09:36.893561 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" podUID="f4b444d6-b24e-4055-b020-c367f5c5a144" Mar 20 07:09:37 crc kubenswrapper[4971]: I0320 07:09:37.077682 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:09:37 crc kubenswrapper[4971]: E0320 07:09:37.078758 4971 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:37 crc kubenswrapper[4971]: E0320 07:09:37.078806 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert podName:42b39042-7dfe-4946-bcfa-7bd424edb636 nodeName:}" failed. No retries permitted until 2026-03-20 07:09:41.078792308 +0000 UTC m=+1203.058666446 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert") pod "infra-operator-controller-manager-7b9c774f96-srgtw" (UID: "42b39042-7dfe-4946-bcfa-7bd424edb636") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:37 crc kubenswrapper[4971]: I0320 07:09:37.483656 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:09:37 crc kubenswrapper[4971]: E0320 07:09:37.483958 4971 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:37 crc kubenswrapper[4971]: E0320 07:09:37.484198 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert podName:13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a nodeName:}" failed. No retries permitted until 2026-03-20 07:09:41.484132885 +0000 UTC m=+1203.464007073 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899pz5xg" (UID: "13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:37 crc kubenswrapper[4971]: I0320 07:09:37.686218 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:37 crc kubenswrapper[4971]: I0320 07:09:37.686286 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:37 crc kubenswrapper[4971]: E0320 07:09:37.686405 4971 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:09:37 crc kubenswrapper[4971]: E0320 07:09:37.686465 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:41.68645114 +0000 UTC m=+1203.666325278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "metrics-server-cert" not found Mar 20 07:09:37 crc kubenswrapper[4971]: E0320 07:09:37.686561 4971 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:09:37 crc kubenswrapper[4971]: E0320 07:09:37.686749 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:41.686721457 +0000 UTC m=+1203.666595595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "webhook-server-cert" not found Mar 20 07:09:41 crc kubenswrapper[4971]: I0320 07:09:41.142305 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:09:41 crc kubenswrapper[4971]: E0320 07:09:41.142537 4971 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:41 crc kubenswrapper[4971]: E0320 07:09:41.142914 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert 
podName:42b39042-7dfe-4946-bcfa-7bd424edb636 nodeName:}" failed. No retries permitted until 2026-03-20 07:09:49.142881626 +0000 UTC m=+1211.122755774 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert") pod "infra-operator-controller-manager-7b9c774f96-srgtw" (UID: "42b39042-7dfe-4946-bcfa-7bd424edb636") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:41 crc kubenswrapper[4971]: I0320 07:09:41.548730 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:09:41 crc kubenswrapper[4971]: E0320 07:09:41.549076 4971 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:41 crc kubenswrapper[4971]: E0320 07:09:41.549143 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert podName:13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a nodeName:}" failed. No retries permitted until 2026-03-20 07:09:49.549125357 +0000 UTC m=+1211.528999505 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899pz5xg" (UID: "13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:41 crc kubenswrapper[4971]: I0320 07:09:41.752638 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:41 crc kubenswrapper[4971]: I0320 07:09:41.752792 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:41 crc kubenswrapper[4971]: E0320 07:09:41.752950 4971 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:09:41 crc kubenswrapper[4971]: E0320 07:09:41.753054 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:49.753027944 +0000 UTC m=+1211.732902092 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "metrics-server-cert" not found Mar 20 07:09:41 crc kubenswrapper[4971]: E0320 07:09:41.752961 4971 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:09:41 crc kubenswrapper[4971]: E0320 07:09:41.753327 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:09:49.75328599 +0000 UTC m=+1211.733160128 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "webhook-server-cert" not found Mar 20 07:09:46 crc kubenswrapper[4971]: E0320 07:09:46.085748 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 07:09:46 crc kubenswrapper[4971]: E0320 07:09:46.086379 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xddms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-6jc7r_openstack-operators(c2cf6754-c345-4bd4-81de-d652e506f0cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:09:46 crc kubenswrapper[4971]: E0320 07:09:46.087518 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r" podUID="c2cf6754-c345-4bd4-81de-d652e506f0cf" Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.961194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f" event={"ID":"161e42ba-c8e5-40a0-be42-8fba7ea0428c","Type":"ContainerStarted","Data":"3e5580c6a88f07f3864aa09e9cf82399a18c8a62d58430d7f083d7aa2182f251"} Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.961771 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f" Mar 20 07:09:46 crc 
kubenswrapper[4971]: I0320 07:09:46.962399 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr" event={"ID":"f899ae6a-129e-4909-83b2-64f4a270d3aa","Type":"ContainerStarted","Data":"e221650fd945864bd267cf8190be3b97159618d5f042f319c22d059925221466"} Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.962524 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr" Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.963743 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47" event={"ID":"524c42fa-40bb-435d-a5ef-b1cd805b5fdb","Type":"ContainerStarted","Data":"c1e1cb7d3d003deaf4f7ef1cbb07ea8bac4b3aebf00bb0422a307266e4239787"} Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.972807 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" event={"ID":"f03668cf-d39b-496d-84cf-2f1c80162661","Type":"ContainerStarted","Data":"e036e44037bd2e1f2cc10a977f2549bc0be7c180c39e0ccf3102c3f098ae0b62"} Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.973120 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.983566 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf" event={"ID":"804e22b3-ff42-4280-b6b2-fb431b9311b0","Type":"ContainerStarted","Data":"483c5f011e18546d91b45a31cc6177fb70b9de206c4d8889ec88ed84d37f52fa"} Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.983799 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf" Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.991719 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk" event={"ID":"23b236ba-b8a6-4025-a093-cd3e7ccce1a2","Type":"ContainerStarted","Data":"8e78b2d1076ade3ff774707156f11389443e8755fa28976ff43ffe06de68e727"} Mar 20 07:09:46 crc kubenswrapper[4971]: I0320 07:09:46.992508 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.010125 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f" podStartSLOduration=2.828217186 podStartE2EDuration="14.010102729s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.912060804 +0000 UTC m=+1196.891934942" lastFinishedPulling="2026-03-20 07:09:46.093946347 +0000 UTC m=+1208.073820485" observedRunningTime="2026-03-20 07:09:46.983379595 +0000 UTC m=+1208.963253733" watchObservedRunningTime="2026-03-20 07:09:47.010102729 +0000 UTC m=+1208.989976867" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.011098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5" event={"ID":"8d8e9e17-63d7-4de7-83d4-114caa076970","Type":"ContainerStarted","Data":"96bce7f77facbceedf7254f64ae3c0c3df8d802830405fa96c3cd1292df63153"} Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.011720 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.015337 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47" podStartSLOduration=2.63687821 podStartE2EDuration="14.015322276s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.747254677 +0000 UTC m=+1196.727128815" lastFinishedPulling="2026-03-20 07:09:46.125698713 +0000 UTC m=+1208.105572881" observedRunningTime="2026-03-20 07:09:47.010887169 +0000 UTC m=+1208.990761307" watchObservedRunningTime="2026-03-20 07:09:47.015322276 +0000 UTC m=+1208.995196414" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.019781 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" event={"ID":"f478a4fa-f1af-4829-84e7-d42e3518b311","Type":"ContainerStarted","Data":"b61ff9239cdc7ffd93087543503701dfd8c9430f219411fc895b760d2b991cbd"} Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.020403 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.026010 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v" event={"ID":"14a1436b-84d2-4739-b16f-1f0cd580a0dd","Type":"ContainerStarted","Data":"d0b0235a8c85a2cb93df6b37b512af6fecc077570b4dbd19a9cc39f0e1c14c0f"} Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.026676 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.032485 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr" podStartSLOduration=2.70453312 podStartE2EDuration="14.032471527s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" 
firstStartedPulling="2026-03-20 07:09:34.766025271 +0000 UTC m=+1196.745899409" lastFinishedPulling="2026-03-20 07:09:46.093963648 +0000 UTC m=+1208.073837816" observedRunningTime="2026-03-20 07:09:47.029647843 +0000 UTC m=+1209.009521981" watchObservedRunningTime="2026-03-20 07:09:47.032471527 +0000 UTC m=+1209.012345665" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.035148 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8" event={"ID":"b3e37501-4241-4d3b-bd19-9764455a723c","Type":"ContainerStarted","Data":"024a1defd939d888063cba23861906b9c692fbb9592d93245e3697c6973256ea"} Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.035831 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.036727 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" event={"ID":"1725120f-c2b6-4438-8dc0-0758d75b0ead","Type":"ContainerStarted","Data":"e41eb45be502d20fff4fec294971009297d390aa61263a12d44cab7314fb4aff"} Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.036862 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.045670 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5" event={"ID":"d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b","Type":"ContainerStarted","Data":"3b59118d93d7c9338cbee39abe076c3c43d326ec87cab128a1d46ff8a1428603"} Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.045707 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5" Mar 
20 07:09:47 crc kubenswrapper[4971]: E0320 07:09:47.049879 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r" podUID="c2cf6754-c345-4bd4-81de-d652e506f0cf" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.051944 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" podStartSLOduration=2.512561839 podStartE2EDuration="14.05192756s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.544288295 +0000 UTC m=+1196.524162433" lastFinishedPulling="2026-03-20 07:09:46.083654006 +0000 UTC m=+1208.063528154" observedRunningTime="2026-03-20 07:09:47.050274726 +0000 UTC m=+1209.030148864" watchObservedRunningTime="2026-03-20 07:09:47.05192756 +0000 UTC m=+1209.031801698" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.070057 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5" podStartSLOduration=2.955368022 podStartE2EDuration="14.070041466s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:35.01411015 +0000 UTC m=+1196.993984288" lastFinishedPulling="2026-03-20 07:09:46.128783564 +0000 UTC m=+1208.108657732" observedRunningTime="2026-03-20 07:09:47.066430191 +0000 UTC m=+1209.046304329" watchObservedRunningTime="2026-03-20 07:09:47.070041466 +0000 UTC m=+1209.049915594" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.089986 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" 
podStartSLOduration=2.970677465 podStartE2EDuration="14.089967691s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.412786754 +0000 UTC m=+1196.392660892" lastFinishedPulling="2026-03-20 07:09:45.53207697 +0000 UTC m=+1207.511951118" observedRunningTime="2026-03-20 07:09:47.086464368 +0000 UTC m=+1209.066338506" watchObservedRunningTime="2026-03-20 07:09:47.089967691 +0000 UTC m=+1209.069841829" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.132390 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" podStartSLOduration=3.100715618 podStartE2EDuration="14.132374667s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.492289077 +0000 UTC m=+1196.472163215" lastFinishedPulling="2026-03-20 07:09:45.523948136 +0000 UTC m=+1207.503822264" observedRunningTime="2026-03-20 07:09:47.128067803 +0000 UTC m=+1209.107941941" watchObservedRunningTime="2026-03-20 07:09:47.132374667 +0000 UTC m=+1209.112248805" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.170926 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk" podStartSLOduration=2.808713182 podStartE2EDuration="14.170910351s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.759773156 +0000 UTC m=+1196.739647294" lastFinishedPulling="2026-03-20 07:09:46.121970325 +0000 UTC m=+1208.101844463" observedRunningTime="2026-03-20 07:09:47.166772912 +0000 UTC m=+1209.146647050" watchObservedRunningTime="2026-03-20 07:09:47.170910351 +0000 UTC m=+1209.150784489" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.185490 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5" 
podStartSLOduration=2.634188299 podStartE2EDuration="14.185474364s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.56765923 +0000 UTC m=+1196.547533368" lastFinishedPulling="2026-03-20 07:09:46.118945295 +0000 UTC m=+1208.098819433" observedRunningTime="2026-03-20 07:09:47.181168741 +0000 UTC m=+1209.161042879" watchObservedRunningTime="2026-03-20 07:09:47.185474364 +0000 UTC m=+1209.165348502" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.195787 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v" podStartSLOduration=2.89676037 podStartE2EDuration="14.195772725s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.784861277 +0000 UTC m=+1196.764735415" lastFinishedPulling="2026-03-20 07:09:46.083873592 +0000 UTC m=+1208.063747770" observedRunningTime="2026-03-20 07:09:47.19369021 +0000 UTC m=+1209.173564348" watchObservedRunningTime="2026-03-20 07:09:47.195772725 +0000 UTC m=+1209.175646863" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.221780 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8" podStartSLOduration=2.887582388 podStartE2EDuration="14.221761899s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.793141565 +0000 UTC m=+1196.773015703" lastFinishedPulling="2026-03-20 07:09:46.127321036 +0000 UTC m=+1208.107195214" observedRunningTime="2026-03-20 07:09:47.220764333 +0000 UTC m=+1209.200638471" watchObservedRunningTime="2026-03-20 07:09:47.221761899 +0000 UTC m=+1209.201636037" Mar 20 07:09:47 crc kubenswrapper[4971]: I0320 07:09:47.239786 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf" 
podStartSLOduration=2.97504107 podStartE2EDuration="14.239771073s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.799245185 +0000 UTC m=+1196.779119323" lastFinishedPulling="2026-03-20 07:09:46.063975148 +0000 UTC m=+1208.043849326" observedRunningTime="2026-03-20 07:09:47.239093775 +0000 UTC m=+1209.218967913" watchObservedRunningTime="2026-03-20 07:09:47.239771073 +0000 UTC m=+1209.219645211" Mar 20 07:09:48 crc kubenswrapper[4971]: I0320 07:09:48.052148 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47" Mar 20 07:09:49 crc kubenswrapper[4971]: I0320 07:09:49.171672 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:09:49 crc kubenswrapper[4971]: E0320 07:09:49.171844 4971 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:49 crc kubenswrapper[4971]: E0320 07:09:49.171918 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert podName:42b39042-7dfe-4946-bcfa-7bd424edb636 nodeName:}" failed. No retries permitted until 2026-03-20 07:10:05.171899222 +0000 UTC m=+1227.151773360 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert") pod "infra-operator-controller-manager-7b9c774f96-srgtw" (UID: "42b39042-7dfe-4946-bcfa-7bd424edb636") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:09:49 crc kubenswrapper[4971]: I0320 07:09:49.580473 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:09:49 crc kubenswrapper[4971]: E0320 07:09:49.580749 4971 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:49 crc kubenswrapper[4971]: E0320 07:09:49.580917 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert podName:13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a nodeName:}" failed. No retries permitted until 2026-03-20 07:10:05.580881035 +0000 UTC m=+1227.560755203 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899pz5xg" (UID: "13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:09:49 crc kubenswrapper[4971]: I0320 07:09:49.783196 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:49 crc kubenswrapper[4971]: I0320 07:09:49.783319 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:09:49 crc kubenswrapper[4971]: E0320 07:09:49.783766 4971 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:09:49 crc kubenswrapper[4971]: E0320 07:09:49.783852 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:10:05.783830517 +0000 UTC m=+1227.763704665 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "webhook-server-cert" not found Mar 20 07:09:49 crc kubenswrapper[4971]: E0320 07:09:49.784073 4971 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:09:49 crc kubenswrapper[4971]: E0320 07:09:49.784135 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs podName:1be5b0dc-cddc-44fb-bb47-d92e3295312c nodeName:}" failed. No retries permitted until 2026-03-20 07:10:05.784115394 +0000 UTC m=+1227.763989622 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-ktmn5" (UID: "1be5b0dc-cddc-44fb-bb47-d92e3295312c") : secret "metrics-server-cert" not found Mar 20 07:09:50 crc kubenswrapper[4971]: I0320 07:09:50.163233 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:09:50 crc kubenswrapper[4971]: I0320 07:09:50.163323 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:09:50 crc kubenswrapper[4971]: I0320 07:09:50.163387 4971 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:09:50 crc kubenswrapper[4971]: I0320 07:09:50.164248 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13acc3b11ca206dfa51085fca8aab838b094201afdcac5315a5bb8076cf03b1f"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:09:50 crc kubenswrapper[4971]: I0320 07:09:50.164348 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://13acc3b11ca206dfa51085fca8aab838b094201afdcac5315a5bb8076cf03b1f" gracePeriod=600 Mar 20 07:09:51 crc kubenswrapper[4971]: I0320 07:09:51.075586 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="13acc3b11ca206dfa51085fca8aab838b094201afdcac5315a5bb8076cf03b1f" exitCode=0 Mar 20 07:09:51 crc kubenswrapper[4971]: I0320 07:09:51.075637 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"13acc3b11ca206dfa51085fca8aab838b094201afdcac5315a5bb8076cf03b1f"} Mar 20 07:09:51 crc kubenswrapper[4971]: I0320 07:09:51.075702 4971 scope.go:117] "RemoveContainer" containerID="b3da7217c29a31428564d7212c3a93202314dd80c28b100287c20a984795e9b1" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.514293 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tzs7r" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.580031 4971 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.588445 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.678507 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zx4c5" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.724323 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6xt47" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.775191 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hm96v" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.846818 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-wnjvf" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.890434 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-lsfqr" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.892541 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6z7mk" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.954937 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-25vg8" Mar 20 07:09:53 crc kubenswrapper[4971]: I0320 07:09:53.972042 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-fkl5f" Mar 20 07:09:54 crc kubenswrapper[4971]: I0320 07:09:54.117675 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-nnng5" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.135172 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566510-xzmwb"] Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.136423 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566510-xzmwb" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.138796 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.138922 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.139013 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.152037 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566510-xzmwb"] Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.248786 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwc7d\" (UniqueName: \"kubernetes.io/projected/f989670d-5d1a-4fee-a7eb-d6142c0776d7-kube-api-access-cwc7d\") pod \"auto-csr-approver-29566510-xzmwb\" (UID: \"f989670d-5d1a-4fee-a7eb-d6142c0776d7\") " pod="openshift-infra/auto-csr-approver-29566510-xzmwb" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.349531 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwc7d\" (UniqueName: 
\"kubernetes.io/projected/f989670d-5d1a-4fee-a7eb-d6142c0776d7-kube-api-access-cwc7d\") pod \"auto-csr-approver-29566510-xzmwb\" (UID: \"f989670d-5d1a-4fee-a7eb-d6142c0776d7\") " pod="openshift-infra/auto-csr-approver-29566510-xzmwb" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.373924 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwc7d\" (UniqueName: \"kubernetes.io/projected/f989670d-5d1a-4fee-a7eb-d6142c0776d7-kube-api-access-cwc7d\") pod \"auto-csr-approver-29566510-xzmwb\" (UID: \"f989670d-5d1a-4fee-a7eb-d6142c0776d7\") " pod="openshift-infra/auto-csr-approver-29566510-xzmwb" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.473797 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566510-xzmwb" Mar 20 07:10:00 crc kubenswrapper[4971]: I0320 07:10:00.938266 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566510-xzmwb"] Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.158862 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566510-xzmwb" event={"ID":"f989670d-5d1a-4fee-a7eb-d6142c0776d7","Type":"ContainerStarted","Data":"a9d1d25cf9553a59e2c8380e2f30096b284c52eaacc5192e8d44e79c73d41d13"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.160031 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" event={"ID":"1fc90bc6-1bb1-4357-a357-45bd8795ad02","Type":"ContainerStarted","Data":"dd055b9ca22207c31bb7bd23a419cfce33cae0305c88052917361883f3d503f7"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.160190 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.161275 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds" event={"ID":"0fab6976-2703-4123-9188-3cb727ffe701","Type":"ContainerStarted","Data":"52b88331701827bf19f7a301c92beb55890899cc5bba26b1b7f006c45cddb486"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.161455 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.163277 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"ae2e826d251bbaee40981cb044e2215f19a7929bd69583c144888da9a8754c15"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.164511 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" event={"ID":"10892b2b-3d7a-4994-8779-8a22ac48ca70","Type":"ContainerStarted","Data":"51ad5df29c9cc87f7c863f27e2b6308853e635434f9ed5d128638355f81c8783"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.165626 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" event={"ID":"f4b444d6-b24e-4055-b020-c367f5c5a144","Type":"ContainerStarted","Data":"2b79fccf969af0180a7c1022649356d44e576b50f4e7717e1fd8ff7fccab3d2a"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.165806 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.167065 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" 
event={"ID":"d05cc1d7-7da4-4950-93c0-84fd9ea90403","Type":"ContainerStarted","Data":"94de5d6eec13020850c70ed264fc4a70c7397930d3f9adafc6f47b0eb83b00ea"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.167230 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.168262 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" event={"ID":"a7c0469d-7b51-403e-b190-f42fd2668900","Type":"ContainerStarted","Data":"444dd776dcab14c8a1a34ace467cc5599af6d9912b237b1204e24a7c9e07ba64"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.168409 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.169376 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" event={"ID":"b9bc9bf8-93d8-4b02-b563-ed18ef89944d","Type":"ContainerStarted","Data":"e4c8a6ce29d4f52e4edd9773247dbfb0c708f7e4fac750a5e8f596d747feadc4"} Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.169530 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.177806 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" podStartSLOduration=2.796877891 podStartE2EDuration="28.177788651s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:35.025578012 +0000 UTC m=+1197.005452150" lastFinishedPulling="2026-03-20 07:10:00.406488762 +0000 UTC m=+1222.386362910" observedRunningTime="2026-03-20 
07:10:01.176321442 +0000 UTC m=+1223.156195580" watchObservedRunningTime="2026-03-20 07:10:01.177788651 +0000 UTC m=+1223.157662789" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.194517 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds" podStartSLOduration=2.794359755 podStartE2EDuration="28.194499541s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:35.024569545 +0000 UTC m=+1197.004443673" lastFinishedPulling="2026-03-20 07:10:00.424709311 +0000 UTC m=+1222.404583459" observedRunningTime="2026-03-20 07:10:01.193880565 +0000 UTC m=+1223.173754713" watchObservedRunningTime="2026-03-20 07:10:01.194499541 +0000 UTC m=+1223.174373689" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.214102 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" podStartSLOduration=2.799737086 podStartE2EDuration="28.214082716s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:35.028686154 +0000 UTC m=+1197.008560302" lastFinishedPulling="2026-03-20 07:10:00.443031794 +0000 UTC m=+1222.422905932" observedRunningTime="2026-03-20 07:10:01.212532145 +0000 UTC m=+1223.192406293" watchObservedRunningTime="2026-03-20 07:10:01.214082716 +0000 UTC m=+1223.193956854" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.228029 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n2tv6" podStartSLOduration=2.825114644 podStartE2EDuration="28.228012373s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:35.022553772 +0000 UTC m=+1197.002427910" lastFinishedPulling="2026-03-20 07:10:00.425451501 +0000 UTC m=+1222.405325639" observedRunningTime="2026-03-20 07:10:01.223627398 
+0000 UTC m=+1223.203501536" watchObservedRunningTime="2026-03-20 07:10:01.228012373 +0000 UTC m=+1223.207886511" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.261489 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" podStartSLOduration=2.873360575 podStartE2EDuration="28.261474354s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:35.035145854 +0000 UTC m=+1197.015019992" lastFinishedPulling="2026-03-20 07:10:00.423259613 +0000 UTC m=+1222.403133771" observedRunningTime="2026-03-20 07:10:01.257221472 +0000 UTC m=+1223.237095610" watchObservedRunningTime="2026-03-20 07:10:01.261474354 +0000 UTC m=+1223.241348492" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.270943 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" podStartSLOduration=4.7359543330000005 podStartE2EDuration="28.270928572s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:35.035088092 +0000 UTC m=+1197.014962230" lastFinishedPulling="2026-03-20 07:09:58.570062321 +0000 UTC m=+1220.549936469" observedRunningTime="2026-03-20 07:10:01.267937734 +0000 UTC m=+1223.247811882" watchObservedRunningTime="2026-03-20 07:10:01.270928572 +0000 UTC m=+1223.250802710" Mar 20 07:10:01 crc kubenswrapper[4971]: I0320 07:10:01.284147 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" podStartSLOduration=2.949539429 podStartE2EDuration="28.2841298s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.990202241 +0000 UTC m=+1196.970076379" lastFinishedPulling="2026-03-20 07:10:00.324792612 +0000 UTC m=+1222.304666750" observedRunningTime="2026-03-20 07:10:01.279660962 +0000 UTC 
m=+1223.259535100" watchObservedRunningTime="2026-03-20 07:10:01.2841298 +0000 UTC m=+1223.264003938" Mar 20 07:10:03 crc kubenswrapper[4971]: I0320 07:10:03.187807 4971 generic.go:334] "Generic (PLEG): container finished" podID="f989670d-5d1a-4fee-a7eb-d6142c0776d7" containerID="bad86612b55b70b2e623e5e65fc1e1cf3aba7fdf58c279008475d32b3bc49a53" exitCode=0 Mar 20 07:10:03 crc kubenswrapper[4971]: I0320 07:10:03.187945 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566510-xzmwb" event={"ID":"f989670d-5d1a-4fee-a7eb-d6142c0776d7","Type":"ContainerDied","Data":"bad86612b55b70b2e623e5e65fc1e1cf3aba7fdf58c279008475d32b3bc49a53"} Mar 20 07:10:04 crc kubenswrapper[4971]: I0320 07:10:04.199513 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r" event={"ID":"c2cf6754-c345-4bd4-81de-d652e506f0cf","Type":"ContainerStarted","Data":"a2ec46923b056bef631bf878ff9ec43f90a4000b1ea5ed48559c652b70ece13d"} Mar 20 07:10:04 crc kubenswrapper[4971]: I0320 07:10:04.201303 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r" Mar 20 07:10:04 crc kubenswrapper[4971]: I0320 07:10:04.236993 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r" podStartSLOduration=2.573115551 podStartE2EDuration="31.23696163s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:09:34.522335348 +0000 UTC m=+1196.502209486" lastFinishedPulling="2026-03-20 07:10:03.186181437 +0000 UTC m=+1225.166055565" observedRunningTime="2026-03-20 07:10:04.231369953 +0000 UTC m=+1226.211244121" watchObservedRunningTime="2026-03-20 07:10:04.23696163 +0000 UTC m=+1226.216835818" Mar 20 07:10:04 crc kubenswrapper[4971]: I0320 07:10:04.540051 4971 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566510-xzmwb" Mar 20 07:10:04 crc kubenswrapper[4971]: I0320 07:10:04.625931 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwc7d\" (UniqueName: \"kubernetes.io/projected/f989670d-5d1a-4fee-a7eb-d6142c0776d7-kube-api-access-cwc7d\") pod \"f989670d-5d1a-4fee-a7eb-d6142c0776d7\" (UID: \"f989670d-5d1a-4fee-a7eb-d6142c0776d7\") " Mar 20 07:10:04 crc kubenswrapper[4971]: I0320 07:10:04.632053 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f989670d-5d1a-4fee-a7eb-d6142c0776d7-kube-api-access-cwc7d" (OuterVolumeSpecName: "kube-api-access-cwc7d") pod "f989670d-5d1a-4fee-a7eb-d6142c0776d7" (UID: "f989670d-5d1a-4fee-a7eb-d6142c0776d7"). InnerVolumeSpecName "kube-api-access-cwc7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:10:04 crc kubenswrapper[4971]: I0320 07:10:04.727708 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwc7d\" (UniqueName: \"kubernetes.io/projected/f989670d-5d1a-4fee-a7eb-d6142c0776d7-kube-api-access-cwc7d\") on node \"crc\" DevicePath \"\"" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.209132 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566510-xzmwb" event={"ID":"f989670d-5d1a-4fee-a7eb-d6142c0776d7","Type":"ContainerDied","Data":"a9d1d25cf9553a59e2c8380e2f30096b284c52eaacc5192e8d44e79c73d41d13"} Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.209483 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9d1d25cf9553a59e2c8380e2f30096b284c52eaacc5192e8d44e79c73d41d13" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.209164 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566510-xzmwb" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.234524 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.240974 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42b39042-7dfe-4946-bcfa-7bd424edb636-cert\") pod \"infra-operator-controller-manager-7b9c774f96-srgtw\" (UID: \"42b39042-7dfe-4946-bcfa-7bd424edb636\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.442780 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vn788" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.450905 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.624560 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566504-6mpbh"] Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.633033 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566504-6mpbh"] Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.641706 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.663646 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899pz5xg\" (UID: \"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.745513 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw"] Mar 20 07:10:05 crc kubenswrapper[4971]: W0320 07:10:05.749241 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b39042_7dfe_4946_bcfa_7bd424edb636.slice/crio-7cf4b6da69da58b70a01dc435625dea584d0c5fca837e3fdb00a3c33e9538b1a WatchSource:0}: Error finding container 7cf4b6da69da58b70a01dc435625dea584d0c5fca837e3fdb00a3c33e9538b1a: Status 404 returned error can't find the container 
with id 7cf4b6da69da58b70a01dc435625dea584d0c5fca837e3fdb00a3c33e9538b1a Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.844202 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.844740 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.852166 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.855250 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1be5b0dc-cddc-44fb-bb47-d92e3295312c-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-ktmn5\" (UID: \"1be5b0dc-cddc-44fb-bb47-d92e3295312c\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.856349 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jhc9d" Mar 20 07:10:05 crc kubenswrapper[4971]: I0320 07:10:05.865577 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:10:06 crc kubenswrapper[4971]: I0320 07:10:06.091549 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vx5bl" Mar 20 07:10:06 crc kubenswrapper[4971]: I0320 07:10:06.099833 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:10:06 crc kubenswrapper[4971]: I0320 07:10:06.223908 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" event={"ID":"42b39042-7dfe-4946-bcfa-7bd424edb636","Type":"ContainerStarted","Data":"7cf4b6da69da58b70a01dc435625dea584d0c5fca837e3fdb00a3c33e9538b1a"} Mar 20 07:10:06 crc kubenswrapper[4971]: I0320 07:10:06.352213 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg"] Mar 20 07:10:06 crc kubenswrapper[4971]: I0320 07:10:06.412229 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5"] Mar 20 07:10:06 crc kubenswrapper[4971]: W0320 07:10:06.418790 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be5b0dc_cddc_44fb_bb47_d92e3295312c.slice/crio-e5e5ce9aa1b6aabd931d87fcd413ee41d93dc81bab6c1e7d8faab0ba0c48c965 WatchSource:0}: Error finding container e5e5ce9aa1b6aabd931d87fcd413ee41d93dc81bab6c1e7d8faab0ba0c48c965: Status 404 returned error can't find the container with id 
e5e5ce9aa1b6aabd931d87fcd413ee41d93dc81bab6c1e7d8faab0ba0c48c965 Mar 20 07:10:06 crc kubenswrapper[4971]: I0320 07:10:06.743780 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132e6589-0f2b-4b4d-9059-eadbe86f6219" path="/var/lib/kubelet/pods/132e6589-0f2b-4b4d-9059-eadbe86f6219/volumes" Mar 20 07:10:07 crc kubenswrapper[4971]: I0320 07:10:07.234974 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" event={"ID":"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a","Type":"ContainerStarted","Data":"ba89cdeb2e67c678d64a6b035c474e2c9cfa15807cbd5996fcbf1cdefaf9b711"} Mar 20 07:10:07 crc kubenswrapper[4971]: I0320 07:10:07.236481 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" event={"ID":"1be5b0dc-cddc-44fb-bb47-d92e3295312c","Type":"ContainerStarted","Data":"db89a98d7a02e409f9660a3e838843eae576522db0e3d9b3bf8bc6775a014426"} Mar 20 07:10:07 crc kubenswrapper[4971]: I0320 07:10:07.236529 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" event={"ID":"1be5b0dc-cddc-44fb-bb47-d92e3295312c","Type":"ContainerStarted","Data":"e5e5ce9aa1b6aabd931d87fcd413ee41d93dc81bab6c1e7d8faab0ba0c48c965"} Mar 20 07:10:07 crc kubenswrapper[4971]: I0320 07:10:07.236745 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:10:07 crc kubenswrapper[4971]: I0320 07:10:07.288926 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" podStartSLOduration=34.28889649 podStartE2EDuration="34.28889649s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:10:07.279055691 +0000 UTC m=+1229.258929849" watchObservedRunningTime="2026-03-20 07:10:07.28889649 +0000 UTC m=+1229.268770628" Mar 20 07:10:08 crc kubenswrapper[4971]: I0320 07:10:08.247025 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" event={"ID":"42b39042-7dfe-4946-bcfa-7bd424edb636","Type":"ContainerStarted","Data":"3184318a5dee4b56df1e82d9d43d1624e4ebbdb0673c921b4a2c90c5b9d7bafb"} Mar 20 07:10:08 crc kubenswrapper[4971]: I0320 07:10:08.247374 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:10:08 crc kubenswrapper[4971]: I0320 07:10:08.265039 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" podStartSLOduration=33.522640104 podStartE2EDuration="35.26502207s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:10:05.75116384 +0000 UTC m=+1227.731037978" lastFinishedPulling="2026-03-20 07:10:07.493545786 +0000 UTC m=+1229.473419944" observedRunningTime="2026-03-20 07:10:08.264380963 +0000 UTC m=+1230.244255101" watchObservedRunningTime="2026-03-20 07:10:08.26502207 +0000 UTC m=+1230.244896208" Mar 20 07:10:09 crc kubenswrapper[4971]: I0320 07:10:09.260739 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" event={"ID":"13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a","Type":"ContainerStarted","Data":"6b6c6de2463c9ed32b1e8a7e02c3577ec04e8e6d746985a76f7d68b6278078e3"} Mar 20 07:10:09 crc kubenswrapper[4971]: I0320 07:10:09.261139 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:10:09 crc kubenswrapper[4971]: I0320 07:10:09.319285 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" podStartSLOduration=33.796248174 podStartE2EDuration="36.319251364s" podCreationTimestamp="2026-03-20 07:09:33 +0000 UTC" firstStartedPulling="2026-03-20 07:10:06.370921441 +0000 UTC m=+1228.350795599" lastFinishedPulling="2026-03-20 07:10:08.893924651 +0000 UTC m=+1230.873798789" observedRunningTime="2026-03-20 07:10:09.31418189 +0000 UTC m=+1231.294056058" watchObservedRunningTime="2026-03-20 07:10:09.319251364 +0000 UTC m=+1231.299125552" Mar 20 07:10:13 crc kubenswrapper[4971]: I0320 07:10:13.680831 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6jc7r" Mar 20 07:10:14 crc kubenswrapper[4971]: I0320 07:10:14.003241 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j26ds" Mar 20 07:10:14 crc kubenswrapper[4971]: I0320 07:10:14.082201 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-d9nz2" Mar 20 07:10:14 crc kubenswrapper[4971]: I0320 07:10:14.107854 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lfdpc" Mar 20 07:10:14 crc kubenswrapper[4971]: I0320 07:10:14.133179 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-8k5hb" Mar 20 07:10:14 crc kubenswrapper[4971]: I0320 07:10:14.191644 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pjg97" Mar 20 07:10:14 crc kubenswrapper[4971]: I0320 07:10:14.235269 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ssscj" Mar 20 07:10:15 crc kubenswrapper[4971]: I0320 07:10:15.458412 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-srgtw" Mar 20 07:10:15 crc kubenswrapper[4971]: I0320 07:10:15.874216 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899pz5xg" Mar 20 07:10:16 crc kubenswrapper[4971]: I0320 07:10:16.107626 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-ktmn5" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.517531 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-r7nv7"] Mar 20 07:10:30 crc kubenswrapper[4971]: E0320 07:10:30.518839 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f989670d-5d1a-4fee-a7eb-d6142c0776d7" containerName="oc" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.518858 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f989670d-5d1a-4fee-a7eb-d6142c0776d7" containerName="oc" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.519031 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f989670d-5d1a-4fee-a7eb-d6142c0776d7" containerName="oc" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.520021 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.522200 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5fh8z" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.528334 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-r7nv7"] Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.530272 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.530478 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.530557 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.561530 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9bwz\" (UniqueName: \"kubernetes.io/projected/663eb449-2315-426d-89ea-7fa437373dac-kube-api-access-k9bwz\") pod \"dnsmasq-dns-5448ff6dc7-r7nv7\" (UID: \"663eb449-2315-426d-89ea-7fa437373dac\") " pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.561640 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/663eb449-2315-426d-89ea-7fa437373dac-config\") pod \"dnsmasq-dns-5448ff6dc7-r7nv7\" (UID: \"663eb449-2315-426d-89ea-7fa437373dac\") " pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.600912 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-k2mdp"] Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.605719 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.608889 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.624926 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-k2mdp"] Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.662962 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-dns-svc\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.663019 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/663eb449-2315-426d-89ea-7fa437373dac-config\") pod \"dnsmasq-dns-5448ff6dc7-r7nv7\" (UID: \"663eb449-2315-426d-89ea-7fa437373dac\") " pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.663058 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxx5b\" (UniqueName: \"kubernetes.io/projected/6e3a6967-aab7-4dac-abce-8d3189935dc4-kube-api-access-zxx5b\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.663104 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-config\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc 
kubenswrapper[4971]: I0320 07:10:30.663131 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9bwz\" (UniqueName: \"kubernetes.io/projected/663eb449-2315-426d-89ea-7fa437373dac-kube-api-access-k9bwz\") pod \"dnsmasq-dns-5448ff6dc7-r7nv7\" (UID: \"663eb449-2315-426d-89ea-7fa437373dac\") " pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.664252 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/663eb449-2315-426d-89ea-7fa437373dac-config\") pod \"dnsmasq-dns-5448ff6dc7-r7nv7\" (UID: \"663eb449-2315-426d-89ea-7fa437373dac\") " pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.683307 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9bwz\" (UniqueName: \"kubernetes.io/projected/663eb449-2315-426d-89ea-7fa437373dac-kube-api-access-k9bwz\") pod \"dnsmasq-dns-5448ff6dc7-r7nv7\" (UID: \"663eb449-2315-426d-89ea-7fa437373dac\") " pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.764628 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-config\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.765038 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-dns-svc\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.765096 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zxx5b\" (UniqueName: \"kubernetes.io/projected/6e3a6967-aab7-4dac-abce-8d3189935dc4-kube-api-access-zxx5b\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.765856 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-config\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.766698 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-dns-svc\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.780997 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxx5b\" (UniqueName: \"kubernetes.io/projected/6e3a6967-aab7-4dac-abce-8d3189935dc4-kube-api-access-zxx5b\") pod \"dnsmasq-dns-64696987c5-k2mdp\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.841956 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:30 crc kubenswrapper[4971]: I0320 07:10:30.927886 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:31 crc kubenswrapper[4971]: I0320 07:10:31.203822 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-k2mdp"] Mar 20 07:10:31 crc kubenswrapper[4971]: I0320 07:10:31.218109 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:10:31 crc kubenswrapper[4971]: I0320 07:10:31.324858 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-r7nv7"] Mar 20 07:10:31 crc kubenswrapper[4971]: W0320 07:10:31.331900 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod663eb449_2315_426d_89ea_7fa437373dac.slice/crio-33f9b347bc76b713f0a94631ea90ec2d1ff9a81e29b5f2fe18d047b9e38dc5d6 WatchSource:0}: Error finding container 33f9b347bc76b713f0a94631ea90ec2d1ff9a81e29b5f2fe18d047b9e38dc5d6: Status 404 returned error can't find the container with id 33f9b347bc76b713f0a94631ea90ec2d1ff9a81e29b5f2fe18d047b9e38dc5d6 Mar 20 07:10:31 crc kubenswrapper[4971]: I0320 07:10:31.467070 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" event={"ID":"663eb449-2315-426d-89ea-7fa437373dac","Type":"ContainerStarted","Data":"33f9b347bc76b713f0a94631ea90ec2d1ff9a81e29b5f2fe18d047b9e38dc5d6"} Mar 20 07:10:31 crc kubenswrapper[4971]: I0320 07:10:31.469177 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-k2mdp" event={"ID":"6e3a6967-aab7-4dac-abce-8d3189935dc4","Type":"ContainerStarted","Data":"227656cc13518d8826ab2b153749854708847cc6d228a423e19779955d5fc763"} Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.204042 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-r7nv7"] Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.234023 4971 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-m8kz6"] Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.238329 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.263133 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-m8kz6"] Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.310478 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhf4\" (UniqueName: \"kubernetes.io/projected/b9eb0be1-51da-41ee-b9cc-0a555d094b16-kube-api-access-lxhf4\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.310536 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-config\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.310568 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.411807 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhf4\" (UniqueName: \"kubernetes.io/projected/b9eb0be1-51da-41ee-b9cc-0a555d094b16-kube-api-access-lxhf4\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " 
pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.411862 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-config\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.411891 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.412859 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.413238 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-config\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.452100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhf4\" (UniqueName: \"kubernetes.io/projected/b9eb0be1-51da-41ee-b9cc-0a555d094b16-kube-api-access-lxhf4\") pod \"dnsmasq-dns-854f47b4f9-m8kz6\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.511458 4971 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-k2mdp"] Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.543591 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-lzjrm"] Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.545253 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.558420 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-lzjrm"] Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.568543 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.617181 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-config\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.617325 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.618466 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7zk\" (UniqueName: \"kubernetes.io/projected/80f15589-52d0-4bb9-972e-82d3860486ec-kube-api-access-ck7zk\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" 
Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.722503 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.722825 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7zk\" (UniqueName: \"kubernetes.io/projected/80f15589-52d0-4bb9-972e-82d3860486ec-kube-api-access-ck7zk\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.722909 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-config\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.725188 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-config\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.725744 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.753542 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ck7zk\" (UniqueName: \"kubernetes.io/projected/80f15589-52d0-4bb9-972e-82d3860486ec-kube-api-access-ck7zk\") pod \"dnsmasq-dns-54b5dffb47-lzjrm\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") " pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.874379 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:33 crc kubenswrapper[4971]: I0320 07:10:33.874462 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-m8kz6"] Mar 20 07:10:33 crc kubenswrapper[4971]: W0320 07:10:33.892472 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9eb0be1_51da_41ee_b9cc_0a555d094b16.slice/crio-1401fefafb93114bc1446c2de0fee089048d41cc5fd1e5f83ceca7fa1c9c9965 WatchSource:0}: Error finding container 1401fefafb93114bc1446c2de0fee089048d41cc5fd1e5f83ceca7fa1c9c9965: Status 404 returned error can't find the container with id 1401fefafb93114bc1446c2de0fee089048d41cc5fd1e5f83ceca7fa1c9c9965 Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.325260 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-lzjrm"] Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.407979 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.412949 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.416458 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8dd22" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.416821 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.417106 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.418056 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.418391 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.418691 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.419057 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.427120 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.497523 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" event={"ID":"b9eb0be1-51da-41ee-b9cc-0a555d094b16","Type":"ContainerStarted","Data":"1401fefafb93114bc1446c2de0fee089048d41cc5fd1e5f83ceca7fa1c9c9965"} Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.499245 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" 
event={"ID":"80f15589-52d0-4bb9-972e-82d3860486ec","Type":"ContainerStarted","Data":"a04b8d554f889b70d3dcdb14e59f76ec18f3a4d897ccdb3a8b72fe56b93ffde7"} Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548159 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6tcn\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-kube-api-access-b6tcn\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548208 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548238 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a71f8d2d-729c-4b7c-89c7-a06bd2216978-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548259 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548277 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548292 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a71f8d2d-729c-4b7c-89c7-a06bd2216978-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548330 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548362 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548378 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548397 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " 
pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.548430 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.649691 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.649753 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650351 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6tcn\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-kube-api-access-b6tcn\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650445 
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650478 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a71f8d2d-729c-4b7c-89c7-a06bd2216978-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650500 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a71f8d2d-729c-4b7c-89c7-a06bd2216978-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650516 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650533 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650571 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650630 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650656 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.650673 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.651767 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.652969 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.653111 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.653304 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.675490 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.679661 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.682189 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a71f8d2d-729c-4b7c-89c7-a06bd2216978-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.688891 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a71f8d2d-729c-4b7c-89c7-a06bd2216978-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.693505 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6tcn\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-kube-api-access-b6tcn\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.694465 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.700210 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.740581 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.747007 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.752635 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.752675 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.753181 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.754191 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.754334 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f7qhq" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.761571 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.764016 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.854411 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.864943 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1023ed0c-3abe-4cff-987f-52544b885696-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865017 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865056 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1023ed0c-3abe-4cff-987f-52544b885696-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865078 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc 
kubenswrapper[4971]: I0320 07:10:34.865094 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865109 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pksdb\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-kube-api-access-pksdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865134 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865202 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865221 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865249 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.865269 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966242 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1023ed0c-3abe-4cff-987f-52544b885696-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966301 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966324 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1023ed0c-3abe-4cff-987f-52544b885696-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966344 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966360 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966375 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pksdb\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-kube-api-access-pksdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966397 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966437 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966454 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966475 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966495 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.966888 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.967372 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.968308 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.968403 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.970303 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.971785 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.973441 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1023ed0c-3abe-4cff-987f-52544b885696-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.973467 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 
07:10:34.973665 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.973665 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1023ed0c-3abe-4cff-987f-52544b885696-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.984414 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pksdb\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-kube-api-access-pksdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:34 crc kubenswrapper[4971]: I0320 07:10:34.989403 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.171927 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.364892 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:10:35 crc kubenswrapper[4971]: W0320 07:10:35.411440 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71f8d2d_729c_4b7c_89c7_a06bd2216978.slice/crio-432938dfa551e87c30cd59b963f5ea472cc946f2bf3968f87989b23b752d112a WatchSource:0}: Error finding container 432938dfa551e87c30cd59b963f5ea472cc946f2bf3968f87989b23b752d112a: Status 404 returned error can't find the container with id 432938dfa551e87c30cd59b963f5ea472cc946f2bf3968f87989b23b752d112a Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.513690 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a71f8d2d-729c-4b7c-89c7-a06bd2216978","Type":"ContainerStarted","Data":"432938dfa551e87c30cd59b963f5ea472cc946f2bf3968f87989b23b752d112a"} Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.647715 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.820089 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.821967 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.830442 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.840735 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-r2v86" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.840891 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.841222 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.843590 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.847956 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.881782 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.883198 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.883247 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.883571 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qvk\" (UniqueName: \"kubernetes.io/projected/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kube-api-access-n7qvk\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.883656 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.883695 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kolla-config\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.885891 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-default\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.886001 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.988213 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.988296 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.988590 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qvk\" (UniqueName: \"kubernetes.io/projected/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kube-api-access-n7qvk\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.988649 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.988675 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kolla-config\") pod \"openstack-galera-0\" 
(UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.988728 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-default\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.988758 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.988818 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.991053 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.991122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 
07:10:35.992158 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kolla-config\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.992975 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.998084 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:35 crc kubenswrapper[4971]: I0320 07:10:35.998248 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-default\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:36 crc kubenswrapper[4971]: I0320 07:10:36.017016 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qvk\" (UniqueName: \"kubernetes.io/projected/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kube-api-access-n7qvk\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:36 crc kubenswrapper[4971]: I0320 07:10:36.031461 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:36 crc kubenswrapper[4971]: I0320 07:10:36.041589 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " pod="openstack/openstack-galera-0" Mar 20 07:10:36 crc kubenswrapper[4971]: I0320 07:10:36.176657 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 07:10:36 crc kubenswrapper[4971]: I0320 07:10:36.526748 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1023ed0c-3abe-4cff-987f-52544b885696","Type":"ContainerStarted","Data":"d8878d860f20305c6f32ab33f897cfdab0eeeb47ee5127464f46989bbaccb1d4"} Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.158748 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.164180 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.175469 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.175518 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-llftp" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.175669 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.175781 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.193563 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.324825 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.324878 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.324900 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.325027 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.325054 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.325130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5496b78b-549b-4076-a150-783bbba8896c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.325189 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.325255 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4kt9\" (UniqueName: 
\"kubernetes.io/projected/5496b78b-549b-4076-a150-783bbba8896c-kube-api-access-c4kt9\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.427040 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.427110 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.427197 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.427744 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.428693 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.429725 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.430851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.430900 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.430930 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5496b78b-549b-4076-a150-783bbba8896c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.430974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.431055 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4kt9\" (UniqueName: \"kubernetes.io/projected/5496b78b-549b-4076-a150-783bbba8896c-kube-api-access-c4kt9\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.431203 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5496b78b-549b-4076-a150-783bbba8896c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.432253 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.435651 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.442766 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " 
pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.457972 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.477158 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4kt9\" (UniqueName: \"kubernetes.io/projected/5496b78b-549b-4076-a150-783bbba8896c-kube-api-access-c4kt9\") pod \"openstack-cell1-galera-0\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.502352 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.503399 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.504495 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.506206 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.506522 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-z5dkd" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.506675 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.509065 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.634017 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.634076 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.634116 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbpq\" (UniqueName: \"kubernetes.io/projected/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kube-api-access-gsbpq\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.634144 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kolla-config\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.634210 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-config-data\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.735386 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kolla-config\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.735488 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-config-data\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.735523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.735553 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.735582 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbpq\" (UniqueName: \"kubernetes.io/projected/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kube-api-access-gsbpq\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.736067 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kolla-config\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.736859 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-config-data\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.739781 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.749012 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.754058 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbpq\" 
(UniqueName: \"kubernetes.io/projected/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kube-api-access-gsbpq\") pod \"memcached-0\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " pod="openstack/memcached-0" Mar 20 07:10:37 crc kubenswrapper[4971]: I0320 07:10:37.871570 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.622080 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.625357 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.628681 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-bgj4z" Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.638401 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.767040 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lfgd\" (UniqueName: \"kubernetes.io/projected/42480507-2b62-465a-9af4-07ea0ff3be81-kube-api-access-4lfgd\") pod \"kube-state-metrics-0\" (UID: \"42480507-2b62-465a-9af4-07ea0ff3be81\") " pod="openstack/kube-state-metrics-0" Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.868710 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lfgd\" (UniqueName: \"kubernetes.io/projected/42480507-2b62-465a-9af4-07ea0ff3be81-kube-api-access-4lfgd\") pod \"kube-state-metrics-0\" (UID: \"42480507-2b62-465a-9af4-07ea0ff3be81\") " pod="openstack/kube-state-metrics-0" Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.904641 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lfgd\" (UniqueName: 
\"kubernetes.io/projected/42480507-2b62-465a-9af4-07ea0ff3be81-kube-api-access-4lfgd\") pod \"kube-state-metrics-0\" (UID: \"42480507-2b62-465a-9af4-07ea0ff3be81\") " pod="openstack/kube-state-metrics-0" Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.975573 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 07:10:39 crc kubenswrapper[4971]: I0320 07:10:39.996569 4971 scope.go:117] "RemoveContainer" containerID="d18203a999ce539dd70c56ece8133269ba47a469beca44ebad7eac7b0328a465" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.412078 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pfhnr"] Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.413443 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.421449 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.421552 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.421833 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-q2425" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.429910 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pfhnr"] Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.441970 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-njjzt"] Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.444081 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.452307 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-njjzt"] Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.515734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-etc-ovs\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516096 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9jf8\" (UniqueName: \"kubernetes.io/projected/10811481-4b40-49ae-9d75-03f0c0e02fe7-kube-api-access-z9jf8\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516173 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10811481-4b40-49ae-9d75-03f0c0e02fe7-scripts\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516197 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run-ovn\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516231 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/cabe7ab2-f272-4434-8295-641c334a0ae2-scripts\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516252 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-run\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516301 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-log-ovn\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516332 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-combined-ca-bundle\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516354 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516383 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xvnq\" (UniqueName: 
\"kubernetes.io/projected/cabe7ab2-f272-4434-8295-641c334a0ae2-kube-api-access-4xvnq\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516406 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-lib\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516434 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-ovn-controller-tls-certs\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.516475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-log\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.617936 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-log\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.617995 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-etc-ovs\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618030 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9jf8\" (UniqueName: \"kubernetes.io/projected/10811481-4b40-49ae-9d75-03f0c0e02fe7-kube-api-access-z9jf8\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618062 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10811481-4b40-49ae-9d75-03f0c0e02fe7-scripts\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618079 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run-ovn\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618104 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-run\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618123 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cabe7ab2-f272-4434-8295-641c334a0ae2-scripts\") pod \"ovn-controller-ovs-njjzt\" (UID: 
\"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618155 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-log-ovn\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618192 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-combined-ca-bundle\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618213 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xvnq\" (UniqueName: \"kubernetes.io/projected/cabe7ab2-f272-4434-8295-641c334a0ae2-kube-api-access-4xvnq\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618229 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-lib\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 
07:10:42.618251 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-ovn-controller-tls-certs\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.618720 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-etc-ovs\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.620270 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10811481-4b40-49ae-9d75-03f0c0e02fe7-scripts\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.621129 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-log\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.622966 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cabe7ab2-f272-4434-8295-641c334a0ae2-scripts\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.623153 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.623182 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-run\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.628045 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-combined-ca-bundle\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.628868 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-log-ovn\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.629189 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-ovn-controller-tls-certs\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.634966 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-lib\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " 
pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.635265 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run-ovn\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.636110 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xvnq\" (UniqueName: \"kubernetes.io/projected/cabe7ab2-f272-4434-8295-641c334a0ae2-kube-api-access-4xvnq\") pod \"ovn-controller-ovs-njjzt\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.652272 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9jf8\" (UniqueName: \"kubernetes.io/projected/10811481-4b40-49ae-9d75-03f0c0e02fe7-kube-api-access-z9jf8\") pod \"ovn-controller-pfhnr\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.752106 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:42 crc kubenswrapper[4971]: I0320 07:10:42.756991 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.397657 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.399358 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.403948 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.403981 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.405056 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.405325 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.405538 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6kv99" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.407309 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.464384 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.464442 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-config\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.464484 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.464508 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.464530 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdvj9\" (UniqueName: \"kubernetes.io/projected/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-kube-api-access-bdvj9\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.464556 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.464599 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.464660 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.566278 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.566331 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.566374 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.566414 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-config\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.566475 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 
crc kubenswrapper[4971]: I0320 07:10:45.566507 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.566542 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdvj9\" (UniqueName: \"kubernetes.io/projected/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-kube-api-access-bdvj9\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.566590 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.568220 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.568372 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.568792 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-config\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.568837 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.574953 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.580346 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.581452 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.583430 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdvj9\" (UniqueName: \"kubernetes.io/projected/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-kube-api-access-bdvj9\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " 
pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.598789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:45 crc kubenswrapper[4971]: I0320 07:10:45.724773 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.795135 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.797190 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.799118 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2296s" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.803076 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.804472 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.805154 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.806229 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.889524 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.889712 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw7bj\" (UniqueName: \"kubernetes.io/projected/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-kube-api-access-cw7bj\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.889799 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.889835 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.890180 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.890315 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.890365 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.890388 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.991777 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.991861 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw7bj\" (UniqueName: \"kubernetes.io/projected/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-kube-api-access-cw7bj\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.991894 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " 
pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.991923 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.991974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.992010 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-config\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.992033 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.992054 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.992310 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.992738 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.993249 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-config\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.993705 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.996109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 crc kubenswrapper[4971]: I0320 07:10:46.996167 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:46 
crc kubenswrapper[4971]: I0320 07:10:46.997761 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:47 crc kubenswrapper[4971]: I0320 07:10:47.006927 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw7bj\" (UniqueName: \"kubernetes.io/projected/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-kube-api-access-cw7bj\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:47 crc kubenswrapper[4971]: I0320 07:10:47.015381 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:47 crc kubenswrapper[4971]: I0320 07:10:47.122130 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 07:10:48 crc kubenswrapper[4971]: E0320 07:10:48.624879 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 20 07:10:48 crc kubenswrapper[4971]: E0320 07:10:48.625485 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxx5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Readines
sProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-k2mdp_openstack(6e3a6967-aab7-4dac-abce-8d3189935dc4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:10:48 crc kubenswrapper[4971]: E0320 07:10:48.627053 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-k2mdp" podUID="6e3a6967-aab7-4dac-abce-8d3189935dc4" Mar 20 07:10:49 crc kubenswrapper[4971]: I0320 07:10:49.888475 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:49 crc kubenswrapper[4971]: E0320 07:10:49.889655 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 20 07:10:49 crc kubenswrapper[4971]: E0320 07:10:49.889789 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9bwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&
Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-r7nv7_openstack(663eb449-2315-426d-89ea-7fa437373dac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:10:49 crc kubenswrapper[4971]: E0320 07:10:49.891174 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" podUID="663eb449-2315-426d-89ea-7fa437373dac" Mar 20 07:10:49 crc kubenswrapper[4971]: I0320 07:10:49.940413 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-dns-svc\") pod \"6e3a6967-aab7-4dac-abce-8d3189935dc4\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " Mar 20 07:10:49 crc kubenswrapper[4971]: I0320 07:10:49.940509 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxx5b\" (UniqueName: \"kubernetes.io/projected/6e3a6967-aab7-4dac-abce-8d3189935dc4-kube-api-access-zxx5b\") pod \"6e3a6967-aab7-4dac-abce-8d3189935dc4\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " Mar 20 07:10:49 crc kubenswrapper[4971]: I0320 07:10:49.940628 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-config\") pod \"6e3a6967-aab7-4dac-abce-8d3189935dc4\" (UID: \"6e3a6967-aab7-4dac-abce-8d3189935dc4\") " Mar 20 07:10:49 crc kubenswrapper[4971]: I0320 07:10:49.942016 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e3a6967-aab7-4dac-abce-8d3189935dc4" (UID: "6e3a6967-aab7-4dac-abce-8d3189935dc4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:10:49 crc kubenswrapper[4971]: I0320 07:10:49.942041 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-config" (OuterVolumeSpecName: "config") pod "6e3a6967-aab7-4dac-abce-8d3189935dc4" (UID: "6e3a6967-aab7-4dac-abce-8d3189935dc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:10:49 crc kubenswrapper[4971]: I0320 07:10:49.944763 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3a6967-aab7-4dac-abce-8d3189935dc4-kube-api-access-zxx5b" (OuterVolumeSpecName: "kube-api-access-zxx5b") pod "6e3a6967-aab7-4dac-abce-8d3189935dc4" (UID: "6e3a6967-aab7-4dac-abce-8d3189935dc4"). InnerVolumeSpecName "kube-api-access-zxx5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.051568 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.051816 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxx5b\" (UniqueName: \"kubernetes.io/projected/6e3a6967-aab7-4dac-abce-8d3189935dc4-kube-api-access-zxx5b\") on node \"crc\" DevicePath \"\"" Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.051829 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e3a6967-aab7-4dac-abce-8d3189935dc4-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.307283 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:10:50 crc kubenswrapper[4971]: W0320 07:10:50.310506 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42480507_2b62_465a_9af4_07ea0ff3be81.slice/crio-71fd3b4acdf6be866d6c4a30022c4edf95fe5f53fd35bcd1954fe918a0609e83 WatchSource:0}: Error finding container 71fd3b4acdf6be866d6c4a30022c4edf95fe5f53fd35bcd1954fe918a0609e83: Status 404 returned error can't find the container with id 71fd3b4acdf6be866d6c4a30022c4edf95fe5f53fd35bcd1954fe918a0609e83 Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.321264 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.335193 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 07:10:50 crc kubenswrapper[4971]: W0320 07:10:50.338111 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea363773_c4ed_4a4c_b0d6_e6d9eb5e1d89.slice/crio-a5812ba7bc663316b16eb74795bf71eeabba37dba7a064db87080e76a1f7566a WatchSource:0}: Error finding container a5812ba7bc663316b16eb74795bf71eeabba37dba7a064db87080e76a1f7566a: Status 404 returned error can't find the container with id a5812ba7bc663316b16eb74795bf71eeabba37dba7a064db87080e76a1f7566a Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.484305 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.547914 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pfhnr"] Mar 20 07:10:50 crc kubenswrapper[4971]: W0320 07:10:50.549532 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10811481_4b40_49ae_9d75_03f0c0e02fe7.slice/crio-7fa10eb638b51a171e2bf9700bb577201953548cbbba1b5705a44c4239502d08 WatchSource:0}: Error finding container 7fa10eb638b51a171e2bf9700bb577201953548cbbba1b5705a44c4239502d08: Status 404 returned error can't find the container with id 7fa10eb638b51a171e2bf9700bb577201953548cbbba1b5705a44c4239502d08 Mar 20 07:10:50 crc kubenswrapper[4971]: W0320 07:10:50.584337 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63654bb6_f2be_42d2_bb83_0b96bb2ca8cb.slice/crio-1bf54ed1e41b5d8cdfd46ea34b84caf9dd6b6efb0a8ec24da65c995e729fc21b WatchSource:0}: Error finding container 1bf54ed1e41b5d8cdfd46ea34b84caf9dd6b6efb0a8ec24da65c995e729fc21b: Status 404 returned error can't find the container with id 1bf54ed1e41b5d8cdfd46ea34b84caf9dd6b6efb0a8ec24da65c995e729fc21b Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.586633 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 07:10:50 crc kubenswrapper[4971]: 
I0320 07:10:50.666371 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.674319 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5496b78b-549b-4076-a150-783bbba8896c","Type":"ContainerStarted","Data":"d51fa962826579e9a55882cb0ee5e743b6bbd77c8460e5e932f99fce6197d99e"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.676372 4971 generic.go:334] "Generic (PLEG): container finished" podID="80f15589-52d0-4bb9-972e-82d3860486ec" containerID="54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1" exitCode=0 Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.676456 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" event={"ID":"80f15589-52d0-4bb9-972e-82d3860486ec","Type":"ContainerDied","Data":"54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.678244 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb","Type":"ContainerStarted","Data":"1bf54ed1e41b5d8cdfd46ea34b84caf9dd6b6efb0a8ec24da65c995e729fc21b"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.679283 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"69a5a04c-039c-4ef2-a020-1e5c16fca2b6","Type":"ContainerStarted","Data":"fe4f4dd984d01358392cc324d34364eda6d0d09b68a796511182aff4094a98e3"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.681343 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" containerID="7a8fb84cd3c710f21209e4374bda991cfc693cf132e4175c2497375433af1b4f" exitCode=0 Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.681410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" event={"ID":"b9eb0be1-51da-41ee-b9cc-0a555d094b16","Type":"ContainerDied","Data":"7a8fb84cd3c710f21209e4374bda991cfc693cf132e4175c2497375433af1b4f"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.683251 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"42480507-2b62-465a-9af4-07ea0ff3be81","Type":"ContainerStarted","Data":"71fd3b4acdf6be866d6c4a30022c4edf95fe5f53fd35bcd1954fe918a0609e83"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.684681 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pfhnr" event={"ID":"10811481-4b40-49ae-9d75-03f0c0e02fe7","Type":"ContainerStarted","Data":"7fa10eb638b51a171e2bf9700bb577201953548cbbba1b5705a44c4239502d08"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.685860 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89","Type":"ContainerStarted","Data":"a5812ba7bc663316b16eb74795bf71eeabba37dba7a064db87080e76a1f7566a"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.687410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-k2mdp" event={"ID":"6e3a6967-aab7-4dac-abce-8d3189935dc4","Type":"ContainerDied","Data":"227656cc13518d8826ab2b153749854708847cc6d228a423e19779955d5fc763"} Mar 20 07:10:50 crc kubenswrapper[4971]: I0320 07:10:50.687452 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-k2mdp" Mar 20 07:10:50 crc kubenswrapper[4971]: W0320 07:10:50.717740 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e53ff8e_9df7_411c_b2ad_e4c32c0209c6.slice/crio-0c1148305d758a96d6060d3bb931b807ce2bfaf17d241d6f072f2ed34c0128f5 WatchSource:0}: Error finding container 0c1148305d758a96d6060d3bb931b807ce2bfaf17d241d6f072f2ed34c0128f5: Status 404 returned error can't find the container with id 0c1148305d758a96d6060d3bb931b807ce2bfaf17d241d6f072f2ed34c0128f5 Mar 20 07:10:51 crc kubenswrapper[4971]: E0320 07:10:51.098349 4971 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 07:10:51 crc kubenswrapper[4971]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b9eb0be1-51da-41ee-b9cc-0a555d094b16/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 07:10:51 crc kubenswrapper[4971]: > podSandboxID="1401fefafb93114bc1446c2de0fee089048d41cc5fd1e5f83ceca7fa1c9c9965" Mar 20 07:10:51 crc kubenswrapper[4971]: E0320 07:10:51.099320 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:10:51 crc kubenswrapper[4971]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxhf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-854f47b4f9-m8kz6_openstack(b9eb0be1-51da-41ee-b9cc-0a555d094b16): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b9eb0be1-51da-41ee-b9cc-0a555d094b16/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 07:10:51 crc kubenswrapper[4971]: > logger="UnhandledError" Mar 20 07:10:51 crc kubenswrapper[4971]: E0320 07:10:51.100517 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b9eb0be1-51da-41ee-b9cc-0a555d094b16/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" podUID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.124382 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.177017 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/663eb449-2315-426d-89ea-7fa437373dac-config\") pod \"663eb449-2315-426d-89ea-7fa437373dac\" (UID: \"663eb449-2315-426d-89ea-7fa437373dac\") " Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.177144 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9bwz\" (UniqueName: \"kubernetes.io/projected/663eb449-2315-426d-89ea-7fa437373dac-kube-api-access-k9bwz\") pod \"663eb449-2315-426d-89ea-7fa437373dac\" (UID: \"663eb449-2315-426d-89ea-7fa437373dac\") " Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.177514 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/663eb449-2315-426d-89ea-7fa437373dac-config" (OuterVolumeSpecName: "config") pod "663eb449-2315-426d-89ea-7fa437373dac" (UID: "663eb449-2315-426d-89ea-7fa437373dac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.177708 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/663eb449-2315-426d-89ea-7fa437373dac-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.183024 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/663eb449-2315-426d-89ea-7fa437373dac-kube-api-access-k9bwz" (OuterVolumeSpecName: "kube-api-access-k9bwz") pod "663eb449-2315-426d-89ea-7fa437373dac" (UID: "663eb449-2315-426d-89ea-7fa437373dac"). InnerVolumeSpecName "kube-api-access-k9bwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.280498 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9bwz\" (UniqueName: \"kubernetes.io/projected/663eb449-2315-426d-89ea-7fa437373dac-kube-api-access-k9bwz\") on node \"crc\" DevicePath \"\"" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.662203 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-njjzt"] Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.699110 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1023ed0c-3abe-4cff-987f-52544b885696","Type":"ContainerStarted","Data":"8ea2d6fbd32a2d37d300ac896cec5808faa6aa48fc9bb628562475b560b88284"} Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.702805 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" event={"ID":"80f15589-52d0-4bb9-972e-82d3860486ec","Type":"ContainerStarted","Data":"ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3"} Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.702934 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.705196 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.705202 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-r7nv7" event={"ID":"663eb449-2315-426d-89ea-7fa437373dac","Type":"ContainerDied","Data":"33f9b347bc76b713f0a94631ea90ec2d1ff9a81e29b5f2fe18d047b9e38dc5d6"} Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.706915 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a71f8d2d-729c-4b7c-89c7-a06bd2216978","Type":"ContainerStarted","Data":"6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1"} Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.708648 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6","Type":"ContainerStarted","Data":"0c1148305d758a96d6060d3bb931b807ce2bfaf17d241d6f072f2ed34c0128f5"} Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.793914 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" podStartSLOduration=3.307084714 podStartE2EDuration="18.793894452s" podCreationTimestamp="2026-03-20 07:10:33 +0000 UTC" firstStartedPulling="2026-03-20 07:10:34.330668758 +0000 UTC m=+1256.310542886" lastFinishedPulling="2026-03-20 07:10:49.817478486 +0000 UTC m=+1271.797352624" observedRunningTime="2026-03-20 07:10:51.786778975 +0000 UTC m=+1273.766653133" watchObservedRunningTime="2026-03-20 07:10:51.793894452 +0000 UTC m=+1273.773768590" Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.860791 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-r7nv7"] Mar 20 07:10:51 crc kubenswrapper[4971]: I0320 07:10:51.881733 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-r7nv7"] Mar 20 07:10:52 crc kubenswrapper[4971]: I0320 
07:10:52.743739 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="663eb449-2315-426d-89ea-7fa437373dac" path="/var/lib/kubelet/pods/663eb449-2315-426d-89ea-7fa437373dac/volumes" Mar 20 07:10:53 crc kubenswrapper[4971]: W0320 07:10:53.820454 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcabe7ab2_f272_4434_8295_641c334a0ae2.slice/crio-fbee2fbecd5c87d302299b54299393cd44b0f6afc1e354555b36b6cdae864349 WatchSource:0}: Error finding container fbee2fbecd5c87d302299b54299393cd44b0f6afc1e354555b36b6cdae864349: Status 404 returned error can't find the container with id fbee2fbecd5c87d302299b54299393cd44b0f6afc1e354555b36b6cdae864349 Mar 20 07:10:54 crc kubenswrapper[4971]: I0320 07:10:54.742178 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-njjzt" event={"ID":"cabe7ab2-f272-4434-8295-641c334a0ae2","Type":"ContainerStarted","Data":"fbee2fbecd5c87d302299b54299393cd44b0f6afc1e354555b36b6cdae864349"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.786106 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" event={"ID":"b9eb0be1-51da-41ee-b9cc-0a555d094b16","Type":"ContainerStarted","Data":"bd7d583000b590bc82de39df0965fc04bd2bcbd76dd569dc0bcc7c4f381a4bf7"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.786976 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.788052 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6","Type":"ContainerStarted","Data":"68ce7b00a908c9710adb361927cb192e2b39212d020dcbf9508b46cd83c87f9b"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.789969 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-pfhnr" event={"ID":"10811481-4b40-49ae-9d75-03f0c0e02fe7","Type":"ContainerStarted","Data":"b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.790235 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-pfhnr" Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.792060 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89","Type":"ContainerStarted","Data":"881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.792424 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.794187 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5496b78b-549b-4076-a150-783bbba8896c","Type":"ContainerStarted","Data":"2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.796764 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb","Type":"ContainerStarted","Data":"808a7c9f249b97193158020c6f0533d2ec2b303417e0c3080125fc1cc2867777"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.806555 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"69a5a04c-039c-4ef2-a020-1e5c16fca2b6","Type":"ContainerStarted","Data":"01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.812300 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" podStartSLOduration=8.829900122 podStartE2EDuration="24.812284582s" 
podCreationTimestamp="2026-03-20 07:10:33 +0000 UTC" firstStartedPulling="2026-03-20 07:10:33.896963605 +0000 UTC m=+1255.876837743" lastFinishedPulling="2026-03-20 07:10:49.879348075 +0000 UTC m=+1271.859222203" observedRunningTime="2026-03-20 07:10:57.8102972 +0000 UTC m=+1279.790171378" watchObservedRunningTime="2026-03-20 07:10:57.812284582 +0000 UTC m=+1279.792158720" Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.813848 4971 generic.go:334] "Generic (PLEG): container finished" podID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerID="d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52" exitCode=0 Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.813965 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-njjzt" event={"ID":"cabe7ab2-f272-4434-8295-641c334a0ae2","Type":"ContainerDied","Data":"d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52"} Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.849967 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-pfhnr" podStartSLOduration=9.783170219 podStartE2EDuration="15.849939973s" podCreationTimestamp="2026-03-20 07:10:42 +0000 UTC" firstStartedPulling="2026-03-20 07:10:50.552929352 +0000 UTC m=+1272.532803490" lastFinishedPulling="2026-03-20 07:10:56.619699096 +0000 UTC m=+1278.599573244" observedRunningTime="2026-03-20 07:10:57.83917625 +0000 UTC m=+1279.819050498" watchObservedRunningTime="2026-03-20 07:10:57.849939973 +0000 UTC m=+1279.829814141" Mar 20 07:10:57 crc kubenswrapper[4971]: I0320 07:10:57.873886 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.794608101 podStartE2EDuration="20.873861863s" podCreationTimestamp="2026-03-20 07:10:37 +0000 UTC" firstStartedPulling="2026-03-20 07:10:50.342083593 +0000 UTC m=+1272.321957731" lastFinishedPulling="2026-03-20 07:10:56.421337335 +0000 UTC 
m=+1278.401211493" observedRunningTime="2026-03-20 07:10:57.864414894 +0000 UTC m=+1279.844289042" watchObservedRunningTime="2026-03-20 07:10:57.873861863 +0000 UTC m=+1279.853736011" Mar 20 07:10:58 crc kubenswrapper[4971]: I0320 07:10:58.824353 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-njjzt" event={"ID":"cabe7ab2-f272-4434-8295-641c334a0ae2","Type":"ContainerStarted","Data":"68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276"} Mar 20 07:10:58 crc kubenswrapper[4971]: I0320 07:10:58.824588 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-njjzt" event={"ID":"cabe7ab2-f272-4434-8295-641c334a0ae2","Type":"ContainerStarted","Data":"0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6"} Mar 20 07:10:58 crc kubenswrapper[4971]: I0320 07:10:58.861338 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-njjzt" podStartSLOduration=13.91041855 podStartE2EDuration="16.861315131s" podCreationTimestamp="2026-03-20 07:10:42 +0000 UTC" firstStartedPulling="2026-03-20 07:10:53.851687648 +0000 UTC m=+1275.831561826" lastFinishedPulling="2026-03-20 07:10:56.802584269 +0000 UTC m=+1278.782458407" observedRunningTime="2026-03-20 07:10:58.853010802 +0000 UTC m=+1280.832884950" watchObservedRunningTime="2026-03-20 07:10:58.861315131 +0000 UTC m=+1280.841189279" Mar 20 07:10:58 crc kubenswrapper[4971]: I0320 07:10:58.875639 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" Mar 20 07:10:58 crc kubenswrapper[4971]: I0320 07:10:58.980338 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-m8kz6"] Mar 20 07:10:59 crc kubenswrapper[4971]: I0320 07:10:59.834096 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" 
podUID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" containerName="dnsmasq-dns" containerID="cri-o://bd7d583000b590bc82de39df0965fc04bd2bcbd76dd569dc0bcc7c4f381a4bf7" gracePeriod=10 Mar 20 07:10:59 crc kubenswrapper[4971]: I0320 07:10:59.834461 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:10:59 crc kubenswrapper[4971]: I0320 07:10:59.834487 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.843137 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" event={"ID":"b9eb0be1-51da-41ee-b9cc-0a555d094b16","Type":"ContainerDied","Data":"bd7d583000b590bc82de39df0965fc04bd2bcbd76dd569dc0bcc7c4f381a4bf7"} Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.843132 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" containerID="bd7d583000b590bc82de39df0965fc04bd2bcbd76dd569dc0bcc7c4f381a4bf7" exitCode=0 Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.843541 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" event={"ID":"b9eb0be1-51da-41ee-b9cc-0a555d094b16","Type":"ContainerDied","Data":"1401fefafb93114bc1446c2de0fee089048d41cc5fd1e5f83ceca7fa1c9c9965"} Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.843587 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1401fefafb93114bc1446c2de0fee089048d41cc5fd1e5f83ceca7fa1c9c9965" Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.848456 4971 generic.go:334] "Generic (PLEG): container finished" podID="5496b78b-549b-4076-a150-783bbba8896c" containerID="2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c" exitCode=0 Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.848502 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"5496b78b-549b-4076-a150-783bbba8896c","Type":"ContainerDied","Data":"2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c"} Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.852366 4971 generic.go:334] "Generic (PLEG): container finished" podID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" containerID="01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e" exitCode=0 Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.852436 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"69a5a04c-039c-4ef2-a020-1e5c16fca2b6","Type":"ContainerDied","Data":"01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e"} Mar 20 07:11:00 crc kubenswrapper[4971]: I0320 07:11:00.894205 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.083036 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-config\") pod \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.083145 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-dns-svc\") pod \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.083187 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxhf4\" (UniqueName: \"kubernetes.io/projected/b9eb0be1-51da-41ee-b9cc-0a555d094b16-kube-api-access-lxhf4\") pod \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\" (UID: \"b9eb0be1-51da-41ee-b9cc-0a555d094b16\") " Mar 20 07:11:01 
crc kubenswrapper[4971]: I0320 07:11:01.088346 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9eb0be1-51da-41ee-b9cc-0a555d094b16-kube-api-access-lxhf4" (OuterVolumeSpecName: "kube-api-access-lxhf4") pod "b9eb0be1-51da-41ee-b9cc-0a555d094b16" (UID: "b9eb0be1-51da-41ee-b9cc-0a555d094b16"). InnerVolumeSpecName "kube-api-access-lxhf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.122312 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9eb0be1-51da-41ee-b9cc-0a555d094b16" (UID: "b9eb0be1-51da-41ee-b9cc-0a555d094b16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.126617 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-config" (OuterVolumeSpecName: "config") pod "b9eb0be1-51da-41ee-b9cc-0a555d094b16" (UID: "b9eb0be1-51da-41ee-b9cc-0a555d094b16"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.185117 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.185160 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9eb0be1-51da-41ee-b9cc-0a555d094b16-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.185173 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxhf4\" (UniqueName: \"kubernetes.io/projected/b9eb0be1-51da-41ee-b9cc-0a555d094b16-kube-api-access-lxhf4\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.861680 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5496b78b-549b-4076-a150-783bbba8896c","Type":"ContainerStarted","Data":"a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3"} Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.863433 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb","Type":"ContainerStarted","Data":"2055269c243f22fc79e97f8f94001b54b54ef95b9b6b344a99916b8ea073e623"} Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.866860 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"69a5a04c-039c-4ef2-a020-1e5c16fca2b6","Type":"ContainerStarted","Data":"1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a"} Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.868759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"42480507-2b62-465a-9af4-07ea0ff3be81","Type":"ContainerStarted","Data":"156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61"} Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.869149 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.871333 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-m8kz6" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.873954 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6","Type":"ContainerStarted","Data":"46bf584ea3433e9ac46772bd8db9e28130ad0da039469060e5e16331e01c70aa"} Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.889520 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.412858466 podStartE2EDuration="25.889491326s" podCreationTimestamp="2026-03-20 07:10:36 +0000 UTC" firstStartedPulling="2026-03-20 07:10:50.328693971 +0000 UTC m=+1272.308568109" lastFinishedPulling="2026-03-20 07:10:56.805326831 +0000 UTC m=+1278.785200969" observedRunningTime="2026-03-20 07:11:01.889458145 +0000 UTC m=+1283.869332283" watchObservedRunningTime="2026-03-20 07:11:01.889491326 +0000 UTC m=+1283.869365494" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.920569 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.281860076 podStartE2EDuration="16.920547133s" podCreationTimestamp="2026-03-20 07:10:45 +0000 UTC" firstStartedPulling="2026-03-20 07:10:50.720213795 +0000 UTC m=+1272.700087933" lastFinishedPulling="2026-03-20 07:11:01.358900852 +0000 UTC m=+1283.338774990" observedRunningTime="2026-03-20 07:11:01.915852559 +0000 UTC m=+1283.895726717" 
watchObservedRunningTime="2026-03-20 07:11:01.920547133 +0000 UTC m=+1283.900421291" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.940459 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.824052867 podStartE2EDuration="27.940431466s" podCreationTimestamp="2026-03-20 07:10:34 +0000 UTC" firstStartedPulling="2026-03-20 07:10:50.504009145 +0000 UTC m=+1272.483883303" lastFinishedPulling="2026-03-20 07:10:56.620387724 +0000 UTC m=+1278.600261902" observedRunningTime="2026-03-20 07:11:01.934902651 +0000 UTC m=+1283.914776829" watchObservedRunningTime="2026-03-20 07:11:01.940431466 +0000 UTC m=+1283.920305624" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.974542 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.159391133 podStartE2EDuration="19.974525404s" podCreationTimestamp="2026-03-20 07:10:42 +0000 UTC" firstStartedPulling="2026-03-20 07:10:50.585693044 +0000 UTC m=+1272.565567182" lastFinishedPulling="2026-03-20 07:11:01.400827315 +0000 UTC m=+1283.380701453" observedRunningTime="2026-03-20 07:11:01.967986502 +0000 UTC m=+1283.947860640" watchObservedRunningTime="2026-03-20 07:11:01.974525404 +0000 UTC m=+1283.954399542" Mar 20 07:11:01 crc kubenswrapper[4971]: I0320 07:11:01.990045 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.952871188 podStartE2EDuration="22.990021662s" podCreationTimestamp="2026-03-20 07:10:39 +0000 UTC" firstStartedPulling="2026-03-20 07:10:50.321169272 +0000 UTC m=+1272.301043410" lastFinishedPulling="2026-03-20 07:11:01.358319736 +0000 UTC m=+1283.338193884" observedRunningTime="2026-03-20 07:11:01.983305665 +0000 UTC m=+1283.963179843" watchObservedRunningTime="2026-03-20 07:11:01.990021662 +0000 UTC m=+1283.969895840" Mar 20 07:11:02 crc kubenswrapper[4971]: I0320 
07:11:02.015823 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-m8kz6"] Mar 20 07:11:02 crc kubenswrapper[4971]: I0320 07:11:02.025346 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-m8kz6"] Mar 20 07:11:02 crc kubenswrapper[4971]: E0320 07:11:02.077278 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.119:43504->38.102.83.119:38499: write tcp 38.102.83.119:43504->38.102.83.119:38499: write: broken pipe Mar 20 07:11:02 crc kubenswrapper[4971]: I0320 07:11:02.123164 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 07:11:02 crc kubenswrapper[4971]: I0320 07:11:02.123292 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 07:11:02 crc kubenswrapper[4971]: I0320 07:11:02.159916 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 07:11:02 crc kubenswrapper[4971]: I0320 07:11:02.744967 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" path="/var/lib/kubelet/pods/b9eb0be1-51da-41ee-b9cc-0a555d094b16/volumes" Mar 20 07:11:02 crc kubenswrapper[4971]: I0320 07:11:02.874171 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 07:11:02 crc kubenswrapper[4971]: I0320 07:11:02.965810 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.254215 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-rh67h"] Mar 20 07:11:03 crc kubenswrapper[4971]: E0320 07:11:03.254538 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" containerName="init" Mar 20 
07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.254554 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" containerName="init" Mar 20 07:11:03 crc kubenswrapper[4971]: E0320 07:11:03.254575 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" containerName="dnsmasq-dns" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.254582 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" containerName="dnsmasq-dns" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.254740 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eb0be1-51da-41ee-b9cc-0a555d094b16" containerName="dnsmasq-dns" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.255462 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.258055 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.267688 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-rh67h"] Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.284703 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-88l8f"] Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.285712 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.297004 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.306196 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-88l8f"] Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.325736 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-config\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.325789 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5snwv\" (UniqueName: \"kubernetes.io/projected/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-kube-api-access-5snwv\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.325813 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.325844 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-dns-svc\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " 
pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427297 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-dns-svc\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427378 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovn-rundir\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-combined-ca-bundle\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427468 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mxwg\" (UniqueName: \"kubernetes.io/projected/852c9ecc-4ab1-4142-8a63-e73c265c3866-kube-api-access-8mxwg\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427525 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-88l8f\" (UID: 
\"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427547 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/852c9ecc-4ab1-4142-8a63-e73c265c3866-config\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427578 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-config\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5snwv\" (UniqueName: \"kubernetes.io/projected/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-kube-api-access-5snwv\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427744 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.427777 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovs-rundir\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") 
" pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.428402 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-config\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.428482 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-ovsdbserver-sb\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.428914 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-dns-svc\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.451012 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5snwv\" (UniqueName: \"kubernetes.io/projected/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-kube-api-access-5snwv\") pod \"dnsmasq-dns-7988f9db49-rh67h\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.530172 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovn-rundir\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.530284 
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-combined-ca-bundle\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.530315 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mxwg\" (UniqueName: \"kubernetes.io/projected/852c9ecc-4ab1-4142-8a63-e73c265c3866-kube-api-access-8mxwg\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.530345 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.530373 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/852c9ecc-4ab1-4142-8a63-e73c265c3866-config\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.530404 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovs-rundir\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.530838 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovs-rundir\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.530952 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovn-rundir\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.531531 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/852c9ecc-4ab1-4142-8a63-e73c265c3866-config\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.536646 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.536884 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-combined-ca-bundle\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.549064 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mxwg\" (UniqueName: 
\"kubernetes.io/projected/852c9ecc-4ab1-4142-8a63-e73c265c3866-kube-api-access-8mxwg\") pod \"ovn-controller-metrics-88l8f\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.562059 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-rh67h"] Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.562734 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.594221 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r597g"] Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.596199 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.600309 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.604660 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r597g"] Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.619055 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.725401 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.736081 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.736129 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-config\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.736151 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxlvq\" (UniqueName: \"kubernetes.io/projected/fb1c6902-1390-44a8-8307-8036b02ce9c7-kube-api-access-pxlvq\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.736172 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.736311 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.785165 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.837868 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.837921 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-config\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.837940 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlvq\" (UniqueName: \"kubernetes.io/projected/fb1c6902-1390-44a8-8307-8036b02ce9c7-kube-api-access-pxlvq\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.837961 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " 
pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.838014 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.838842 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.838997 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.839012 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-config\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.840180 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.862316 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxlvq\" (UniqueName: \"kubernetes.io/projected/fb1c6902-1390-44a8-8307-8036b02ce9c7-kube-api-access-pxlvq\") pod \"dnsmasq-dns-5d944d7b75-r597g\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") " pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.888240 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.929229 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 07:11:03 crc kubenswrapper[4971]: I0320 07:11:03.978227 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.033876 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-rh67h"] Mar 20 07:11:04 crc kubenswrapper[4971]: W0320 07:11:04.039724 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb45968a_c7f9_4ede_8d6c_23e7d3864d18.slice/crio-fd835b451c3214d64df33a5b045d98a79a2aa8cca8bb6a96e1f4f568daf42f0f WatchSource:0}: Error finding container fd835b451c3214d64df33a5b045d98a79a2aa8cca8bb6a96e1f4f568daf42f0f: Status 404 returned error can't find the container with id fd835b451c3214d64df33a5b045d98a79a2aa8cca8bb6a96e1f4f568daf42f0f Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.078859 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-88l8f"] Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.150707 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.152352 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.154580 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rxlps" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.154751 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.154856 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.157497 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.164267 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.258145 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.258633 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.258854 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-config\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " 
pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.258915 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.258953 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.258979 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-scripts\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.259071 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-465wm\" (UniqueName: \"kubernetes.io/projected/9064c65d-26e4-4d53-a1c1-b88f932b76be-kube-api-access-465wm\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.361064 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-config\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.361127 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.361152 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.361173 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-scripts\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.361195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-465wm\" (UniqueName: \"kubernetes.io/projected/9064c65d-26e4-4d53-a1c1-b88f932b76be-kube-api-access-465wm\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.361234 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.361257 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.362085 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-config\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.362433 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.363682 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-scripts\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.365448 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.366326 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.366560 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.381311 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-465wm\" (UniqueName: \"kubernetes.io/projected/9064c65d-26e4-4d53-a1c1-b88f932b76be-kube-api-access-465wm\") pod \"ovn-northd-0\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.480895 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.523547 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r597g"] Mar 20 07:11:04 crc kubenswrapper[4971]: W0320 07:11:04.530354 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb1c6902_1390_44a8_8307_8036b02ce9c7.slice/crio-8c41a9be5ab7880824349d22b882f86e54a5a39bc318cd8ba465da9c76a494ba WatchSource:0}: Error finding container 8c41a9be5ab7880824349d22b882f86e54a5a39bc318cd8ba465da9c76a494ba: Status 404 returned error can't find the container with id 8c41a9be5ab7880824349d22b882f86e54a5a39bc318cd8ba465da9c76a494ba Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.895424 4971 generic.go:334] "Generic (PLEG): container finished" podID="cb45968a-c7f9-4ede-8d6c-23e7d3864d18" containerID="802dcbd2235bd41503738695a42acade2e1d1c6934b9c6cc4756fb2ce23b08ec" exitCode=0 Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.895509 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-rh67h" event={"ID":"cb45968a-c7f9-4ede-8d6c-23e7d3864d18","Type":"ContainerDied","Data":"802dcbd2235bd41503738695a42acade2e1d1c6934b9c6cc4756fb2ce23b08ec"} Mar 20 
07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.895959 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-rh67h" event={"ID":"cb45968a-c7f9-4ede-8d6c-23e7d3864d18","Type":"ContainerStarted","Data":"fd835b451c3214d64df33a5b045d98a79a2aa8cca8bb6a96e1f4f568daf42f0f"} Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.897297 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-88l8f" event={"ID":"852c9ecc-4ab1-4142-8a63-e73c265c3866","Type":"ContainerStarted","Data":"cc1977222a2687951e3071cbfe9eecf3bbf33195805469e1592c465e2505d12a"} Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.897336 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-88l8f" event={"ID":"852c9ecc-4ab1-4142-8a63-e73c265c3866","Type":"ContainerStarted","Data":"7ed0122d8f7e0f61eb9b8a0727605e88c6a7ab618c703c44156d996640ea8b91"} Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.898661 4971 generic.go:334] "Generic (PLEG): container finished" podID="fb1c6902-1390-44a8-8307-8036b02ce9c7" containerID="b28d10f5cd406d72f7c52a3cb0b885fbc68bc235675df5d982bd9fda13f141a5" exitCode=0 Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.898733 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" event={"ID":"fb1c6902-1390-44a8-8307-8036b02ce9c7","Type":"ContainerDied","Data":"b28d10f5cd406d72f7c52a3cb0b885fbc68bc235675df5d982bd9fda13f141a5"} Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.898760 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" event={"ID":"fb1c6902-1390-44a8-8307-8036b02ce9c7","Type":"ContainerStarted","Data":"8c41a9be5ab7880824349d22b882f86e54a5a39bc318cd8ba465da9c76a494ba"} Mar 20 07:11:04 crc kubenswrapper[4971]: I0320 07:11:04.961481 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-metrics-88l8f" podStartSLOduration=1.9614636029999999 podStartE2EDuration="1.961463603s" podCreationTimestamp="2026-03-20 07:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:04.956350519 +0000 UTC m=+1286.936224677" watchObservedRunningTime="2026-03-20 07:11:04.961463603 +0000 UTC m=+1286.941337751" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.022390 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.227157 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.279956 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-config\") pod \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.280045 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-dns-svc\") pod \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.280107 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-ovsdbserver-sb\") pod \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.280135 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5snwv\" (UniqueName: 
\"kubernetes.io/projected/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-kube-api-access-5snwv\") pod \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\" (UID: \"cb45968a-c7f9-4ede-8d6c-23e7d3864d18\") " Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.303297 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-kube-api-access-5snwv" (OuterVolumeSpecName: "kube-api-access-5snwv") pod "cb45968a-c7f9-4ede-8d6c-23e7d3864d18" (UID: "cb45968a-c7f9-4ede-8d6c-23e7d3864d18"). InnerVolumeSpecName "kube-api-access-5snwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.305096 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-config" (OuterVolumeSpecName: "config") pod "cb45968a-c7f9-4ede-8d6c-23e7d3864d18" (UID: "cb45968a-c7f9-4ede-8d6c-23e7d3864d18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.317197 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb45968a-c7f9-4ede-8d6c-23e7d3864d18" (UID: "cb45968a-c7f9-4ede-8d6c-23e7d3864d18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.317837 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb45968a-c7f9-4ede-8d6c-23e7d3864d18" (UID: "cb45968a-c7f9-4ede-8d6c-23e7d3864d18"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.381883 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.381908 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.381917 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5snwv\" (UniqueName: \"kubernetes.io/projected/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-kube-api-access-5snwv\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.381929 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb45968a-c7f9-4ede-8d6c-23e7d3864d18-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.912430 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7988f9db49-rh67h" event={"ID":"cb45968a-c7f9-4ede-8d6c-23e7d3864d18","Type":"ContainerDied","Data":"fd835b451c3214d64df33a5b045d98a79a2aa8cca8bb6a96e1f4f568daf42f0f"} Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.912458 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7988f9db49-rh67h" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.912504 4971 scope.go:117] "RemoveContainer" containerID="802dcbd2235bd41503738695a42acade2e1d1c6934b9c6cc4756fb2ce23b08ec" Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.914838 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9064c65d-26e4-4d53-a1c1-b88f932b76be","Type":"ContainerStarted","Data":"8236e5e475f8e2bc30f0fba0ae0ebb7515e2b7b499601e1b123b49fde8584edc"} Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.917070 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" event={"ID":"fb1c6902-1390-44a8-8307-8036b02ce9c7","Type":"ContainerStarted","Data":"756377c9e0c8de1cb9e21125b8e6c1fd30e0982ec21e18ae72e88dc0dd0eabbf"} Mar 20 07:11:05 crc kubenswrapper[4971]: I0320 07:11:05.937556 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" podStartSLOduration=2.9375370910000003 podStartE2EDuration="2.937537091s" podCreationTimestamp="2026-03-20 07:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:05.936370341 +0000 UTC m=+1287.916244479" watchObservedRunningTime="2026-03-20 07:11:05.937537091 +0000 UTC m=+1287.917411229" Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.058114 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-rh67h"] Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.065525 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7988f9db49-rh67h"] Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.180646 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 
07:11:06.180683 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.249073 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.742705 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb45968a-c7f9-4ede-8d6c-23e7d3864d18" path="/var/lib/kubelet/pods/cb45968a-c7f9-4ede-8d6c-23e7d3864d18/volumes" Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.925653 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9064c65d-26e4-4d53-a1c1-b88f932b76be","Type":"ContainerStarted","Data":"78ce6f75fdf4b8134a001b00c9380d6a9b0e6b28e02c6c557c6a2c9097b03dab"} Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.925703 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9064c65d-26e4-4d53-a1c1-b88f932b76be","Type":"ContainerStarted","Data":"3ca1eeff736f7dc8000796f67df497ff16c309981beedb9789a0e4a334c2f180"} Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.926252 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.929205 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" Mar 20 07:11:06 crc kubenswrapper[4971]: I0320 07:11:06.972018 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.62479628 podStartE2EDuration="2.971995255s" podCreationTimestamp="2026-03-20 07:11:04 +0000 UTC" firstStartedPulling="2026-03-20 07:11:05.039282481 +0000 UTC m=+1287.019156619" lastFinishedPulling="2026-03-20 07:11:06.386481456 +0000 UTC m=+1288.366355594" observedRunningTime="2026-03-20 07:11:06.970066415 +0000 UTC 
m=+1288.949940553" watchObservedRunningTime="2026-03-20 07:11:06.971995255 +0000 UTC m=+1288.951869393" Mar 20 07:11:07 crc kubenswrapper[4971]: I0320 07:11:07.113517 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 07:11:07 crc kubenswrapper[4971]: I0320 07:11:07.505591 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 07:11:07 crc kubenswrapper[4971]: I0320 07:11:07.505736 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 07:11:07 crc kubenswrapper[4971]: I0320 07:11:07.612557 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.064604 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.888338 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d3ee-account-create-update-xpzjm"] Mar 20 07:11:08 crc kubenswrapper[4971]: E0320 07:11:08.889134 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb45968a-c7f9-4ede-8d6c-23e7d3864d18" containerName="init" Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.889158 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb45968a-c7f9-4ede-8d6c-23e7d3864d18" containerName="init" Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.889365 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb45968a-c7f9-4ede-8d6c-23e7d3864d18" containerName="init" Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.890088 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.892585 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.920610 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d3ee-account-create-update-xpzjm"] Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.930097 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8tvlb"] Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.931170 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:08 crc kubenswrapper[4971]: I0320 07:11:08.962839 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8tvlb"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.052665 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h98rj\" (UniqueName: \"kubernetes.io/projected/a2fa4dc2-d625-4071-b0bc-07099f6303ab-kube-api-access-h98rj\") pod \"keystone-d3ee-account-create-update-xpzjm\" (UID: \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\") " pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.052748 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpzr5\" (UniqueName: \"kubernetes.io/projected/10b994b9-b9a4-49de-8faf-a3c7b64035dc-kube-api-access-hpzr5\") pod \"keystone-db-create-8tvlb\" (UID: \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\") " pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.052782 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a2fa4dc2-d625-4071-b0bc-07099f6303ab-operator-scripts\") pod \"keystone-d3ee-account-create-update-xpzjm\" (UID: \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\") " pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.053482 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10b994b9-b9a4-49de-8faf-a3c7b64035dc-operator-scripts\") pod \"keystone-db-create-8tvlb\" (UID: \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\") " pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.112901 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-08bb-account-create-update-hw4s7"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.114395 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.117336 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.122917 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rtj87"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.124544 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rtj87" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.131700 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-08bb-account-create-update-hw4s7"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.154555 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10b994b9-b9a4-49de-8faf-a3c7b64035dc-operator-scripts\") pod \"keystone-db-create-8tvlb\" (UID: \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\") " pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.154647 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h98rj\" (UniqueName: \"kubernetes.io/projected/a2fa4dc2-d625-4071-b0bc-07099f6303ab-kube-api-access-h98rj\") pod \"keystone-d3ee-account-create-update-xpzjm\" (UID: \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\") " pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.154722 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzr5\" (UniqueName: \"kubernetes.io/projected/10b994b9-b9a4-49de-8faf-a3c7b64035dc-kube-api-access-hpzr5\") pod \"keystone-db-create-8tvlb\" (UID: \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\") " pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.154762 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa4dc2-d625-4071-b0bc-07099f6303ab-operator-scripts\") pod \"keystone-d3ee-account-create-update-xpzjm\" (UID: \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\") " pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.155425 4971 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10b994b9-b9a4-49de-8faf-a3c7b64035dc-operator-scripts\") pod \"keystone-db-create-8tvlb\" (UID: \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\") " pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.157571 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa4dc2-d625-4071-b0bc-07099f6303ab-operator-scripts\") pod \"keystone-d3ee-account-create-update-xpzjm\" (UID: \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\") " pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.159002 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rtj87"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.177767 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpzr5\" (UniqueName: \"kubernetes.io/projected/10b994b9-b9a4-49de-8faf-a3c7b64035dc-kube-api-access-hpzr5\") pod \"keystone-db-create-8tvlb\" (UID: \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\") " pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.181029 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h98rj\" (UniqueName: \"kubernetes.io/projected/a2fa4dc2-d625-4071-b0bc-07099f6303ab-kube-api-access-h98rj\") pod \"keystone-d3ee-account-create-update-xpzjm\" (UID: \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\") " pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.226743 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.256652 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5jqw\" (UniqueName: \"kubernetes.io/projected/a4d13e0d-08bb-495b-abfc-6edcc8f01890-kube-api-access-p5jqw\") pod \"placement-db-create-rtj87\" (UID: \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\") " pod="openstack/placement-db-create-rtj87" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.256832 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-operator-scripts\") pod \"placement-08bb-account-create-update-hw4s7\" (UID: \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\") " pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.256863 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d13e0d-08bb-495b-abfc-6edcc8f01890-operator-scripts\") pod \"placement-db-create-rtj87\" (UID: \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\") " pod="openstack/placement-db-create-rtj87" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.256905 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm7pr\" (UniqueName: \"kubernetes.io/projected/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-kube-api-access-vm7pr\") pod \"placement-08bb-account-create-update-hw4s7\" (UID: \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\") " pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.257233 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.358663 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-operator-scripts\") pod \"placement-08bb-account-create-update-hw4s7\" (UID: \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\") " pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.358991 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d13e0d-08bb-495b-abfc-6edcc8f01890-operator-scripts\") pod \"placement-db-create-rtj87\" (UID: \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\") " pod="openstack/placement-db-create-rtj87" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.359049 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm7pr\" (UniqueName: \"kubernetes.io/projected/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-kube-api-access-vm7pr\") pod \"placement-08bb-account-create-update-hw4s7\" (UID: \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\") " pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.359093 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jqw\" (UniqueName: \"kubernetes.io/projected/a4d13e0d-08bb-495b-abfc-6edcc8f01890-kube-api-access-p5jqw\") pod \"placement-db-create-rtj87\" (UID: \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\") " pod="openstack/placement-db-create-rtj87" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.359446 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-operator-scripts\") pod 
\"placement-08bb-account-create-update-hw4s7\" (UID: \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\") " pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.359714 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d13e0d-08bb-495b-abfc-6edcc8f01890-operator-scripts\") pod \"placement-db-create-rtj87\" (UID: \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\") " pod="openstack/placement-db-create-rtj87" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.380153 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm7pr\" (UniqueName: \"kubernetes.io/projected/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-kube-api-access-vm7pr\") pod \"placement-08bb-account-create-update-hw4s7\" (UID: \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\") " pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.380748 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jqw\" (UniqueName: \"kubernetes.io/projected/a4d13e0d-08bb-495b-abfc-6edcc8f01890-kube-api-access-p5jqw\") pod \"placement-db-create-rtj87\" (UID: \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\") " pod="openstack/placement-db-create-rtj87" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.441444 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.461511 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rtj87" Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.731150 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d3ee-account-create-update-xpzjm"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.825509 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8tvlb"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.925935 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-08bb-account-create-update-hw4s7"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.934679 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r597g"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.934874 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" podUID="fb1c6902-1390-44a8-8307-8036b02ce9c7" containerName="dnsmasq-dns" containerID="cri-o://756377c9e0c8de1cb9e21125b8e6c1fd30e0982ec21e18ae72e88dc0dd0eabbf" gracePeriod=10 Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.966682 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-bwqjt"] Mar 20 07:11:09 crc kubenswrapper[4971]: I0320 07:11:09.967961 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.019879 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-bwqjt"] Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.020354 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08bb-account-create-update-hw4s7" event={"ID":"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e","Type":"ContainerStarted","Data":"f0783d9e8bacce37a99cbc680f722dc0d9e179924dfe96b9aa5de64cda903356"} Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.021093 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.024121 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d3ee-account-create-update-xpzjm" event={"ID":"a2fa4dc2-d625-4071-b0bc-07099f6303ab","Type":"ContainerStarted","Data":"5a166564e8fdf2f38e415299d7a0c3db6880c284ec0eec405d7c2f41bc3396f3"} Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.031141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8tvlb" event={"ID":"10b994b9-b9a4-49de-8faf-a3c7b64035dc","Type":"ContainerStarted","Data":"77e19da2aa31b3c79ca8058f54ecdb8b48416523ab9ceb894626473cc4f1428e"} Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.057223 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rtj87"] Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.077765 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.080301 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.090094 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.090147 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-config\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.090254 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdcmf\" (UniqueName: \"kubernetes.io/projected/1f687e76-a919-4ed1-921b-ab9a7085eb30-kube-api-access-gdcmf\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.191507 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.191597 
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.191668 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.191691 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-config\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.191769 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdcmf\" (UniqueName: \"kubernetes.io/projected/1f687e76-a919-4ed1-921b-ab9a7085eb30-kube-api-access-gdcmf\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.192861 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.194718 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.194812 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-config\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.195363 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.211449 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdcmf\" (UniqueName: \"kubernetes.io/projected/1f687e76-a919-4ed1-921b-ab9a7085eb30-kube-api-access-gdcmf\") pod \"dnsmasq-dns-7b9fd7d84c-bwqjt\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.289886 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:11:10 crc kubenswrapper[4971]: I0320 07:11:10.741028 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-bwqjt"] Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.042630 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rtj87" event={"ID":"a4d13e0d-08bb-495b-abfc-6edcc8f01890","Type":"ContainerStarted","Data":"c02f43c9643f9c9e0b575d74e2e6eed7ebf8326ec8f02b02b051362dfbeecced"} Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.044596 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" event={"ID":"1f687e76-a919-4ed1-921b-ab9a7085eb30","Type":"ContainerStarted","Data":"18869052bd6862d80c0a1aa87ea65abe8532eb2b9f15ecd5793500413305c2f3"} Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.047037 4971 generic.go:334] "Generic (PLEG): container finished" podID="fb1c6902-1390-44a8-8307-8036b02ce9c7" containerID="756377c9e0c8de1cb9e21125b8e6c1fd30e0982ec21e18ae72e88dc0dd0eabbf" exitCode=0 Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.047078 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" event={"ID":"fb1c6902-1390-44a8-8307-8036b02ce9c7","Type":"ContainerDied","Data":"756377c9e0c8de1cb9e21125b8e6c1fd30e0982ec21e18ae72e88dc0dd0eabbf"} Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.115127 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.121841 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.127741 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-hchrl" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.128315 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.131825 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.134721 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.141211 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.309840 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0608b4-f344-45db-a952-d6bc328083a2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.309954 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d77b\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-kube-api-access-5d77b\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.309994 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-cache\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " 
pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.310018 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.310417 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.310626 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-lock\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.412251 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0608b4-f344-45db-a952-d6bc328083a2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.412381 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d77b\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-kube-api-access-5d77b\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.412429 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-cache\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.412453 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.412524 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.412558 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-lock\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: E0320 07:11:11.412802 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:11:11 crc kubenswrapper[4971]: E0320 07:11:11.412842 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 07:11:11 crc kubenswrapper[4971]: E0320 07:11:11.412914 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. 
No retries permitted until 2026-03-20 07:11:11.9128888 +0000 UTC m=+1293.892762968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : configmap "swift-ring-files" not found Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.413060 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.413256 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-lock\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.413323 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-cache\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.424372 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0608b4-f344-45db-a952-d6bc328083a2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.431923 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d77b\" (UniqueName: 
\"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-kube-api-access-5d77b\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.460457 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: I0320 07:11:11.921809 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:11 crc kubenswrapper[4971]: E0320 07:11:11.922095 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:11:11 crc kubenswrapper[4971]: E0320 07:11:11.922140 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 07:11:11 crc kubenswrapper[4971]: E0320 07:11:11.922287 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:11:12.922259386 +0000 UTC m=+1294.902133564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : configmap "swift-ring-files" not found Mar 20 07:11:12 crc kubenswrapper[4971]: I0320 07:11:12.940596 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:12 crc kubenswrapper[4971]: E0320 07:11:12.940812 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:11:12 crc kubenswrapper[4971]: E0320 07:11:12.941012 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 07:11:12 crc kubenswrapper[4971]: E0320 07:11:12.941071 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:11:14.941053409 +0000 UTC m=+1296.920927547 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : configmap "swift-ring-files" not found Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.071889 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08bb-account-create-update-hw4s7" event={"ID":"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e","Type":"ContainerStarted","Data":"8894eea462d02c27135c00ee973dff94683ebc6860c55336204627ceb1560fa1"} Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.074085 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rtj87" event={"ID":"a4d13e0d-08bb-495b-abfc-6edcc8f01890","Type":"ContainerStarted","Data":"3aa555eddf63e48f674d7432f8d4139f9f04b81f51a084e8ad63dcacfa5ce437"} Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.075825 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" event={"ID":"1f687e76-a919-4ed1-921b-ab9a7085eb30","Type":"ContainerStarted","Data":"5fbf1df8dda99bb2bbd2557d66fd0f7cd73de19f39a470db149709ec3733445a"} Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.077424 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d3ee-account-create-update-xpzjm" event={"ID":"a2fa4dc2-d625-4071-b0bc-07099f6303ab","Type":"ContainerStarted","Data":"0d9196d5fa1e06eb300a897a2efc8f4155967e4bbf140160338ab7fea0655449"} Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.081031 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8tvlb" event={"ID":"10b994b9-b9a4-49de-8faf-a3c7b64035dc","Type":"ContainerStarted","Data":"f88854cdba235abc6a679d085e2ddae83d1dcc94556a4f55f7597dc55a163760"} Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.095913 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-d3ee-account-create-update-xpzjm" podStartSLOduration=5.095891924 podStartE2EDuration="5.095891924s" podCreationTimestamp="2026-03-20 07:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:13.093267935 +0000 UTC m=+1295.073142083" watchObservedRunningTime="2026-03-20 07:11:13.095891924 +0000 UTC m=+1295.075766082" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.129515 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-bl874"] Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.132857 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bl874" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.149364 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0b9a-account-create-update-66sh8"] Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.154024 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.156631 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.168870 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bl874"] Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.185811 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0b9a-account-create-update-66sh8"] Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.248640 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxqh4\" (UniqueName: \"kubernetes.io/projected/1f2bca3a-2949-4c33-8716-28a6e38b84ef-kube-api-access-mxqh4\") pod \"glance-db-create-bl874\" (UID: \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\") " pod="openstack/glance-db-create-bl874" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.248712 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f2bca3a-2949-4c33-8716-28a6e38b84ef-operator-scripts\") pod \"glance-db-create-bl874\" (UID: \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\") " pod="openstack/glance-db-create-bl874" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.248761 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kz5\" (UniqueName: \"kubernetes.io/projected/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-kube-api-access-j6kz5\") pod \"glance-0b9a-account-create-update-66sh8\" (UID: \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\") " pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.248926 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-operator-scripts\") pod \"glance-0b9a-account-create-update-66sh8\" (UID: \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\") " pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.351161 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-operator-scripts\") pod \"glance-0b9a-account-create-update-66sh8\" (UID: \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\") " pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.351515 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxqh4\" (UniqueName: \"kubernetes.io/projected/1f2bca3a-2949-4c33-8716-28a6e38b84ef-kube-api-access-mxqh4\") pod \"glance-db-create-bl874\" (UID: \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\") " pod="openstack/glance-db-create-bl874" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.351554 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f2bca3a-2949-4c33-8716-28a6e38b84ef-operator-scripts\") pod \"glance-db-create-bl874\" (UID: \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\") " pod="openstack/glance-db-create-bl874" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.351595 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kz5\" (UniqueName: \"kubernetes.io/projected/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-kube-api-access-j6kz5\") pod \"glance-0b9a-account-create-update-66sh8\" (UID: \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\") " pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.352254 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-operator-scripts\") pod \"glance-0b9a-account-create-update-66sh8\" (UID: \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\") " pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.352672 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f2bca3a-2949-4c33-8716-28a6e38b84ef-operator-scripts\") pod \"glance-db-create-bl874\" (UID: \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\") " pod="openstack/glance-db-create-bl874" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.370368 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kz5\" (UniqueName: \"kubernetes.io/projected/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-kube-api-access-j6kz5\") pod \"glance-0b9a-account-create-update-66sh8\" (UID: \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\") " pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.373309 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxqh4\" (UniqueName: \"kubernetes.io/projected/1f2bca3a-2949-4c33-8716-28a6e38b84ef-kube-api-access-mxqh4\") pod \"glance-db-create-bl874\" (UID: \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\") " pod="openstack/glance-db-create-bl874" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.466072 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bl874" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.475892 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.758889 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r597g"
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.900738 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bl874"]
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.913848 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-nb\") pod \"fb1c6902-1390-44a8-8307-8036b02ce9c7\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") "
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.913889 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-config\") pod \"fb1c6902-1390-44a8-8307-8036b02ce9c7\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") "
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.913967 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxlvq\" (UniqueName: \"kubernetes.io/projected/fb1c6902-1390-44a8-8307-8036b02ce9c7-kube-api-access-pxlvq\") pod \"fb1c6902-1390-44a8-8307-8036b02ce9c7\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") "
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.914048 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-dns-svc\") pod \"fb1c6902-1390-44a8-8307-8036b02ce9c7\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") "
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.914169 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-sb\") pod \"fb1c6902-1390-44a8-8307-8036b02ce9c7\" (UID: \"fb1c6902-1390-44a8-8307-8036b02ce9c7\") "
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.921916 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1c6902-1390-44a8-8307-8036b02ce9c7-kube-api-access-pxlvq" (OuterVolumeSpecName: "kube-api-access-pxlvq") pod "fb1c6902-1390-44a8-8307-8036b02ce9c7" (UID: "fb1c6902-1390-44a8-8307-8036b02ce9c7"). InnerVolumeSpecName "kube-api-access-pxlvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.956682 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-config" (OuterVolumeSpecName: "config") pod "fb1c6902-1390-44a8-8307-8036b02ce9c7" (UID: "fb1c6902-1390-44a8-8307-8036b02ce9c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.961782 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb1c6902-1390-44a8-8307-8036b02ce9c7" (UID: "fb1c6902-1390-44a8-8307-8036b02ce9c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.974433 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb1c6902-1390-44a8-8307-8036b02ce9c7" (UID: "fb1c6902-1390-44a8-8307-8036b02ce9c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.977522 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0b9a-account-create-update-66sh8"]
Mar 20 07:11:13 crc kubenswrapper[4971]: I0320 07:11:13.983065 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb1c6902-1390-44a8-8307-8036b02ce9c7" (UID: "fb1c6902-1390-44a8-8307-8036b02ce9c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:13 crc kubenswrapper[4971]: W0320 07:11:13.987308 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89f7dd3f_532f_4e24_bf20_99aa46f1fdda.slice/crio-802a67fc85c9080a4ec3956554e3e00a77e2f5dbe865c7b2c6e74bf0a3569967 WatchSource:0}: Error finding container 802a67fc85c9080a4ec3956554e3e00a77e2f5dbe865c7b2c6e74bf0a3569967: Status 404 returned error can't find the container with id 802a67fc85c9080a4ec3956554e3e00a77e2f5dbe865c7b2c6e74bf0a3569967
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.016497 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.016533 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.016549 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.016562 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb1c6902-1390-44a8-8307-8036b02ce9c7-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.016575 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxlvq\" (UniqueName: \"kubernetes.io/projected/fb1c6902-1390-44a8-8307-8036b02ce9c7-kube-api-access-pxlvq\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.096845 4971 generic.go:334] "Generic (PLEG): container finished" podID="10b994b9-b9a4-49de-8faf-a3c7b64035dc" containerID="f88854cdba235abc6a679d085e2ddae83d1dcc94556a4f55f7597dc55a163760" exitCode=0
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.097927 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8tvlb" event={"ID":"10b994b9-b9a4-49de-8faf-a3c7b64035dc","Type":"ContainerDied","Data":"f88854cdba235abc6a679d085e2ddae83d1dcc94556a4f55f7597dc55a163760"}
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.115007 4971 generic.go:334] "Generic (PLEG): container finished" podID="543e9b24-4327-4fcf-8d2c-9316d2cc7b8e" containerID="8894eea462d02c27135c00ee973dff94683ebc6860c55336204627ceb1560fa1" exitCode=0
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.115077 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08bb-account-create-update-hw4s7" event={"ID":"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e","Type":"ContainerDied","Data":"8894eea462d02c27135c00ee973dff94683ebc6860c55336204627ceb1560fa1"}
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.128864 4971 generic.go:334] "Generic (PLEG): container finished" podID="a4d13e0d-08bb-495b-abfc-6edcc8f01890" containerID="3aa555eddf63e48f674d7432f8d4139f9f04b81f51a084e8ad63dcacfa5ce437" exitCode=0
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.128977 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rtj87" event={"ID":"a4d13e0d-08bb-495b-abfc-6edcc8f01890","Type":"ContainerDied","Data":"3aa555eddf63e48f674d7432f8d4139f9f04b81f51a084e8ad63dcacfa5ce437"}
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.131879 4971 generic.go:334] "Generic (PLEG): container finished" podID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerID="5fbf1df8dda99bb2bbd2557d66fd0f7cd73de19f39a470db149709ec3733445a" exitCode=0
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.131939 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" event={"ID":"1f687e76-a919-4ed1-921b-ab9a7085eb30","Type":"ContainerDied","Data":"5fbf1df8dda99bb2bbd2557d66fd0f7cd73de19f39a470db149709ec3733445a"}
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.145593 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0b9a-account-create-update-66sh8" event={"ID":"89f7dd3f-532f-4e24-bf20-99aa46f1fdda","Type":"ContainerStarted","Data":"802a67fc85c9080a4ec3956554e3e00a77e2f5dbe865c7b2c6e74bf0a3569967"}
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.165225 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bl874" event={"ID":"1f2bca3a-2949-4c33-8716-28a6e38b84ef","Type":"ContainerStarted","Data":"3c8e49c6ca6129d2a34d745100b0193abdef178721851aacef6610ac4d3d47db"}
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.167438 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-r597g" event={"ID":"fb1c6902-1390-44a8-8307-8036b02ce9c7","Type":"ContainerDied","Data":"8c41a9be5ab7880824349d22b882f86e54a5a39bc318cd8ba465da9c76a494ba"}
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.167481 4971 scope.go:117] "RemoveContainer" containerID="756377c9e0c8de1cb9e21125b8e6c1fd30e0982ec21e18ae72e88dc0dd0eabbf"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.168587 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-r597g"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.186866 4971 generic.go:334] "Generic (PLEG): container finished" podID="a2fa4dc2-d625-4071-b0bc-07099f6303ab" containerID="0d9196d5fa1e06eb300a897a2efc8f4155967e4bbf140160338ab7fea0655449" exitCode=0
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.186918 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d3ee-account-create-update-xpzjm" event={"ID":"a2fa4dc2-d625-4071-b0bc-07099f6303ab","Type":"ContainerDied","Data":"0d9196d5fa1e06eb300a897a2efc8f4155967e4bbf140160338ab7fea0655449"}
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.209669 4971 scope.go:117] "RemoveContainer" containerID="b28d10f5cd406d72f7c52a3cb0b885fbc68bc235675df5d982bd9fda13f141a5"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.263916 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r597g"]
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.274107 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-r597g"]
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.752238 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1c6902-1390-44a8-8307-8036b02ce9c7" path="/var/lib/kubelet/pods/fb1c6902-1390-44a8-8307-8036b02ce9c7/volumes"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.776981 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dhlfs"]
Mar 20 07:11:14 crc kubenswrapper[4971]: E0320 07:11:14.777562 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1c6902-1390-44a8-8307-8036b02ce9c7" containerName="init"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.777593 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1c6902-1390-44a8-8307-8036b02ce9c7" containerName="init"
Mar 20 07:11:14 crc kubenswrapper[4971]: E0320 07:11:14.777651 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1c6902-1390-44a8-8307-8036b02ce9c7" containerName="dnsmasq-dns"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.777668 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1c6902-1390-44a8-8307-8036b02ce9c7" containerName="dnsmasq-dns"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.778000 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1c6902-1390-44a8-8307-8036b02ce9c7" containerName="dnsmasq-dns"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.778842 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.782204 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.789345 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dhlfs"]
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.831189 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkkgk\" (UniqueName: \"kubernetes.io/projected/2ac19710-56df-48ab-a605-b331b6b54060-kube-api-access-zkkgk\") pod \"root-account-create-update-dhlfs\" (UID: \"2ac19710-56df-48ab-a605-b331b6b54060\") " pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.831317 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac19710-56df-48ab-a605-b331b6b54060-operator-scripts\") pod \"root-account-create-update-dhlfs\" (UID: \"2ac19710-56df-48ab-a605-b331b6b54060\") " pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.933012 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac19710-56df-48ab-a605-b331b6b54060-operator-scripts\") pod \"root-account-create-update-dhlfs\" (UID: \"2ac19710-56df-48ab-a605-b331b6b54060\") " pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.933155 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkkgk\" (UniqueName: \"kubernetes.io/projected/2ac19710-56df-48ab-a605-b331b6b54060-kube-api-access-zkkgk\") pod \"root-account-create-update-dhlfs\" (UID: \"2ac19710-56df-48ab-a605-b331b6b54060\") " pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.934045 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac19710-56df-48ab-a605-b331b6b54060-operator-scripts\") pod \"root-account-create-update-dhlfs\" (UID: \"2ac19710-56df-48ab-a605-b331b6b54060\") " pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:14 crc kubenswrapper[4971]: I0320 07:11:14.961340 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkkgk\" (UniqueName: \"kubernetes.io/projected/2ac19710-56df-48ab-a605-b331b6b54060-kube-api-access-zkkgk\") pod \"root-account-create-update-dhlfs\" (UID: \"2ac19710-56df-48ab-a605-b331b6b54060\") " pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.020236 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-8qwqb"]
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.021461 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.023588 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.024437 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.027156 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.034590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-dispersionconf\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.034703 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0"
Mar 20 07:11:15 crc kubenswrapper[4971]: E0320 07:11:15.034842 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 07:11:15 crc kubenswrapper[4971]: E0320 07:11:15.034863 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.034884 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c949e71-54d4-41f8-9220-d032cb1eb3e2-etc-swift\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: E0320 07:11:15.034922 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:11:19.034902214 +0000 UTC m=+1301.014776412 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : configmap "swift-ring-files" not found
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.034943 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-ring-data-devices\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.035084 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-scripts\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.035115 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xft\" (UniqueName: \"kubernetes.io/projected/3c949e71-54d4-41f8-9220-d032cb1eb3e2-kube-api-access-s9xft\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.035146 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-swiftconf\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.035200 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-combined-ca-bundle\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.062571 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rjxlr"]
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.063552 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.070979 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8qwqb"]
Mar 20 07:11:15 crc kubenswrapper[4971]: E0320 07:11:15.072302 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-s9xft ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-8qwqb" podUID="3c949e71-54d4-41f8-9220-d032cb1eb3e2"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.103144 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.110051 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rjxlr"]
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136402 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-swiftconf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136449 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xft\" (UniqueName: \"kubernetes.io/projected/3c949e71-54d4-41f8-9220-d032cb1eb3e2-kube-api-access-s9xft\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136472 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-swiftconf\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-combined-ca-bundle\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136516 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-dispersionconf\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136536 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-scripts\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136592 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-ring-data-devices\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136677 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-combined-ca-bundle\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136692 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-dispersionconf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136724 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c949e71-54d4-41f8-9220-d032cb1eb3e2-etc-swift\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136752 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-ring-data-devices\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136780 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06926146-e70b-4ee7-bf24-d629cb17a509-etc-swift\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136803 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx8gf\" (UniqueName: \"kubernetes.io/projected/06926146-e70b-4ee7-bf24-d629cb17a509-kube-api-access-tx8gf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.136884 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-scripts\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.137643 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-scripts\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.138244 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c949e71-54d4-41f8-9220-d032cb1eb3e2-etc-swift\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.138853 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-ring-data-devices\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.139651 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8qwqb"]
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.139788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-swiftconf\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.141252 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-combined-ca-bundle\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.142427 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-dispersionconf\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.152383 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9xft\" (UniqueName: \"kubernetes.io/projected/3c949e71-54d4-41f8-9220-d032cb1eb3e2-kube-api-access-s9xft\") pod \"swift-ring-rebalance-8qwqb\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") " pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.204096 4971 generic.go:334] "Generic (PLEG): container finished" podID="1f2bca3a-2949-4c33-8716-28a6e38b84ef" containerID="dd296b2cff52994971d74ccd25b29d247876a06340250cd1ca3f4c3979f95a89" exitCode=0
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.204331 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bl874" event={"ID":"1f2bca3a-2949-4c33-8716-28a6e38b84ef","Type":"ContainerDied","Data":"dd296b2cff52994971d74ccd25b29d247876a06340250cd1ca3f4c3979f95a89"}
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.207840 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" event={"ID":"1f687e76-a919-4ed1-921b-ab9a7085eb30","Type":"ContainerStarted","Data":"84e37de1dca3b13d7df575e0ba046c0f63b4cad119703ca80a882711711fe434"}
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.208596 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.210634 4971 generic.go:334] "Generic (PLEG): container finished" podID="89f7dd3f-532f-4e24-bf20-99aa46f1fdda" containerID="f00a93dff8f53e5f0a13ded30cfb6e5907ce153f6b5a93a64c0c8fc026b840cc" exitCode=0
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.210691 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.210738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0b9a-account-create-update-66sh8" event={"ID":"89f7dd3f-532f-4e24-bf20-99aa46f1fdda","Type":"ContainerDied","Data":"f00a93dff8f53e5f0a13ded30cfb6e5907ce153f6b5a93a64c0c8fc026b840cc"}
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.227014 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8qwqb"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.242652 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-combined-ca-bundle\") pod \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") "
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.242711 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-scripts\") pod \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") "
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.242750 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9xft\" (UniqueName: \"kubernetes.io/projected/3c949e71-54d4-41f8-9220-d032cb1eb3e2-kube-api-access-s9xft\") pod \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") "
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.242772 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c949e71-54d4-41f8-9220-d032cb1eb3e2-etc-swift\") pod \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") "
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.242800 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-swiftconf\") pod \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") "
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.242845 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-dispersionconf\") pod \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") "
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.242859 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-ring-data-devices\") pod \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\" (UID: \"3c949e71-54d4-41f8-9220-d032cb1eb3e2\") "
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.242995 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-scripts\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.243122 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-ring-data-devices\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.243159 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-combined-ca-bundle\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.243183 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-dispersionconf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.243218 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06926146-e70b-4ee7-bf24-d629cb17a509-etc-swift\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.243238 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx8gf\" (UniqueName: \"kubernetes.io/projected/06926146-e70b-4ee7-bf24-d629cb17a509-kube-api-access-tx8gf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.243267 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-swiftconf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.243924 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c949e71-54d4-41f8-9220-d032cb1eb3e2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3c949e71-54d4-41f8-9220-d032cb1eb3e2" (UID: "3c949e71-54d4-41f8-9220-d032cb1eb3e2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.244232 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-scripts\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.245693 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06926146-e70b-4ee7-bf24-d629cb17a509-etc-swift\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.246098 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-scripts" (OuterVolumeSpecName: "scripts") pod "3c949e71-54d4-41f8-9220-d032cb1eb3e2" (UID: "3c949e71-54d4-41f8-9220-d032cb1eb3e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.246935 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3c949e71-54d4-41f8-9220-d032cb1eb3e2" (UID: "3c949e71-54d4-41f8-9220-d032cb1eb3e2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.247987 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-ring-data-devices\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.249115 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c949e71-54d4-41f8-9220-d032cb1eb3e2-kube-api-access-s9xft" (OuterVolumeSpecName: "kube-api-access-s9xft") pod "3c949e71-54d4-41f8-9220-d032cb1eb3e2" (UID: "3c949e71-54d4-41f8-9220-d032cb1eb3e2"). InnerVolumeSpecName "kube-api-access-s9xft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.253021 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3c949e71-54d4-41f8-9220-d032cb1eb3e2" (UID: "3c949e71-54d4-41f8-9220-d032cb1eb3e2"). InnerVolumeSpecName "swiftconf".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.258128 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-combined-ca-bundle\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.261791 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3c949e71-54d4-41f8-9220-d032cb1eb3e2" (UID: "3c949e71-54d4-41f8-9220-d032cb1eb3e2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.261900 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" podStartSLOduration=6.261882067 podStartE2EDuration="6.261882067s" podCreationTimestamp="2026-03-20 07:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:15.253209509 +0000 UTC m=+1297.233083647" watchObservedRunningTime="2026-03-20 07:11:15.261882067 +0000 UTC m=+1297.241756205" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.261993 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-swiftconf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.268598 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx8gf\" (UniqueName: 
\"kubernetes.io/projected/06926146-e70b-4ee7-bf24-d629cb17a509-kube-api-access-tx8gf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.280463 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c949e71-54d4-41f8-9220-d032cb1eb3e2" (UID: "3c949e71-54d4-41f8-9220-d032cb1eb3e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.287650 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-dispersionconf\") pod \"swift-ring-rebalance-rjxlr\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") " pod="openstack/swift-ring-rebalance-rjxlr" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.344430 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.344462 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.344471 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9xft\" (UniqueName: \"kubernetes.io/projected/3c949e71-54d4-41f8-9220-d032cb1eb3e2-kube-api-access-s9xft\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.344481 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/3c949e71-54d4-41f8-9220-d032cb1eb3e2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.344490 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.344497 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c949e71-54d4-41f8-9220-d032cb1eb3e2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.344505 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c949e71-54d4-41f8-9220-d032cb1eb3e2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.390748 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjxlr" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.707098 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.751285 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa4dc2-d625-4071-b0bc-07099f6303ab-operator-scripts\") pod \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\" (UID: \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\") " Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.751860 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2fa4dc2-d625-4071-b0bc-07099f6303ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2fa4dc2-d625-4071-b0bc-07099f6303ab" (UID: "a2fa4dc2-d625-4071-b0bc-07099f6303ab"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.751994 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h98rj\" (UniqueName: \"kubernetes.io/projected/a2fa4dc2-d625-4071-b0bc-07099f6303ab-kube-api-access-h98rj\") pod \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\" (UID: \"a2fa4dc2-d625-4071-b0bc-07099f6303ab\") " Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.752453 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2fa4dc2-d625-4071-b0bc-07099f6303ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.757420 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2fa4dc2-d625-4071-b0bc-07099f6303ab-kube-api-access-h98rj" (OuterVolumeSpecName: "kube-api-access-h98rj") pod "a2fa4dc2-d625-4071-b0bc-07099f6303ab" (UID: "a2fa4dc2-d625-4071-b0bc-07099f6303ab"). InnerVolumeSpecName "kube-api-access-h98rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.858756 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h98rj\" (UniqueName: \"kubernetes.io/projected/a2fa4dc2-d625-4071-b0bc-07099f6303ab-kube-api-access-h98rj\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.859561 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.879554 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rtj87" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.905704 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.959639 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm7pr\" (UniqueName: \"kubernetes.io/projected/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-kube-api-access-vm7pr\") pod \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\" (UID: \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\") " Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.959681 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5jqw\" (UniqueName: \"kubernetes.io/projected/a4d13e0d-08bb-495b-abfc-6edcc8f01890-kube-api-access-p5jqw\") pod \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\" (UID: \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\") " Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.959777 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-operator-scripts\") pod \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\" (UID: \"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e\") " Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.959821 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10b994b9-b9a4-49de-8faf-a3c7b64035dc-operator-scripts\") pod \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\" (UID: \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\") " Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.959872 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpzr5\" (UniqueName: \"kubernetes.io/projected/10b994b9-b9a4-49de-8faf-a3c7b64035dc-kube-api-access-hpzr5\") pod \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\" (UID: \"10b994b9-b9a4-49de-8faf-a3c7b64035dc\") " Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.959933 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d13e0d-08bb-495b-abfc-6edcc8f01890-operator-scripts\") pod \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\" (UID: \"a4d13e0d-08bb-495b-abfc-6edcc8f01890\") " Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.961231 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4d13e0d-08bb-495b-abfc-6edcc8f01890-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4d13e0d-08bb-495b-abfc-6edcc8f01890" (UID: "a4d13e0d-08bb-495b-abfc-6edcc8f01890"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.962376 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b994b9-b9a4-49de-8faf-a3c7b64035dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10b994b9-b9a4-49de-8faf-a3c7b64035dc" (UID: "10b994b9-b9a4-49de-8faf-a3c7b64035dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.962963 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "543e9b24-4327-4fcf-8d2c-9316d2cc7b8e" (UID: "543e9b24-4327-4fcf-8d2c-9316d2cc7b8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.963835 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d13e0d-08bb-495b-abfc-6edcc8f01890-kube-api-access-p5jqw" (OuterVolumeSpecName: "kube-api-access-p5jqw") pod "a4d13e0d-08bb-495b-abfc-6edcc8f01890" (UID: "a4d13e0d-08bb-495b-abfc-6edcc8f01890"). 
InnerVolumeSpecName "kube-api-access-p5jqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.964341 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b994b9-b9a4-49de-8faf-a3c7b64035dc-kube-api-access-hpzr5" (OuterVolumeSpecName: "kube-api-access-hpzr5") pod "10b994b9-b9a4-49de-8faf-a3c7b64035dc" (UID: "10b994b9-b9a4-49de-8faf-a3c7b64035dc"). InnerVolumeSpecName "kube-api-access-hpzr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.965542 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-kube-api-access-vm7pr" (OuterVolumeSpecName: "kube-api-access-vm7pr") pod "543e9b24-4327-4fcf-8d2c-9316d2cc7b8e" (UID: "543e9b24-4327-4fcf-8d2c-9316d2cc7b8e"). InnerVolumeSpecName "kube-api-access-vm7pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:15 crc kubenswrapper[4971]: I0320 07:11:15.969470 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rjxlr"] Mar 20 07:11:16 crc kubenswrapper[4971]: W0320 07:11:16.035425 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ac19710_56df_48ab_a605_b331b6b54060.slice/crio-c7541e8da35deb8c6fef2017d1796ccfb7f3d56b9f30ea01af2c12a9d8bbda13 WatchSource:0}: Error finding container c7541e8da35deb8c6fef2017d1796ccfb7f3d56b9f30ea01af2c12a9d8bbda13: Status 404 returned error can't find the container with id c7541e8da35deb8c6fef2017d1796ccfb7f3d56b9f30ea01af2c12a9d8bbda13 Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.038564 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dhlfs"] Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.061329 4971 reconciler_common.go:293] "Volume 
detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d13e0d-08bb-495b-abfc-6edcc8f01890-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.061356 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm7pr\" (UniqueName: \"kubernetes.io/projected/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-kube-api-access-vm7pr\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.061367 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5jqw\" (UniqueName: \"kubernetes.io/projected/a4d13e0d-08bb-495b-abfc-6edcc8f01890-kube-api-access-p5jqw\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.061376 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.061385 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10b994b9-b9a4-49de-8faf-a3c7b64035dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.061395 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpzr5\" (UniqueName: \"kubernetes.io/projected/10b994b9-b9a4-49de-8faf-a3c7b64035dc-kube-api-access-hpzr5\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.218301 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjxlr" event={"ID":"06926146-e70b-4ee7-bf24-d629cb17a509","Type":"ContainerStarted","Data":"cad6e11bd72d13826a63832296fe8296dd88c98e8f45162baa6393084aab62e0"} Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.220101 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-d3ee-account-create-update-xpzjm" event={"ID":"a2fa4dc2-d625-4071-b0bc-07099f6303ab","Type":"ContainerDied","Data":"5a166564e8fdf2f38e415299d7a0c3db6880c284ec0eec405d7c2f41bc3396f3"} Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.220127 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3ee-account-create-update-xpzjm" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.220137 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a166564e8fdf2f38e415299d7a0c3db6880c284ec0eec405d7c2f41bc3396f3" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.222286 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8tvlb" event={"ID":"10b994b9-b9a4-49de-8faf-a3c7b64035dc","Type":"ContainerDied","Data":"77e19da2aa31b3c79ca8058f54ecdb8b48416523ab9ceb894626473cc4f1428e"} Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.222306 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77e19da2aa31b3c79ca8058f54ecdb8b48416523ab9ceb894626473cc4f1428e" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.222364 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8tvlb" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.226244 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dhlfs" event={"ID":"2ac19710-56df-48ab-a605-b331b6b54060","Type":"ContainerStarted","Data":"37616ec9a22928fa47fed1046ff25a2bedd0b51b9300cfcaadbe24306840515b"} Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.226290 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dhlfs" event={"ID":"2ac19710-56df-48ab-a605-b331b6b54060","Type":"ContainerStarted","Data":"c7541e8da35deb8c6fef2017d1796ccfb7f3d56b9f30ea01af2c12a9d8bbda13"} Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.228317 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08bb-account-create-update-hw4s7" event={"ID":"543e9b24-4327-4fcf-8d2c-9316d2cc7b8e","Type":"ContainerDied","Data":"f0783d9e8bacce37a99cbc680f722dc0d9e179924dfe96b9aa5de64cda903356"} Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.228342 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0783d9e8bacce37a99cbc680f722dc0d9e179924dfe96b9aa5de64cda903356" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.228343 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08bb-account-create-update-hw4s7" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.230263 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rtj87" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.238105 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rtj87" event={"ID":"a4d13e0d-08bb-495b-abfc-6edcc8f01890","Type":"ContainerDied","Data":"c02f43c9643f9c9e0b575d74e2e6eed7ebf8326ec8f02b02b051362dfbeecced"} Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.238140 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c02f43c9643f9c9e0b575d74e2e6eed7ebf8326ec8f02b02b051362dfbeecced" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.238179 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8qwqb" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.251490 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-dhlfs" podStartSLOduration=2.251471992 podStartE2EDuration="2.251471992s" podCreationTimestamp="2026-03-20 07:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:16.250385403 +0000 UTC m=+1298.230259541" watchObservedRunningTime="2026-03-20 07:11:16.251471992 +0000 UTC m=+1298.231346130" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.356835 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8qwqb"] Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.364967 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-8qwqb"] Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.586683 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-bl874" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.669777 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxqh4\" (UniqueName: \"kubernetes.io/projected/1f2bca3a-2949-4c33-8716-28a6e38b84ef-kube-api-access-mxqh4\") pod \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\" (UID: \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\") " Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.669818 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f2bca3a-2949-4c33-8716-28a6e38b84ef-operator-scripts\") pod \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\" (UID: \"1f2bca3a-2949-4c33-8716-28a6e38b84ef\") " Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.670643 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2bca3a-2949-4c33-8716-28a6e38b84ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f2bca3a-2949-4c33-8716-28a6e38b84ef" (UID: "1f2bca3a-2949-4c33-8716-28a6e38b84ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.675568 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2bca3a-2949-4c33-8716-28a6e38b84ef-kube-api-access-mxqh4" (OuterVolumeSpecName: "kube-api-access-mxqh4") pod "1f2bca3a-2949-4c33-8716-28a6e38b84ef" (UID: "1f2bca3a-2949-4c33-8716-28a6e38b84ef"). InnerVolumeSpecName "kube-api-access-mxqh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.692488 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0b9a-account-create-update-66sh8" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.745918 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c949e71-54d4-41f8-9220-d032cb1eb3e2" path="/var/lib/kubelet/pods/3c949e71-54d4-41f8-9220-d032cb1eb3e2/volumes" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.771946 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-operator-scripts\") pod \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\" (UID: \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\") " Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.772010 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6kz5\" (UniqueName: \"kubernetes.io/projected/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-kube-api-access-j6kz5\") pod \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\" (UID: \"89f7dd3f-532f-4e24-bf20-99aa46f1fdda\") " Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.772570 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89f7dd3f-532f-4e24-bf20-99aa46f1fdda" (UID: "89f7dd3f-532f-4e24-bf20-99aa46f1fdda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.779344 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-kube-api-access-j6kz5" (OuterVolumeSpecName: "kube-api-access-j6kz5") pod "89f7dd3f-532f-4e24-bf20-99aa46f1fdda" (UID: "89f7dd3f-532f-4e24-bf20-99aa46f1fdda"). InnerVolumeSpecName "kube-api-access-j6kz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.787401 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxqh4\" (UniqueName: \"kubernetes.io/projected/1f2bca3a-2949-4c33-8716-28a6e38b84ef-kube-api-access-mxqh4\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.787444 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f2bca3a-2949-4c33-8716-28a6e38b84ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.787455 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:16 crc kubenswrapper[4971]: I0320 07:11:16.787479 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6kz5\" (UniqueName: \"kubernetes.io/projected/89f7dd3f-532f-4e24-bf20-99aa46f1fdda-kube-api-access-j6kz5\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:17 crc kubenswrapper[4971]: I0320 07:11:17.282880 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0b9a-account-create-update-66sh8" event={"ID":"89f7dd3f-532f-4e24-bf20-99aa46f1fdda","Type":"ContainerDied","Data":"802a67fc85c9080a4ec3956554e3e00a77e2f5dbe865c7b2c6e74bf0a3569967"} Mar 20 07:11:17 crc kubenswrapper[4971]: I0320 07:11:17.283659 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="802a67fc85c9080a4ec3956554e3e00a77e2f5dbe865c7b2c6e74bf0a3569967" Mar 20 07:11:17 crc kubenswrapper[4971]: I0320 07:11:17.283803 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0b9a-account-create-update-66sh8"
Mar 20 07:11:17 crc kubenswrapper[4971]: I0320 07:11:17.299415 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bl874"
Mar 20 07:11:17 crc kubenswrapper[4971]: I0320 07:11:17.299635 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bl874" event={"ID":"1f2bca3a-2949-4c33-8716-28a6e38b84ef","Type":"ContainerDied","Data":"3c8e49c6ca6129d2a34d745100b0193abdef178721851aacef6610ac4d3d47db"}
Mar 20 07:11:17 crc kubenswrapper[4971]: I0320 07:11:17.300466 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c8e49c6ca6129d2a34d745100b0193abdef178721851aacef6610ac4d3d47db"
Mar 20 07:11:17 crc kubenswrapper[4971]: I0320 07:11:17.323756 4971 generic.go:334] "Generic (PLEG): container finished" podID="2ac19710-56df-48ab-a605-b331b6b54060" containerID="37616ec9a22928fa47fed1046ff25a2bedd0b51b9300cfcaadbe24306840515b" exitCode=0
Mar 20 07:11:17 crc kubenswrapper[4971]: I0320 07:11:17.324107 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dhlfs" event={"ID":"2ac19710-56df-48ab-a605-b331b6b54060","Type":"ContainerDied","Data":"37616ec9a22928fa47fed1046ff25a2bedd0b51b9300cfcaadbe24306840515b"}
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.202683 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gdzww"]
Mar 20 07:11:18 crc kubenswrapper[4971]: E0320 07:11:18.203320 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f7dd3f-532f-4e24-bf20-99aa46f1fdda" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203333 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f7dd3f-532f-4e24-bf20-99aa46f1fdda" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: E0320 07:11:18.203363 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fa4dc2-d625-4071-b0bc-07099f6303ab" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203369 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fa4dc2-d625-4071-b0bc-07099f6303ab" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: E0320 07:11:18.203381 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2bca3a-2949-4c33-8716-28a6e38b84ef" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203388 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2bca3a-2949-4c33-8716-28a6e38b84ef" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: E0320 07:11:18.203401 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543e9b24-4327-4fcf-8d2c-9316d2cc7b8e" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203407 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="543e9b24-4327-4fcf-8d2c-9316d2cc7b8e" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: E0320 07:11:18.203419 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d13e0d-08bb-495b-abfc-6edcc8f01890" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203425 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d13e0d-08bb-495b-abfc-6edcc8f01890" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: E0320 07:11:18.203433 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b994b9-b9a4-49de-8faf-a3c7b64035dc" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203438 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b994b9-b9a4-49de-8faf-a3c7b64035dc" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203575 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="543e9b24-4327-4fcf-8d2c-9316d2cc7b8e" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203586 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f7dd3f-532f-4e24-bf20-99aa46f1fdda" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203599 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d13e0d-08bb-495b-abfc-6edcc8f01890" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203622 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2fa4dc2-d625-4071-b0bc-07099f6303ab" containerName="mariadb-account-create-update"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203632 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2bca3a-2949-4c33-8716-28a6e38b84ef" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.203644 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b994b9-b9a4-49de-8faf-a3c7b64035dc" containerName="mariadb-database-create"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.204164 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.206314 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.209413 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wdnww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.214244 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gdzww"]
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.337860 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-config-data\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.337914 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kchns\" (UniqueName: \"kubernetes.io/projected/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-kube-api-access-kchns\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.337934 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-combined-ca-bundle\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.338517 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-db-sync-config-data\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.440270 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-config-data\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.440332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kchns\" (UniqueName: \"kubernetes.io/projected/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-kube-api-access-kchns\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.440359 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-combined-ca-bundle\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.440445 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-db-sync-config-data\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.446214 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-db-sync-config-data\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.446657 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-config-data\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.452181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-combined-ca-bundle\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.470597 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kchns\" (UniqueName: \"kubernetes.io/projected/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-kube-api-access-kchns\") pod \"glance-db-sync-gdzww\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:18 crc kubenswrapper[4971]: I0320 07:11:18.529326 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gdzww"
Mar 20 07:11:19 crc kubenswrapper[4971]: I0320 07:11:19.050957 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0"
Mar 20 07:11:19 crc kubenswrapper[4971]: E0320 07:11:19.051213 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 07:11:19 crc kubenswrapper[4971]: E0320 07:11:19.051264 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 07:11:19 crc kubenswrapper[4971]: E0320 07:11:19.051336 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:11:27.05131549 +0000 UTC m=+1309.031189638 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : configmap "swift-ring-files" not found
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.291842 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt"
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.367312 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-lzjrm"]
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.367566 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" podUID="80f15589-52d0-4bb9-972e-82d3860486ec" containerName="dnsmasq-dns" containerID="cri-o://ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3" gracePeriod=10
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.642705 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.693728 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkkgk\" (UniqueName: \"kubernetes.io/projected/2ac19710-56df-48ab-a605-b331b6b54060-kube-api-access-zkkgk\") pod \"2ac19710-56df-48ab-a605-b331b6b54060\" (UID: \"2ac19710-56df-48ab-a605-b331b6b54060\") "
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.693822 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac19710-56df-48ab-a605-b331b6b54060-operator-scripts\") pod \"2ac19710-56df-48ab-a605-b331b6b54060\" (UID: \"2ac19710-56df-48ab-a605-b331b6b54060\") "
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.694834 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac19710-56df-48ab-a605-b331b6b54060-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ac19710-56df-48ab-a605-b331b6b54060" (UID: "2ac19710-56df-48ab-a605-b331b6b54060"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.798525 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ac19710-56df-48ab-a605-b331b6b54060-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.810719 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac19710-56df-48ab-a605-b331b6b54060-kube-api-access-zkkgk" (OuterVolumeSpecName: "kube-api-access-zkkgk") pod "2ac19710-56df-48ab-a605-b331b6b54060" (UID: "2ac19710-56df-48ab-a605-b331b6b54060"). InnerVolumeSpecName "kube-api-access-zkkgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.839418 4971 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6e3a6967-aab7-4dac-abce-8d3189935dc4"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6e3a6967-aab7-4dac-abce-8d3189935dc4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6e3a6967_aab7_4dac_abce_8d3189935dc4.slice"
Mar 20 07:11:20 crc kubenswrapper[4971]: E0320 07:11:20.839734 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod6e3a6967-aab7-4dac-abce-8d3189935dc4] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod6e3a6967-aab7-4dac-abce-8d3189935dc4] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6e3a6967_aab7_4dac_abce_8d3189935dc4.slice" pod="openstack/dnsmasq-dns-64696987c5-k2mdp" podUID="6e3a6967-aab7-4dac-abce-8d3189935dc4"
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.900305 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkkgk\" (UniqueName: \"kubernetes.io/projected/2ac19710-56df-48ab-a605-b331b6b54060-kube-api-access-zkkgk\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.907667 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gdzww"]
Mar 20 07:11:20 crc kubenswrapper[4971]: I0320 07:11:20.996021 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.102914 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-dns-svc\") pod \"80f15589-52d0-4bb9-972e-82d3860486ec\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") "
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.103039 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck7zk\" (UniqueName: \"kubernetes.io/projected/80f15589-52d0-4bb9-972e-82d3860486ec-kube-api-access-ck7zk\") pod \"80f15589-52d0-4bb9-972e-82d3860486ec\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") "
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.103082 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-config\") pod \"80f15589-52d0-4bb9-972e-82d3860486ec\" (UID: \"80f15589-52d0-4bb9-972e-82d3860486ec\") "
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.106698 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f15589-52d0-4bb9-972e-82d3860486ec-kube-api-access-ck7zk" (OuterVolumeSpecName: "kube-api-access-ck7zk") pod "80f15589-52d0-4bb9-972e-82d3860486ec" (UID: "80f15589-52d0-4bb9-972e-82d3860486ec"). InnerVolumeSpecName "kube-api-access-ck7zk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.138252 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80f15589-52d0-4bb9-972e-82d3860486ec" (UID: "80f15589-52d0-4bb9-972e-82d3860486ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.142111 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-config" (OuterVolumeSpecName: "config") pod "80f15589-52d0-4bb9-972e-82d3860486ec" (UID: "80f15589-52d0-4bb9-972e-82d3860486ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.205110 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck7zk\" (UniqueName: \"kubernetes.io/projected/80f15589-52d0-4bb9-972e-82d3860486ec-kube-api-access-ck7zk\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.205142 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.205153 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80f15589-52d0-4bb9-972e-82d3860486ec-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.357808 4971 generic.go:334] "Generic (PLEG): container finished" podID="80f15589-52d0-4bb9-972e-82d3860486ec" containerID="ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3" exitCode=0
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.357875 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" event={"ID":"80f15589-52d0-4bb9-972e-82d3860486ec","Type":"ContainerDied","Data":"ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3"}
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.357909 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm" event={"ID":"80f15589-52d0-4bb9-972e-82d3860486ec","Type":"ContainerDied","Data":"a04b8d554f889b70d3dcdb14e59f76ec18f3a4d897ccdb3a8b72fe56b93ffde7"}
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.357932 4971 scope.go:117] "RemoveContainer" containerID="ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.358003 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-lzjrm"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.360070 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gdzww" event={"ID":"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb","Type":"ContainerStarted","Data":"b7c33c7e24342f665db0e0e7601a0257bbb47b19c3c609b3af55c8da67f096bb"}
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.362032 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjxlr" event={"ID":"06926146-e70b-4ee7-bf24-d629cb17a509","Type":"ContainerStarted","Data":"33e0dc596986e9a8f65bb0952b3f04fc1ffca37a4742dfd20efd67c5686312ae"}
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.364653 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dhlfs" event={"ID":"2ac19710-56df-48ab-a605-b331b6b54060","Type":"ContainerDied","Data":"c7541e8da35deb8c6fef2017d1796ccfb7f3d56b9f30ea01af2c12a9d8bbda13"}
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.364704 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7541e8da35deb8c6fef2017d1796ccfb7f3d56b9f30ea01af2c12a9d8bbda13"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.364705 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dhlfs"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.364667 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-k2mdp"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.387851 4971 scope.go:117] "RemoveContainer" containerID="54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.446035 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rjxlr" podStartSLOduration=1.974219269 podStartE2EDuration="6.446017434s" podCreationTimestamp="2026-03-20 07:11:15 +0000 UTC" firstStartedPulling="2026-03-20 07:11:15.977054209 +0000 UTC m=+1297.956928347" lastFinishedPulling="2026-03-20 07:11:20.448852374 +0000 UTC m=+1302.428726512" observedRunningTime="2026-03-20 07:11:21.41247087 +0000 UTC m=+1303.392345038" watchObservedRunningTime="2026-03-20 07:11:21.446017434 +0000 UTC m=+1303.425891582"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.464834 4971 scope.go:117] "RemoveContainer" containerID="ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3"
Mar 20 07:11:21 crc kubenswrapper[4971]: E0320 07:11:21.465397 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3\": container with ID starting with ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3 not found: ID does not exist" containerID="ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.465440 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3"} err="failed to get container status \"ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3\": rpc error: code = NotFound desc = could not find container \"ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3\": container with ID starting with ecda388b5853d5d8b65f2bfa128cd69d83145637fdbc6e4d2b4f472bc460afa3 not found: ID does not exist"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.465463 4971 scope.go:117] "RemoveContainer" containerID="54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1"
Mar 20 07:11:21 crc kubenswrapper[4971]: E0320 07:11:21.465859 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1\": container with ID starting with 54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1 not found: ID does not exist" containerID="54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.465892 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1"} err="failed to get container status \"54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1\": rpc error: code = NotFound desc = could not find container \"54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1\": container with ID starting with 54488af8cdc5d59cb335ffced7c3b6f148d16bed02277deb8295ef4b0e2886b1 not found: ID does not exist"
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.489562 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-k2mdp"]
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.497160 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-k2mdp"]
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.503409 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-lzjrm"]
Mar 20 07:11:21 crc kubenswrapper[4971]: I0320 07:11:21.517770 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-lzjrm"]
Mar 20 07:11:22 crc kubenswrapper[4971]: I0320 07:11:22.742851 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3a6967-aab7-4dac-abce-8d3189935dc4" path="/var/lib/kubelet/pods/6e3a6967-aab7-4dac-abce-8d3189935dc4/volumes"
Mar 20 07:11:22 crc kubenswrapper[4971]: I0320 07:11:22.744061 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f15589-52d0-4bb9-972e-82d3860486ec" path="/var/lib/kubelet/pods/80f15589-52d0-4bb9-972e-82d3860486ec/volumes"
Mar 20 07:11:23 crc kubenswrapper[4971]: I0320 07:11:23.391631 4971 generic.go:334] "Generic (PLEG): container finished" podID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" containerID="6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1" exitCode=0
Mar 20 07:11:23 crc kubenswrapper[4971]: I0320 07:11:23.391712 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a71f8d2d-729c-4b7c-89c7-a06bd2216978","Type":"ContainerDied","Data":"6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1"}
Mar 20 07:11:23 crc kubenswrapper[4971]: I0320 07:11:23.393914 4971 generic.go:334] "Generic (PLEG): container finished" podID="1023ed0c-3abe-4cff-987f-52544b885696" containerID="8ea2d6fbd32a2d37d300ac896cec5808faa6aa48fc9bb628562475b560b88284" exitCode=0
Mar 20 07:11:23 crc kubenswrapper[4971]: I0320 07:11:23.393945 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1023ed0c-3abe-4cff-987f-52544b885696","Type":"ContainerDied","Data":"8ea2d6fbd32a2d37d300ac896cec5808faa6aa48fc9bb628562475b560b88284"}
Mar 20 07:11:24 crc kubenswrapper[4971]: I0320 07:11:24.412410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a71f8d2d-729c-4b7c-89c7-a06bd2216978","Type":"ContainerStarted","Data":"7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac"}
Mar 20 07:11:24 crc kubenswrapper[4971]: I0320 07:11:24.412656 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 20 07:11:24 crc kubenswrapper[4971]: I0320 07:11:24.430416 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1023ed0c-3abe-4cff-987f-52544b885696","Type":"ContainerStarted","Data":"51b4cb77e4da22feec9a5960ca52b1a8182d5f4678ea7a93350ed77aaf3336c3"}
Mar 20 07:11:24 crc kubenswrapper[4971]: I0320 07:11:24.430651 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 07:11:24 crc kubenswrapper[4971]: I0320 07:11:24.446447 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.063068391 podStartE2EDuration="51.446429768s" podCreationTimestamp="2026-03-20 07:10:33 +0000 UTC" firstStartedPulling="2026-03-20 07:10:35.415684604 +0000 UTC m=+1257.395558742" lastFinishedPulling="2026-03-20 07:10:49.799045981 +0000 UTC m=+1271.778920119" observedRunningTime="2026-03-20 07:11:24.445429783 +0000 UTC m=+1306.425303921" watchObservedRunningTime="2026-03-20 07:11:24.446429768 +0000 UTC m=+1306.426303906"
Mar 20 07:11:24 crc kubenswrapper[4971]: I0320 07:11:24.508551 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.29666218 podStartE2EDuration="51.508533234s" podCreationTimestamp="2026-03-20 07:10:33 +0000 UTC" firstStartedPulling="2026-03-20 07:10:35.672063461 +0000 UTC m=+1257.651937599" lastFinishedPulling="2026-03-20 07:10:49.883934515 +0000 UTC m=+1271.863808653" observedRunningTime="2026-03-20 07:11:24.499985484 +0000 UTC m=+1306.479859622" watchObservedRunningTime="2026-03-20 07:11:24.508533234 +0000 UTC m=+1306.488407372"
Mar 20 07:11:24 crc kubenswrapper[4971]: I0320 07:11:24.552192 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 20 07:11:26 crc kubenswrapper[4971]: I0320 07:11:26.197536 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dhlfs"]
Mar 20 07:11:26 crc kubenswrapper[4971]: I0320 07:11:26.204957 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dhlfs"]
Mar 20 07:11:26 crc kubenswrapper[4971]: I0320 07:11:26.742068 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac19710-56df-48ab-a605-b331b6b54060" path="/var/lib/kubelet/pods/2ac19710-56df-48ab-a605-b331b6b54060/volumes"
Mar 20 07:11:27 crc kubenswrapper[4971]: I0320 07:11:27.113211 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0"
Mar 20 07:11:27 crc kubenswrapper[4971]: E0320 07:11:27.113441 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 07:11:27 crc kubenswrapper[4971]: E0320 07:11:27.113747 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 07:11:27 crc kubenswrapper[4971]: E0320 07:11:27.113892 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:11:43.113870907 +0000 UTC m=+1325.093745065 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : configmap "swift-ring-files" not found
Mar 20 07:11:27 crc kubenswrapper[4971]: I0320 07:11:27.452984 4971 generic.go:334] "Generic (PLEG): container finished" podID="06926146-e70b-4ee7-bf24-d629cb17a509" containerID="33e0dc596986e9a8f65bb0952b3f04fc1ffca37a4742dfd20efd67c5686312ae" exitCode=0
Mar 20 07:11:27 crc kubenswrapper[4971]: I0320 07:11:27.453062 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjxlr" event={"ID":"06926146-e70b-4ee7-bf24-d629cb17a509","Type":"ContainerDied","Data":"33e0dc596986e9a8f65bb0952b3f04fc1ffca37a4742dfd20efd67c5686312ae"}
Mar 20 07:11:27 crc kubenswrapper[4971]: I0320 07:11:27.894653 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pfhnr" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 07:11:27 crc kubenswrapper[4971]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 07:11:27 crc kubenswrapper[4971]: >
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.786461 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjxlr"
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.942015 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-swiftconf\") pod \"06926146-e70b-4ee7-bf24-d629cb17a509\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") "
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.942074 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx8gf\" (UniqueName: \"kubernetes.io/projected/06926146-e70b-4ee7-bf24-d629cb17a509-kube-api-access-tx8gf\") pod \"06926146-e70b-4ee7-bf24-d629cb17a509\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") "
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.942151 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-ring-data-devices\") pod \"06926146-e70b-4ee7-bf24-d629cb17a509\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") "
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.942190 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-dispersionconf\") pod \"06926146-e70b-4ee7-bf24-d629cb17a509\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") "
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.942239 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-scripts\") pod \"06926146-e70b-4ee7-bf24-d629cb17a509\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") "
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.942290 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-combined-ca-bundle\") pod \"06926146-e70b-4ee7-bf24-d629cb17a509\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") "
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.942446 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06926146-e70b-4ee7-bf24-d629cb17a509-etc-swift\") pod \"06926146-e70b-4ee7-bf24-d629cb17a509\" (UID: \"06926146-e70b-4ee7-bf24-d629cb17a509\") "
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.943106 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "06926146-e70b-4ee7-bf24-d629cb17a509" (UID: "06926146-e70b-4ee7-bf24-d629cb17a509"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.944306 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06926146-e70b-4ee7-bf24-d629cb17a509-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "06926146-e70b-4ee7-bf24-d629cb17a509" (UID: "06926146-e70b-4ee7-bf24-d629cb17a509"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.947480 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06926146-e70b-4ee7-bf24-d629cb17a509-kube-api-access-tx8gf" (OuterVolumeSpecName: "kube-api-access-tx8gf") pod "06926146-e70b-4ee7-bf24-d629cb17a509" (UID: "06926146-e70b-4ee7-bf24-d629cb17a509"). InnerVolumeSpecName "kube-api-access-tx8gf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.962425 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-scripts" (OuterVolumeSpecName: "scripts") pod "06926146-e70b-4ee7-bf24-d629cb17a509" (UID: "06926146-e70b-4ee7-bf24-d629cb17a509"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.969078 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "06926146-e70b-4ee7-bf24-d629cb17a509" (UID: "06926146-e70b-4ee7-bf24-d629cb17a509"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.973272 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06926146-e70b-4ee7-bf24-d629cb17a509" (UID: "06926146-e70b-4ee7-bf24-d629cb17a509"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:11:28 crc kubenswrapper[4971]: I0320 07:11:28.981088 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "06926146-e70b-4ee7-bf24-d629cb17a509" (UID: "06926146-e70b-4ee7-bf24-d629cb17a509"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.044744 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.044911 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.044996 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06926146-e70b-4ee7-bf24-d629cb17a509-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.045117 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.045218 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06926146-e70b-4ee7-bf24-d629cb17a509-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.045410 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06926146-e70b-4ee7-bf24-d629cb17a509-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.045519 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx8gf\" (UniqueName: \"kubernetes.io/projected/06926146-e70b-4ee7-bf24-d629cb17a509-kube-api-access-tx8gf\") on node \"crc\" DevicePath \"\""
Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.473206 4971 util.go:48] "No
ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjxlr" Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.473188 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjxlr" event={"ID":"06926146-e70b-4ee7-bf24-d629cb17a509","Type":"ContainerDied","Data":"cad6e11bd72d13826a63832296fe8296dd88c98e8f45162baa6393084aab62e0"} Mar 20 07:11:29 crc kubenswrapper[4971]: I0320 07:11:29.474119 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cad6e11bd72d13826a63832296fe8296dd88c98e8f45162baa6393084aab62e0" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.194251 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7dwd5"] Mar 20 07:11:31 crc kubenswrapper[4971]: E0320 07:11:31.195222 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f15589-52d0-4bb9-972e-82d3860486ec" containerName="dnsmasq-dns" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.195245 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f15589-52d0-4bb9-972e-82d3860486ec" containerName="dnsmasq-dns" Mar 20 07:11:31 crc kubenswrapper[4971]: E0320 07:11:31.195269 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac19710-56df-48ab-a605-b331b6b54060" containerName="mariadb-account-create-update" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.195281 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac19710-56df-48ab-a605-b331b6b54060" containerName="mariadb-account-create-update" Mar 20 07:11:31 crc kubenswrapper[4971]: E0320 07:11:31.195305 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06926146-e70b-4ee7-bf24-d629cb17a509" containerName="swift-ring-rebalance" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.195315 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="06926146-e70b-4ee7-bf24-d629cb17a509" 
containerName="swift-ring-rebalance" Mar 20 07:11:31 crc kubenswrapper[4971]: E0320 07:11:31.195340 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f15589-52d0-4bb9-972e-82d3860486ec" containerName="init" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.195349 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f15589-52d0-4bb9-972e-82d3860486ec" containerName="init" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.195556 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="06926146-e70b-4ee7-bf24-d629cb17a509" containerName="swift-ring-rebalance" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.195571 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f15589-52d0-4bb9-972e-82d3860486ec" containerName="dnsmasq-dns" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.195595 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac19710-56df-48ab-a605-b331b6b54060" containerName="mariadb-account-create-update" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.196346 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.199371 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.207143 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7dwd5"] Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.287339 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzlgj\" (UniqueName: \"kubernetes.io/projected/38bbd968-9965-427c-b176-00754eb9887c-kube-api-access-hzlgj\") pod \"root-account-create-update-7dwd5\" (UID: \"38bbd968-9965-427c-b176-00754eb9887c\") " pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.287527 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bbd968-9965-427c-b176-00754eb9887c-operator-scripts\") pod \"root-account-create-update-7dwd5\" (UID: \"38bbd968-9965-427c-b176-00754eb9887c\") " pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.389767 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzlgj\" (UniqueName: \"kubernetes.io/projected/38bbd968-9965-427c-b176-00754eb9887c-kube-api-access-hzlgj\") pod \"root-account-create-update-7dwd5\" (UID: \"38bbd968-9965-427c-b176-00754eb9887c\") " pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.389847 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bbd968-9965-427c-b176-00754eb9887c-operator-scripts\") pod \"root-account-create-update-7dwd5\" (UID: 
\"38bbd968-9965-427c-b176-00754eb9887c\") " pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.390860 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bbd968-9965-427c-b176-00754eb9887c-operator-scripts\") pod \"root-account-create-update-7dwd5\" (UID: \"38bbd968-9965-427c-b176-00754eb9887c\") " pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.413019 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzlgj\" (UniqueName: \"kubernetes.io/projected/38bbd968-9965-427c-b176-00754eb9887c-kube-api-access-hzlgj\") pod \"root-account-create-update-7dwd5\" (UID: \"38bbd968-9965-427c-b176-00754eb9887c\") " pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:31 crc kubenswrapper[4971]: I0320 07:11:31.518386 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:32 crc kubenswrapper[4971]: I0320 07:11:32.789321 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pfhnr" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerName="ovn-controller" probeResult="failure" output=< Mar 20 07:11:32 crc kubenswrapper[4971]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 07:11:32 crc kubenswrapper[4971]: > Mar 20 07:11:32 crc kubenswrapper[4971]: I0320 07:11:32.835980 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:11:32 crc kubenswrapper[4971]: I0320 07:11:32.847221 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.079085 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pfhnr-config-dtxp2"] Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.080279 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.085553 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.103756 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pfhnr-config-dtxp2"] Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.243156 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.243222 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-scripts\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.243304 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run-ovn\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.243325 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-additional-scripts\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: 
\"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.243343 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldx55\" (UniqueName: \"kubernetes.io/projected/55fc97fc-3c36-4df5-9be0-e405144122b8-kube-api-access-ldx55\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.243591 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-log-ovn\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346136 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run-ovn\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346193 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-additional-scripts\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346224 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldx55\" (UniqueName: 
\"kubernetes.io/projected/55fc97fc-3c36-4df5-9be0-e405144122b8-kube-api-access-ldx55\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346267 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-log-ovn\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346393 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-scripts\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346414 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346505 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run-ovn\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346570 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.346594 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-log-ovn\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.347109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-additional-scripts\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.348702 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-scripts\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.370354 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldx55\" (UniqueName: \"kubernetes.io/projected/55fc97fc-3c36-4df5-9be0-e405144122b8-kube-api-access-ldx55\") pod \"ovn-controller-pfhnr-config-dtxp2\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:33 crc kubenswrapper[4971]: I0320 07:11:33.398598 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:34 crc kubenswrapper[4971]: I0320 07:11:34.780955 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.062735 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-v9s7n"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.067582 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.081047 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v9s7n"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.177497 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.178359 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4da9-account-create-update-297t8"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.179369 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.182222 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.183286 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9gn6\" (UniqueName: \"kubernetes.io/projected/b44d8fdd-a462-4146-9a5e-6b84864cd490-kube-api-access-g9gn6\") pod \"cinder-db-create-v9s7n\" (UID: \"b44d8fdd-a462-4146-9a5e-6b84864cd490\") " pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.183321 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b44d8fdd-a462-4146-9a5e-6b84864cd490-operator-scripts\") pod \"cinder-db-create-v9s7n\" (UID: \"b44d8fdd-a462-4146-9a5e-6b84864cd490\") " pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.204442 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4da9-account-create-update-297t8"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.258581 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-x6b4m"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.259575 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.285097 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-x6b4m"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.285282 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e77b03c-9fbb-4f84-a85a-fd7771301436-operator-scripts\") pod \"cinder-4da9-account-create-update-297t8\" (UID: \"9e77b03c-9fbb-4f84-a85a-fd7771301436\") " pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.285803 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkxs\" (UniqueName: \"kubernetes.io/projected/9e77b03c-9fbb-4f84-a85a-fd7771301436-kube-api-access-crkxs\") pod \"cinder-4da9-account-create-update-297t8\" (UID: \"9e77b03c-9fbb-4f84-a85a-fd7771301436\") " pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.285924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9gn6\" (UniqueName: \"kubernetes.io/projected/b44d8fdd-a462-4146-9a5e-6b84864cd490-kube-api-access-g9gn6\") pod \"cinder-db-create-v9s7n\" (UID: \"b44d8fdd-a462-4146-9a5e-6b84864cd490\") " pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.286008 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b44d8fdd-a462-4146-9a5e-6b84864cd490-operator-scripts\") pod \"cinder-db-create-v9s7n\" (UID: \"b44d8fdd-a462-4146-9a5e-6b84864cd490\") " pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.290734 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b44d8fdd-a462-4146-9a5e-6b84864cd490-operator-scripts\") pod \"cinder-db-create-v9s7n\" (UID: \"b44d8fdd-a462-4146-9a5e-6b84864cd490\") " pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.337037 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9gn6\" (UniqueName: \"kubernetes.io/projected/b44d8fdd-a462-4146-9a5e-6b84864cd490-kube-api-access-g9gn6\") pod \"cinder-db-create-v9s7n\" (UID: \"b44d8fdd-a462-4146-9a5e-6b84864cd490\") " pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.388662 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgcqg\" (UniqueName: \"kubernetes.io/projected/b1ae2ef8-20ef-4f86-958e-212c18ae1701-kube-api-access-vgcqg\") pod \"barbican-db-create-x6b4m\" (UID: \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\") " pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.388736 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crkxs\" (UniqueName: \"kubernetes.io/projected/9e77b03c-9fbb-4f84-a85a-fd7771301436-kube-api-access-crkxs\") pod \"cinder-4da9-account-create-update-297t8\" (UID: \"9e77b03c-9fbb-4f84-a85a-fd7771301436\") " pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.388788 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ae2ef8-20ef-4f86-958e-212c18ae1701-operator-scripts\") pod \"barbican-db-create-x6b4m\" (UID: \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\") " pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.388867 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e77b03c-9fbb-4f84-a85a-fd7771301436-operator-scripts\") pod \"cinder-4da9-account-create-update-297t8\" (UID: \"9e77b03c-9fbb-4f84-a85a-fd7771301436\") " pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.389055 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.389524 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e77b03c-9fbb-4f84-a85a-fd7771301436-operator-scripts\") pod \"cinder-4da9-account-create-update-297t8\" (UID: \"9e77b03c-9fbb-4f84-a85a-fd7771301436\") " pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.406067 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkxs\" (UniqueName: \"kubernetes.io/projected/9e77b03c-9fbb-4f84-a85a-fd7771301436-kube-api-access-crkxs\") pod \"cinder-4da9-account-create-update-297t8\" (UID: \"9e77b03c-9fbb-4f84-a85a-fd7771301436\") " pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.472511 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-f5ndp"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.473462 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.489766 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ae2ef8-20ef-4f86-958e-212c18ae1701-operator-scripts\") pod \"barbican-db-create-x6b4m\" (UID: \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\") " pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.489886 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcqg\" (UniqueName: \"kubernetes.io/projected/b1ae2ef8-20ef-4f86-958e-212c18ae1701-kube-api-access-vgcqg\") pod \"barbican-db-create-x6b4m\" (UID: \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\") " pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.490686 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ae2ef8-20ef-4f86-958e-212c18ae1701-operator-scripts\") pod \"barbican-db-create-x6b4m\" (UID: \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\") " pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.499036 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.505074 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2wsw6"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.506063 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.514372 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgcqg\" (UniqueName: \"kubernetes.io/projected/b1ae2ef8-20ef-4f86-958e-212c18ae1701-kube-api-access-vgcqg\") pod \"barbican-db-create-x6b4m\" (UID: \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\") " pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.517158 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vfsj2" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.517392 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.517499 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.517716 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.521582 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2836-account-create-update-mzz6z"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.522788 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.531868 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.539624 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-f5ndp"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.572413 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2wsw6"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.577964 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.591027 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-config-data\") pod \"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.591077 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-combined-ca-bundle\") pod \"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.591099 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d83042-4484-4d62-8417-1e7fcebc4220-operator-scripts\") pod \"barbican-2836-account-create-update-mzz6z\" (UID: \"30d83042-4484-4d62-8417-1e7fcebc4220\") " pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:35 crc 
kubenswrapper[4971]: I0320 07:11:35.591149 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv6pg\" (UniqueName: \"kubernetes.io/projected/e3dec926-79a5-4d96-917c-380ac85e7e38-kube-api-access-kv6pg\") pod \"neutron-db-create-f5ndp\" (UID: \"e3dec926-79a5-4d96-917c-380ac85e7e38\") " pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.591176 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpz5b\" (UniqueName: \"kubernetes.io/projected/41c644a9-8310-4775-bf77-55e4cf46a907-kube-api-access-bpz5b\") pod \"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.591201 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3dec926-79a5-4d96-917c-380ac85e7e38-operator-scripts\") pod \"neutron-db-create-f5ndp\" (UID: \"e3dec926-79a5-4d96-917c-380ac85e7e38\") " pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.591232 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kzwb\" (UniqueName: \"kubernetes.io/projected/30d83042-4484-4d62-8417-1e7fcebc4220-kube-api-access-9kzwb\") pod \"barbican-2836-account-create-update-mzz6z\" (UID: \"30d83042-4484-4d62-8417-1e7fcebc4220\") " pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.592734 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2836-account-create-update-mzz6z"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.693183 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-config-data\") pod \"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.693251 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-combined-ca-bundle\") pod \"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.693276 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d83042-4484-4d62-8417-1e7fcebc4220-operator-scripts\") pod \"barbican-2836-account-create-update-mzz6z\" (UID: \"30d83042-4484-4d62-8417-1e7fcebc4220\") " pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.693348 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv6pg\" (UniqueName: \"kubernetes.io/projected/e3dec926-79a5-4d96-917c-380ac85e7e38-kube-api-access-kv6pg\") pod \"neutron-db-create-f5ndp\" (UID: \"e3dec926-79a5-4d96-917c-380ac85e7e38\") " pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.693383 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpz5b\" (UniqueName: \"kubernetes.io/projected/41c644a9-8310-4775-bf77-55e4cf46a907-kube-api-access-bpz5b\") pod \"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.693413 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e3dec926-79a5-4d96-917c-380ac85e7e38-operator-scripts\") pod \"neutron-db-create-f5ndp\" (UID: \"e3dec926-79a5-4d96-917c-380ac85e7e38\") " pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.693451 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kzwb\" (UniqueName: \"kubernetes.io/projected/30d83042-4484-4d62-8417-1e7fcebc4220-kube-api-access-9kzwb\") pod \"barbican-2836-account-create-update-mzz6z\" (UID: \"30d83042-4484-4d62-8417-1e7fcebc4220\") " pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.694177 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d83042-4484-4d62-8417-1e7fcebc4220-operator-scripts\") pod \"barbican-2836-account-create-update-mzz6z\" (UID: \"30d83042-4484-4d62-8417-1e7fcebc4220\") " pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.694737 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3dec926-79a5-4d96-917c-380ac85e7e38-operator-scripts\") pod \"neutron-db-create-f5ndp\" (UID: \"e3dec926-79a5-4d96-917c-380ac85e7e38\") " pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.700319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-config-data\") pod \"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.700800 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-combined-ca-bundle\") pod \"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.721277 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-60a3-account-create-update-5zftq"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.722372 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.723509 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kzwb\" (UniqueName: \"kubernetes.io/projected/30d83042-4484-4d62-8417-1e7fcebc4220-kube-api-access-9kzwb\") pod \"barbican-2836-account-create-update-mzz6z\" (UID: \"30d83042-4484-4d62-8417-1e7fcebc4220\") " pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.727912 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.731794 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv6pg\" (UniqueName: \"kubernetes.io/projected/e3dec926-79a5-4d96-917c-380ac85e7e38-kube-api-access-kv6pg\") pod \"neutron-db-create-f5ndp\" (UID: \"e3dec926-79a5-4d96-917c-380ac85e7e38\") " pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.732643 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-60a3-account-create-update-5zftq"] Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.741260 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpz5b\" (UniqueName: \"kubernetes.io/projected/41c644a9-8310-4775-bf77-55e4cf46a907-kube-api-access-bpz5b\") pod 
\"keystone-db-sync-2wsw6\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.799416 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.800485 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgzhj\" (UniqueName: \"kubernetes.io/projected/72464f1f-f227-4801-8ad0-6a81aaba7081-kube-api-access-jgzhj\") pod \"neutron-60a3-account-create-update-5zftq\" (UID: \"72464f1f-f227-4801-8ad0-6a81aaba7081\") " pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.800668 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72464f1f-f227-4801-8ad0-6a81aaba7081-operator-scripts\") pod \"neutron-60a3-account-create-update-5zftq\" (UID: \"72464f1f-f227-4801-8ad0-6a81aaba7081\") " pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.864729 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.879051 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.902573 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgzhj\" (UniqueName: \"kubernetes.io/projected/72464f1f-f227-4801-8ad0-6a81aaba7081-kube-api-access-jgzhj\") pod \"neutron-60a3-account-create-update-5zftq\" (UID: \"72464f1f-f227-4801-8ad0-6a81aaba7081\") " pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.902691 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72464f1f-f227-4801-8ad0-6a81aaba7081-operator-scripts\") pod \"neutron-60a3-account-create-update-5zftq\" (UID: \"72464f1f-f227-4801-8ad0-6a81aaba7081\") " pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.903824 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72464f1f-f227-4801-8ad0-6a81aaba7081-operator-scripts\") pod \"neutron-60a3-account-create-update-5zftq\" (UID: \"72464f1f-f227-4801-8ad0-6a81aaba7081\") " pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:35 crc kubenswrapper[4971]: I0320 07:11:35.929353 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgzhj\" (UniqueName: \"kubernetes.io/projected/72464f1f-f227-4801-8ad0-6a81aaba7081-kube-api-access-jgzhj\") pod \"neutron-60a3-account-create-update-5zftq\" (UID: \"72464f1f-f227-4801-8ad0-6a81aaba7081\") " pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:36 crc kubenswrapper[4971]: I0320 07:11:36.090544 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:37 crc kubenswrapper[4971]: I0320 07:11:37.874675 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pfhnr" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerName="ovn-controller" probeResult="failure" output=< Mar 20 07:11:37 crc kubenswrapper[4971]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 07:11:37 crc kubenswrapper[4971]: > Mar 20 07:11:37 crc kubenswrapper[4971]: E0320 07:11:37.926792 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.119:57978->38.102.83.119:38499: write tcp 38.102.83.119:57978->38.102.83.119:38499: write: broken pipe Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.438492 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2wsw6"] Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.444681 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7dwd5"] Mar 20 07:11:38 crc kubenswrapper[4971]: W0320 07:11:38.454670 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41c644a9_8310_4775_bf77_55e4cf46a907.slice/crio-d85e02e9f665377611d9e91b94648af83ad8ca719d13192fd75d8f3921ed0f26 WatchSource:0}: Error finding container d85e02e9f665377611d9e91b94648af83ad8ca719d13192fd75d8f3921ed0f26: Status 404 returned error can't find the container with id d85e02e9f665377611d9e91b94648af83ad8ca719d13192fd75d8f3921ed0f26 Mar 20 07:11:38 crc kubenswrapper[4971]: W0320 07:11:38.463849 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38bbd968_9965_427c_b176_00754eb9887c.slice/crio-079c7b150be468b7d5ac95e044fbc1541ec9d46048f68e0f6da78ccac8c2f257 WatchSource:0}: Error finding container 
079c7b150be468b7d5ac95e044fbc1541ec9d46048f68e0f6da78ccac8c2f257: Status 404 returned error can't find the container with id 079c7b150be468b7d5ac95e044fbc1541ec9d46048f68e0f6da78ccac8c2f257 Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.471238 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4da9-account-create-update-297t8"] Mar 20 07:11:38 crc kubenswrapper[4971]: W0320 07:11:38.488223 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e77b03c_9fbb_4f84_a85a_fd7771301436.slice/crio-8f0cc2c0bca3b70f41c1d21fe6ff4a5d7bfe2b44da5d36bbdda128ef5bcd1a19 WatchSource:0}: Error finding container 8f0cc2c0bca3b70f41c1d21fe6ff4a5d7bfe2b44da5d36bbdda128ef5bcd1a19: Status 404 returned error can't find the container with id 8f0cc2c0bca3b70f41c1d21fe6ff4a5d7bfe2b44da5d36bbdda128ef5bcd1a19 Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.584598 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2wsw6" event={"ID":"41c644a9-8310-4775-bf77-55e4cf46a907","Type":"ContainerStarted","Data":"d85e02e9f665377611d9e91b94648af83ad8ca719d13192fd75d8f3921ed0f26"} Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.586653 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7dwd5" event={"ID":"38bbd968-9965-427c-b176-00754eb9887c","Type":"ContainerStarted","Data":"079c7b150be468b7d5ac95e044fbc1541ec9d46048f68e0f6da78ccac8c2f257"} Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.588523 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4da9-account-create-update-297t8" event={"ID":"9e77b03c-9fbb-4f84-a85a-fd7771301436","Type":"ContainerStarted","Data":"8f0cc2c0bca3b70f41c1d21fe6ff4a5d7bfe2b44da5d36bbdda128ef5bcd1a19"} Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.643835 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-2836-account-create-update-mzz6z"] Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.658754 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-x6b4m"] Mar 20 07:11:38 crc kubenswrapper[4971]: W0320 07:11:38.667709 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55fc97fc_3c36_4df5_9be0_e405144122b8.slice/crio-46c144f26fdf201a980d9584c03ad7e616e7ad33c4e289e60a68a970b35d324c WatchSource:0}: Error finding container 46c144f26fdf201a980d9584c03ad7e616e7ad33c4e289e60a68a970b35d324c: Status 404 returned error can't find the container with id 46c144f26fdf201a980d9584c03ad7e616e7ad33c4e289e60a68a970b35d324c Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.673526 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pfhnr-config-dtxp2"] Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.681219 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.687445 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-60a3-account-create-update-5zftq"] Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.689005 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.811631 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v9s7n"] Mar 20 07:11:38 crc kubenswrapper[4971]: I0320 07:11:38.824784 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-f5ndp"] Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.607160 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-60a3-account-create-update-5zftq" 
event={"ID":"72464f1f-f227-4801-8ad0-6a81aaba7081","Type":"ContainerStarted","Data":"42b6d9f32625085d1006654e4e4f12708f6719d8a4afc56ed61979703f1462f1"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.607482 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-60a3-account-create-update-5zftq" event={"ID":"72464f1f-f227-4801-8ad0-6a81aaba7081","Type":"ContainerStarted","Data":"45287fec2437fe70bdae81e93b60c619e33932a03c8654d9568d870086b2dc19"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.611767 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x6b4m" event={"ID":"b1ae2ef8-20ef-4f86-958e-212c18ae1701","Type":"ContainerStarted","Data":"55283060505b357dc023a960a805e270100d975ab8edf53910bac6a09d66c831"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.611830 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x6b4m" event={"ID":"b1ae2ef8-20ef-4f86-958e-212c18ae1701","Type":"ContainerStarted","Data":"508840b3e069165e6a68b73e1b441ad3b432fb900f1d5dbb28e4e06434ad3a9e"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.614205 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f5ndp" event={"ID":"e3dec926-79a5-4d96-917c-380ac85e7e38","Type":"ContainerStarted","Data":"b20124ae1f6d2b7ddda5e25f0cc517f56ab0056bdc786ff56907512f902df9f9"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.614252 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f5ndp" event={"ID":"e3dec926-79a5-4d96-917c-380ac85e7e38","Type":"ContainerStarted","Data":"5af9072e587f7a5446ae2cc4a865e515b854b5b7c7912c07b2f1afb37482b458"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.620334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pfhnr-config-dtxp2" 
event={"ID":"55fc97fc-3c36-4df5-9be0-e405144122b8","Type":"ContainerStarted","Data":"af1598521084a1003841bd322d7ad538ec00eeef1caab95601854478c13ae85f"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.620384 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pfhnr-config-dtxp2" event={"ID":"55fc97fc-3c36-4df5-9be0-e405144122b8","Type":"ContainerStarted","Data":"46c144f26fdf201a980d9584c03ad7e616e7ad33c4e289e60a68a970b35d324c"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.626844 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-60a3-account-create-update-5zftq" podStartSLOduration=4.6268288890000004 podStartE2EDuration="4.626828889s" podCreationTimestamp="2026-03-20 07:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:39.621756134 +0000 UTC m=+1321.601630272" watchObservedRunningTime="2026-03-20 07:11:39.626828889 +0000 UTC m=+1321.606703027" Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.629169 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v9s7n" event={"ID":"b44d8fdd-a462-4146-9a5e-6b84864cd490","Type":"ContainerStarted","Data":"1fa0728e16fb926d9758ef863ed07ffb1e047c3342be2cb6f8294d64076e6fba"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.629221 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v9s7n" event={"ID":"b44d8fdd-a462-4146-9a5e-6b84864cd490","Type":"ContainerStarted","Data":"55047e3567f4c6191df51eb9768d08cdb2202e134162f5cfd9255a1182957654"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.632445 4971 generic.go:334] "Generic (PLEG): container finished" podID="38bbd968-9965-427c-b176-00754eb9887c" containerID="7bd65cb3fc2edb798de3a2ade999c6f405c2215e4ac027cd03552885c229b825" exitCode=0 Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.632548 
4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7dwd5" event={"ID":"38bbd968-9965-427c-b176-00754eb9887c","Type":"ContainerDied","Data":"7bd65cb3fc2edb798de3a2ade999c6f405c2215e4ac027cd03552885c229b825"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.635517 4971 generic.go:334] "Generic (PLEG): container finished" podID="9e77b03c-9fbb-4f84-a85a-fd7771301436" containerID="f38c82bcc1900d8bc2655f6085c5695f555d943d88c400862d379df92953d557" exitCode=0 Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.635576 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4da9-account-create-update-297t8" event={"ID":"9e77b03c-9fbb-4f84-a85a-fd7771301436","Type":"ContainerDied","Data":"f38c82bcc1900d8bc2655f6085c5695f555d943d88c400862d379df92953d557"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.644510 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gdzww" event={"ID":"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb","Type":"ContainerStarted","Data":"9dc2fbe9979ece731bdad20be4ff77fb6c042b3ae61d4a3a925fb081c9c02905"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.644956 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-x6b4m" podStartSLOduration=4.644934774 podStartE2EDuration="4.644934774s" podCreationTimestamp="2026-03-20 07:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:39.63704485 +0000 UTC m=+1321.616918988" watchObservedRunningTime="2026-03-20 07:11:39.644934774 +0000 UTC m=+1321.624808912" Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.647230 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2836-account-create-update-mzz6z" 
event={"ID":"30d83042-4484-4d62-8417-1e7fcebc4220","Type":"ContainerStarted","Data":"1f0d05e9c6b42bd0f6ea81b947a19f9ceb2c1ffb8db5799376283fa131c7af29"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.647268 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2836-account-create-update-mzz6z" event={"ID":"30d83042-4484-4d62-8417-1e7fcebc4220","Type":"ContainerStarted","Data":"c003119988b1fceedd23471ddd4b9249e3037de609e6ea65f84d86a00ba338b5"} Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.658351 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-pfhnr-config-dtxp2" podStartSLOduration=6.658331323 podStartE2EDuration="6.658331323s" podCreationTimestamp="2026-03-20 07:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:39.653788341 +0000 UTC m=+1321.633662489" watchObservedRunningTime="2026-03-20 07:11:39.658331323 +0000 UTC m=+1321.638205461" Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.680304 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-f5ndp" podStartSLOduration=4.6802658919999995 podStartE2EDuration="4.680265892s" podCreationTimestamp="2026-03-20 07:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:39.674242294 +0000 UTC m=+1321.654116432" watchObservedRunningTime="2026-03-20 07:11:39.680265892 +0000 UTC m=+1321.660140030" Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.742298 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-v9s7n" podStartSLOduration=4.7422725660000005 podStartE2EDuration="4.742272566s" podCreationTimestamp="2026-03-20 07:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:39.711432938 +0000 UTC m=+1321.691307076" watchObservedRunningTime="2026-03-20 07:11:39.742272566 +0000 UTC m=+1321.722146704" Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.775955 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gdzww" podStartSLOduration=4.826836147 podStartE2EDuration="21.775923144s" podCreationTimestamp="2026-03-20 07:11:18 +0000 UTC" firstStartedPulling="2026-03-20 07:11:20.91127041 +0000 UTC m=+1302.891144538" lastFinishedPulling="2026-03-20 07:11:37.860357397 +0000 UTC m=+1319.840231535" observedRunningTime="2026-03-20 07:11:39.770630773 +0000 UTC m=+1321.750504911" watchObservedRunningTime="2026-03-20 07:11:39.775923144 +0000 UTC m=+1321.755797282" Mar 20 07:11:39 crc kubenswrapper[4971]: I0320 07:11:39.801782 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-2836-account-create-update-mzz6z" podStartSLOduration=4.801757769 podStartE2EDuration="4.801757769s" podCreationTimestamp="2026-03-20 07:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:39.795686019 +0000 UTC m=+1321.775560167" watchObservedRunningTime="2026-03-20 07:11:39.801757769 +0000 UTC m=+1321.781631907" Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.660745 4971 generic.go:334] "Generic (PLEG): container finished" podID="72464f1f-f227-4801-8ad0-6a81aaba7081" containerID="42b6d9f32625085d1006654e4e4f12708f6719d8a4afc56ed61979703f1462f1" exitCode=0 Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.660874 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-60a3-account-create-update-5zftq" event={"ID":"72464f1f-f227-4801-8ad0-6a81aaba7081","Type":"ContainerDied","Data":"42b6d9f32625085d1006654e4e4f12708f6719d8a4afc56ed61979703f1462f1"} 
Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.663917 4971 generic.go:334] "Generic (PLEG): container finished" podID="30d83042-4484-4d62-8417-1e7fcebc4220" containerID="1f0d05e9c6b42bd0f6ea81b947a19f9ceb2c1ffb8db5799376283fa131c7af29" exitCode=0 Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.664000 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2836-account-create-update-mzz6z" event={"ID":"30d83042-4484-4d62-8417-1e7fcebc4220","Type":"ContainerDied","Data":"1f0d05e9c6b42bd0f6ea81b947a19f9ceb2c1ffb8db5799376283fa131c7af29"} Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.666235 4971 generic.go:334] "Generic (PLEG): container finished" podID="b1ae2ef8-20ef-4f86-958e-212c18ae1701" containerID="55283060505b357dc023a960a805e270100d975ab8edf53910bac6a09d66c831" exitCode=0 Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.666314 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x6b4m" event={"ID":"b1ae2ef8-20ef-4f86-958e-212c18ae1701","Type":"ContainerDied","Data":"55283060505b357dc023a960a805e270100d975ab8edf53910bac6a09d66c831"} Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.668670 4971 generic.go:334] "Generic (PLEG): container finished" podID="55fc97fc-3c36-4df5-9be0-e405144122b8" containerID="af1598521084a1003841bd322d7ad538ec00eeef1caab95601854478c13ae85f" exitCode=0 Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.668809 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pfhnr-config-dtxp2" event={"ID":"55fc97fc-3c36-4df5-9be0-e405144122b8","Type":"ContainerDied","Data":"af1598521084a1003841bd322d7ad538ec00eeef1caab95601854478c13ae85f"} Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.671191 4971 generic.go:334] "Generic (PLEG): container finished" podID="e3dec926-79a5-4d96-917c-380ac85e7e38" containerID="b20124ae1f6d2b7ddda5e25f0cc517f56ab0056bdc786ff56907512f902df9f9" exitCode=0 Mar 20 07:11:40 crc 
kubenswrapper[4971]: I0320 07:11:40.671236 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f5ndp" event={"ID":"e3dec926-79a5-4d96-917c-380ac85e7e38","Type":"ContainerDied","Data":"b20124ae1f6d2b7ddda5e25f0cc517f56ab0056bdc786ff56907512f902df9f9"} Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.673561 4971 generic.go:334] "Generic (PLEG): container finished" podID="b44d8fdd-a462-4146-9a5e-6b84864cd490" containerID="1fa0728e16fb926d9758ef863ed07ffb1e047c3342be2cb6f8294d64076e6fba" exitCode=0 Mar 20 07:11:40 crc kubenswrapper[4971]: I0320 07:11:40.673701 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v9s7n" event={"ID":"b44d8fdd-a462-4146-9a5e-6b84864cd490","Type":"ContainerDied","Data":"1fa0728e16fb926d9758ef863ed07ffb1e047c3342be2cb6f8294d64076e6fba"} Mar 20 07:11:42 crc kubenswrapper[4971]: I0320 07:11:42.802265 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-pfhnr" Mar 20 07:11:43 crc kubenswrapper[4971]: I0320 07:11:43.144379 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:43 crc kubenswrapper[4971]: I0320 07:11:43.169895 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"swift-storage-0\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " pod="openstack/swift-storage-0" Mar 20 07:11:43 crc kubenswrapper[4971]: I0320 07:11:43.253118 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.770263 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2836-account-create-update-mzz6z" event={"ID":"30d83042-4484-4d62-8417-1e7fcebc4220","Type":"ContainerDied","Data":"c003119988b1fceedd23471ddd4b9249e3037de609e6ea65f84d86a00ba338b5"} Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.771160 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c003119988b1fceedd23471ddd4b9249e3037de609e6ea65f84d86a00ba338b5" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.774170 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x6b4m" event={"ID":"b1ae2ef8-20ef-4f86-958e-212c18ae1701","Type":"ContainerDied","Data":"508840b3e069165e6a68b73e1b441ad3b432fb900f1d5dbb28e4e06434ad3a9e"} Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.774208 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="508840b3e069165e6a68b73e1b441ad3b432fb900f1d5dbb28e4e06434ad3a9e" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.780551 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pfhnr-config-dtxp2" event={"ID":"55fc97fc-3c36-4df5-9be0-e405144122b8","Type":"ContainerDied","Data":"46c144f26fdf201a980d9584c03ad7e616e7ad33c4e289e60a68a970b35d324c"} Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.780578 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c144f26fdf201a980d9584c03ad7e616e7ad33c4e289e60a68a970b35d324c" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.782250 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f5ndp" event={"ID":"e3dec926-79a5-4d96-917c-380ac85e7e38","Type":"ContainerDied","Data":"5af9072e587f7a5446ae2cc4a865e515b854b5b7c7912c07b2f1afb37482b458"} Mar 20 07:11:44 crc 
kubenswrapper[4971]: I0320 07:11:44.782270 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5af9072e587f7a5446ae2cc4a865e515b854b5b7c7912c07b2f1afb37482b458" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.783760 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v9s7n" event={"ID":"b44d8fdd-a462-4146-9a5e-6b84864cd490","Type":"ContainerDied","Data":"55047e3567f4c6191df51eb9768d08cdb2202e134162f5cfd9255a1182957654"} Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.783789 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55047e3567f4c6191df51eb9768d08cdb2202e134162f5cfd9255a1182957654" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.785586 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7dwd5" event={"ID":"38bbd968-9965-427c-b176-00754eb9887c","Type":"ContainerDied","Data":"079c7b150be468b7d5ac95e044fbc1541ec9d46048f68e0f6da78ccac8c2f257"} Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.785648 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079c7b150be468b7d5ac95e044fbc1541ec9d46048f68e0f6da78ccac8c2f257" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.787320 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4da9-account-create-update-297t8" event={"ID":"9e77b03c-9fbb-4f84-a85a-fd7771301436","Type":"ContainerDied","Data":"8f0cc2c0bca3b70f41c1d21fe6ff4a5d7bfe2b44da5d36bbdda128ef5bcd1a19"} Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.787341 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f0cc2c0bca3b70f41c1d21fe6ff4a5d7bfe2b44da5d36bbdda128ef5bcd1a19" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.789434 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-60a3-account-create-update-5zftq" 
event={"ID":"72464f1f-f227-4801-8ad0-6a81aaba7081","Type":"ContainerDied","Data":"45287fec2437fe70bdae81e93b60c619e33932a03c8654d9568d870086b2dc19"} Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.789460 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45287fec2437fe70bdae81e93b60c619e33932a03c8654d9568d870086b2dc19" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.840040 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.868343 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.873150 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1ae2ef8-20ef-4f86-958e-212c18ae1701-operator-scripts\") pod \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\" (UID: \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.873261 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72464f1f-f227-4801-8ad0-6a81aaba7081-operator-scripts\") pod \"72464f1f-f227-4801-8ad0-6a81aaba7081\" (UID: \"72464f1f-f227-4801-8ad0-6a81aaba7081\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.873315 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgcqg\" (UniqueName: \"kubernetes.io/projected/b1ae2ef8-20ef-4f86-958e-212c18ae1701-kube-api-access-vgcqg\") pod \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\" (UID: \"b1ae2ef8-20ef-4f86-958e-212c18ae1701\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.873356 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jgzhj\" (UniqueName: \"kubernetes.io/projected/72464f1f-f227-4801-8ad0-6a81aaba7081-kube-api-access-jgzhj\") pod \"72464f1f-f227-4801-8ad0-6a81aaba7081\" (UID: \"72464f1f-f227-4801-8ad0-6a81aaba7081\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.875427 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72464f1f-f227-4801-8ad0-6a81aaba7081-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72464f1f-f227-4801-8ad0-6a81aaba7081" (UID: "72464f1f-f227-4801-8ad0-6a81aaba7081"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.876047 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1ae2ef8-20ef-4f86-958e-212c18ae1701-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1ae2ef8-20ef-4f86-958e-212c18ae1701" (UID: "b1ae2ef8-20ef-4f86-958e-212c18ae1701"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.881374 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ae2ef8-20ef-4f86-958e-212c18ae1701-kube-api-access-vgcqg" (OuterVolumeSpecName: "kube-api-access-vgcqg") pod "b1ae2ef8-20ef-4f86-958e-212c18ae1701" (UID: "b1ae2ef8-20ef-4f86-958e-212c18ae1701"). InnerVolumeSpecName "kube-api-access-vgcqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.881418 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72464f1f-f227-4801-8ad0-6a81aaba7081-kube-api-access-jgzhj" (OuterVolumeSpecName: "kube-api-access-jgzhj") pod "72464f1f-f227-4801-8ad0-6a81aaba7081" (UID: "72464f1f-f227-4801-8ad0-6a81aaba7081"). InnerVolumeSpecName "kube-api-access-jgzhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.887193 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.918039 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.924237 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.940510 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.952828 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.957495 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.975837 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-additional-scripts\") pod \"55fc97fc-3c36-4df5-9be0-e405144122b8\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.975905 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3dec926-79a5-4d96-917c-380ac85e7e38-operator-scripts\") pod \"e3dec926-79a5-4d96-917c-380ac85e7e38\" (UID: \"e3dec926-79a5-4d96-917c-380ac85e7e38\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.975960 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run\") pod \"55fc97fc-3c36-4df5-9be0-e405144122b8\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.975983 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzlgj\" (UniqueName: \"kubernetes.io/projected/38bbd968-9965-427c-b176-00754eb9887c-kube-api-access-hzlgj\") pod \"38bbd968-9965-427c-b176-00754eb9887c\" (UID: \"38bbd968-9965-427c-b176-00754eb9887c\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976011 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run-ovn\") pod \"55fc97fc-3c36-4df5-9be0-e405144122b8\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976074 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d83042-4484-4d62-8417-1e7fcebc4220-operator-scripts\") pod \"30d83042-4484-4d62-8417-1e7fcebc4220\" (UID: \"30d83042-4484-4d62-8417-1e7fcebc4220\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976122 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-log-ovn\") pod \"55fc97fc-3c36-4df5-9be0-e405144122b8\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976162 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bbd968-9965-427c-b176-00754eb9887c-operator-scripts\") pod \"38bbd968-9965-427c-b176-00754eb9887c\" (UID: \"38bbd968-9965-427c-b176-00754eb9887c\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976187 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldx55\" (UniqueName: \"kubernetes.io/projected/55fc97fc-3c36-4df5-9be0-e405144122b8-kube-api-access-ldx55\") pod \"55fc97fc-3c36-4df5-9be0-e405144122b8\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976209 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crkxs\" (UniqueName: \"kubernetes.io/projected/9e77b03c-9fbb-4f84-a85a-fd7771301436-kube-api-access-crkxs\") pod \"9e77b03c-9fbb-4f84-a85a-fd7771301436\" (UID: \"9e77b03c-9fbb-4f84-a85a-fd7771301436\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b44d8fdd-a462-4146-9a5e-6b84864cd490-operator-scripts\") pod \"b44d8fdd-a462-4146-9a5e-6b84864cd490\" (UID: 
\"b44d8fdd-a462-4146-9a5e-6b84864cd490\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976262 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-scripts\") pod \"55fc97fc-3c36-4df5-9be0-e405144122b8\" (UID: \"55fc97fc-3c36-4df5-9be0-e405144122b8\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976284 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e77b03c-9fbb-4f84-a85a-fd7771301436-operator-scripts\") pod \"9e77b03c-9fbb-4f84-a85a-fd7771301436\" (UID: \"9e77b03c-9fbb-4f84-a85a-fd7771301436\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976308 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv6pg\" (UniqueName: \"kubernetes.io/projected/e3dec926-79a5-4d96-917c-380ac85e7e38-kube-api-access-kv6pg\") pod \"e3dec926-79a5-4d96-917c-380ac85e7e38\" (UID: \"e3dec926-79a5-4d96-917c-380ac85e7e38\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976333 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kzwb\" (UniqueName: \"kubernetes.io/projected/30d83042-4484-4d62-8417-1e7fcebc4220-kube-api-access-9kzwb\") pod \"30d83042-4484-4d62-8417-1e7fcebc4220\" (UID: \"30d83042-4484-4d62-8417-1e7fcebc4220\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976380 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9gn6\" (UniqueName: \"kubernetes.io/projected/b44d8fdd-a462-4146-9a5e-6b84864cd490-kube-api-access-g9gn6\") pod \"b44d8fdd-a462-4146-9a5e-6b84864cd490\" (UID: \"b44d8fdd-a462-4146-9a5e-6b84864cd490\") " Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976706 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/b1ae2ef8-20ef-4f86-958e-212c18ae1701-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976719 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72464f1f-f227-4801-8ad0-6a81aaba7081-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976730 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgcqg\" (UniqueName: \"kubernetes.io/projected/b1ae2ef8-20ef-4f86-958e-212c18ae1701-kube-api-access-vgcqg\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.976742 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgzhj\" (UniqueName: \"kubernetes.io/projected/72464f1f-f227-4801-8ad0-6a81aaba7081-kube-api-access-jgzhj\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.977150 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3dec926-79a5-4d96-917c-380ac85e7e38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3dec926-79a5-4d96-917c-380ac85e7e38" (UID: "e3dec926-79a5-4d96-917c-380ac85e7e38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.977159 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bbd968-9965-427c-b176-00754eb9887c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38bbd968-9965-427c-b176-00754eb9887c" (UID: "38bbd968-9965-427c-b176-00754eb9887c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.977285 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "55fc97fc-3c36-4df5-9be0-e405144122b8" (UID: "55fc97fc-3c36-4df5-9be0-e405144122b8"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.977497 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "55fc97fc-3c36-4df5-9be0-e405144122b8" (UID: "55fc97fc-3c36-4df5-9be0-e405144122b8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.977580 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30d83042-4484-4d62-8417-1e7fcebc4220-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30d83042-4484-4d62-8417-1e7fcebc4220" (UID: "30d83042-4484-4d62-8417-1e7fcebc4220"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.977589 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run" (OuterVolumeSpecName: "var-run") pod "55fc97fc-3c36-4df5-9be0-e405144122b8" (UID: "55fc97fc-3c36-4df5-9be0-e405144122b8"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.977755 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "55fc97fc-3c36-4df5-9be0-e405144122b8" (UID: "55fc97fc-3c36-4df5-9be0-e405144122b8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.978122 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-scripts" (OuterVolumeSpecName: "scripts") pod "55fc97fc-3c36-4df5-9be0-e405144122b8" (UID: "55fc97fc-3c36-4df5-9be0-e405144122b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.978910 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b44d8fdd-a462-4146-9a5e-6b84864cd490-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b44d8fdd-a462-4146-9a5e-6b84864cd490" (UID: "b44d8fdd-a462-4146-9a5e-6b84864cd490"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.981078 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e77b03c-9fbb-4f84-a85a-fd7771301436-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e77b03c-9fbb-4f84-a85a-fd7771301436" (UID: "9e77b03c-9fbb-4f84-a85a-fd7771301436"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.981882 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3dec926-79a5-4d96-917c-380ac85e7e38-kube-api-access-kv6pg" (OuterVolumeSpecName: "kube-api-access-kv6pg") pod "e3dec926-79a5-4d96-917c-380ac85e7e38" (UID: "e3dec926-79a5-4d96-917c-380ac85e7e38"). InnerVolumeSpecName "kube-api-access-kv6pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.982075 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b44d8fdd-a462-4146-9a5e-6b84864cd490-kube-api-access-g9gn6" (OuterVolumeSpecName: "kube-api-access-g9gn6") pod "b44d8fdd-a462-4146-9a5e-6b84864cd490" (UID: "b44d8fdd-a462-4146-9a5e-6b84864cd490"). InnerVolumeSpecName "kube-api-access-g9gn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.982240 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bbd968-9965-427c-b176-00754eb9887c-kube-api-access-hzlgj" (OuterVolumeSpecName: "kube-api-access-hzlgj") pod "38bbd968-9965-427c-b176-00754eb9887c" (UID: "38bbd968-9965-427c-b176-00754eb9887c"). InnerVolumeSpecName "kube-api-access-hzlgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.982494 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e77b03c-9fbb-4f84-a85a-fd7771301436-kube-api-access-crkxs" (OuterVolumeSpecName: "kube-api-access-crkxs") pod "9e77b03c-9fbb-4f84-a85a-fd7771301436" (UID: "9e77b03c-9fbb-4f84-a85a-fd7771301436"). InnerVolumeSpecName "kube-api-access-crkxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.984077 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d83042-4484-4d62-8417-1e7fcebc4220-kube-api-access-9kzwb" (OuterVolumeSpecName: "kube-api-access-9kzwb") pod "30d83042-4484-4d62-8417-1e7fcebc4220" (UID: "30d83042-4484-4d62-8417-1e7fcebc4220"). InnerVolumeSpecName "kube-api-access-9kzwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:44 crc kubenswrapper[4971]: I0320 07:11:44.988983 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55fc97fc-3c36-4df5-9be0-e405144122b8-kube-api-access-ldx55" (OuterVolumeSpecName: "kube-api-access-ldx55") pod "55fc97fc-3c36-4df5-9be0-e405144122b8" (UID: "55fc97fc-3c36-4df5-9be0-e405144122b8"). InnerVolumeSpecName "kube-api-access-ldx55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078195 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bbd968-9965-427c-b176-00754eb9887c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078230 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldx55\" (UniqueName: \"kubernetes.io/projected/55fc97fc-3c36-4df5-9be0-e405144122b8-kube-api-access-ldx55\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078242 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crkxs\" (UniqueName: \"kubernetes.io/projected/9e77b03c-9fbb-4f84-a85a-fd7771301436-kube-api-access-crkxs\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078252 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b44d8fdd-a462-4146-9a5e-6b84864cd490-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078263 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078273 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e77b03c-9fbb-4f84-a85a-fd7771301436-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078282 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv6pg\" (UniqueName: \"kubernetes.io/projected/e3dec926-79a5-4d96-917c-380ac85e7e38-kube-api-access-kv6pg\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078290 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kzwb\" (UniqueName: \"kubernetes.io/projected/30d83042-4484-4d62-8417-1e7fcebc4220-kube-api-access-9kzwb\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078298 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9gn6\" (UniqueName: \"kubernetes.io/projected/b44d8fdd-a462-4146-9a5e-6b84864cd490-kube-api-access-g9gn6\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078307 4971 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/55fc97fc-3c36-4df5-9be0-e405144122b8-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078316 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e3dec926-79a5-4d96-917c-380ac85e7e38-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078323 4971 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078332 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzlgj\" (UniqueName: \"kubernetes.io/projected/38bbd968-9965-427c-b176-00754eb9887c-kube-api-access-hzlgj\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078341 4971 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078351 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d83042-4484-4d62-8417-1e7fcebc4220-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.078360 4971 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/55fc97fc-3c36-4df5-9be0-e405144122b8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.198730 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 07:11:45 crc kubenswrapper[4971]: W0320 07:11:45.200728 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-30b6b4cf18c37a2dc05229ec190ae64ac584dd215f06e0f13bdf0928091f2fcc WatchSource:0}: Error finding container 
30b6b4cf18c37a2dc05229ec190ae64ac584dd215f06e0f13bdf0928091f2fcc: Status 404 returned error can't find the container with id 30b6b4cf18c37a2dc05229ec190ae64ac584dd215f06e0f13bdf0928091f2fcc Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.822728 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"30b6b4cf18c37a2dc05229ec190ae64ac584dd215f06e0f13bdf0928091f2fcc"} Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.825883 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v9s7n" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.826992 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a3-account-create-update-5zftq" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.827094 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2wsw6" event={"ID":"41c644a9-8310-4775-bf77-55e4cf46a907","Type":"ContainerStarted","Data":"1b849baaa85bafbb8df6c2a64ec733a62d77ddeafe57154a63e33f841cdc919e"} Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.827167 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-f5ndp" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.827293 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x6b4m" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.827494 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2836-account-create-update-mzz6z" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.827579 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7dwd5" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.827504 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4da9-account-create-update-297t8" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.827756 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pfhnr-config-dtxp2" Mar 20 07:11:45 crc kubenswrapper[4971]: I0320 07:11:45.854588 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2wsw6" podStartSLOduration=4.661748076 podStartE2EDuration="10.854557753s" podCreationTimestamp="2026-03-20 07:11:35 +0000 UTC" firstStartedPulling="2026-03-20 07:11:38.458401837 +0000 UTC m=+1320.438275975" lastFinishedPulling="2026-03-20 07:11:44.651211474 +0000 UTC m=+1326.631085652" observedRunningTime="2026-03-20 07:11:45.843845909 +0000 UTC m=+1327.823720127" watchObservedRunningTime="2026-03-20 07:11:45.854557753 +0000 UTC m=+1327.834431931" Mar 20 07:11:46 crc kubenswrapper[4971]: I0320 07:11:46.126666 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pfhnr-config-dtxp2"] Mar 20 07:11:46 crc kubenswrapper[4971]: I0320 07:11:46.133013 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-pfhnr-config-dtxp2"] Mar 20 07:11:46 crc kubenswrapper[4971]: I0320 07:11:46.745155 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55fc97fc-3c36-4df5-9be0-e405144122b8" path="/var/lib/kubelet/pods/55fc97fc-3c36-4df5-9be0-e405144122b8/volumes" Mar 20 07:11:46 crc kubenswrapper[4971]: I0320 07:11:46.833932 4971 generic.go:334] "Generic (PLEG): container finished" podID="3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" containerID="9dc2fbe9979ece731bdad20be4ff77fb6c042b3ae61d4a3a925fb081c9c02905" exitCode=0 Mar 20 07:11:46 crc kubenswrapper[4971]: I0320 07:11:46.834034 
4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gdzww" event={"ID":"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb","Type":"ContainerDied","Data":"9dc2fbe9979ece731bdad20be4ff77fb6c042b3ae61d4a3a925fb081c9c02905"} Mar 20 07:11:46 crc kubenswrapper[4971]: I0320 07:11:46.837088 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495"} Mar 20 07:11:47 crc kubenswrapper[4971]: I0320 07:11:47.846948 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d"} Mar 20 07:11:47 crc kubenswrapper[4971]: I0320 07:11:47.847279 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028"} Mar 20 07:11:47 crc kubenswrapper[4971]: I0320 07:11:47.847294 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e"} Mar 20 07:11:47 crc kubenswrapper[4971]: I0320 07:11:47.848859 4971 generic.go:334] "Generic (PLEG): container finished" podID="41c644a9-8310-4775-bf77-55e4cf46a907" containerID="1b849baaa85bafbb8df6c2a64ec733a62d77ddeafe57154a63e33f841cdc919e" exitCode=0 Mar 20 07:11:47 crc kubenswrapper[4971]: I0320 07:11:47.848967 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2wsw6" 
event={"ID":"41c644a9-8310-4775-bf77-55e4cf46a907","Type":"ContainerDied","Data":"1b849baaa85bafbb8df6c2a64ec733a62d77ddeafe57154a63e33f841cdc919e"} Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.735103 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gdzww" Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.879956 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca"} Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.880281 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808"} Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.880292 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c"} Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.883742 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gdzww" Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.884051 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gdzww" event={"ID":"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb","Type":"ContainerDied","Data":"b7c33c7e24342f665db0e0e7601a0257bbb47b19c3c609b3af55c8da67f096bb"} Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.884073 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c33c7e24342f665db0e0e7601a0257bbb47b19c3c609b3af55c8da67f096bb" Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.949855 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-config-data\") pod \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.949914 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-combined-ca-bundle\") pod \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.949989 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kchns\" (UniqueName: \"kubernetes.io/projected/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-kube-api-access-kchns\") pod \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\" (UID: \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.950009 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-db-sync-config-data\") pod \"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\" (UID: 
\"3ed6dac6-e0d0-403d-8e7d-6a5d921379fb\") " Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.953716 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" (UID: "3ed6dac6-e0d0-403d-8e7d-6a5d921379fb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:11:48 crc kubenswrapper[4971]: I0320 07:11:48.953719 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-kube-api-access-kchns" (OuterVolumeSpecName: "kube-api-access-kchns") pod "3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" (UID: "3ed6dac6-e0d0-403d-8e7d-6a5d921379fb"). InnerVolumeSpecName "kube-api-access-kchns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.011112 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" (UID: "3ed6dac6-e0d0-403d-8e7d-6a5d921379fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.028086 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-config-data" (OuterVolumeSpecName: "config-data") pod "3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" (UID: "3ed6dac6-e0d0-403d-8e7d-6a5d921379fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.051920 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kchns\" (UniqueName: \"kubernetes.io/projected/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-kube-api-access-kchns\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.051966 4971 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.051982 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.052189 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.224380 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243430 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-jfdhx"] Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243795 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bbd968-9965-427c-b176-00754eb9887c" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243813 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bbd968-9965-427c-b176-00754eb9887c" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243827 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44d8fdd-a462-4146-9a5e-6b84864cd490" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243833 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b44d8fdd-a462-4146-9a5e-6b84864cd490" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243841 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dec926-79a5-4d96-917c-380ac85e7e38" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243848 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dec926-79a5-4d96-917c-380ac85e7e38" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243856 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e77b03c-9fbb-4f84-a85a-fd7771301436" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243862 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e77b03c-9fbb-4f84-a85a-fd7771301436" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243870 4971 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" containerName="glance-db-sync" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243875 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" containerName="glance-db-sync" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243887 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fc97fc-3c36-4df5-9be0-e405144122b8" containerName="ovn-config" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243893 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fc97fc-3c36-4df5-9be0-e405144122b8" containerName="ovn-config" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243906 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ae2ef8-20ef-4f86-958e-212c18ae1701" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243912 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ae2ef8-20ef-4f86-958e-212c18ae1701" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243923 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c644a9-8310-4775-bf77-55e4cf46a907" containerName="keystone-db-sync" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243929 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c644a9-8310-4775-bf77-55e4cf46a907" containerName="keystone-db-sync" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243940 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d83042-4484-4d62-8417-1e7fcebc4220" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243945 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d83042-4484-4d62-8417-1e7fcebc4220" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: E0320 07:11:49.243958 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72464f1f-f227-4801-8ad0-6a81aaba7081" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.243964 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="72464f1f-f227-4801-8ad0-6a81aaba7081" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244124 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fc97fc-3c36-4df5-9be0-e405144122b8" containerName="ovn-config" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244136 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1ae2ef8-20ef-4f86-958e-212c18ae1701" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244147 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c644a9-8310-4775-bf77-55e4cf46a907" containerName="keystone-db-sync" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244156 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="72464f1f-f227-4801-8ad0-6a81aaba7081" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244168 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b44d8fdd-a462-4146-9a5e-6b84864cd490" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244181 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bbd968-9965-427c-b176-00754eb9887c" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244187 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d83042-4484-4d62-8417-1e7fcebc4220" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244195 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e77b03c-9fbb-4f84-a85a-fd7771301436" containerName="mariadb-account-create-update" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244203 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" containerName="glance-db-sync" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.244212 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dec926-79a5-4d96-917c-380ac85e7e38" containerName="mariadb-database-create" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.250522 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.272647 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-jfdhx"] Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.356826 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpz5b\" (UniqueName: \"kubernetes.io/projected/41c644a9-8310-4775-bf77-55e4cf46a907-kube-api-access-bpz5b\") pod \"41c644a9-8310-4775-bf77-55e4cf46a907\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.356957 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-combined-ca-bundle\") pod \"41c644a9-8310-4775-bf77-55e4cf46a907\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.357104 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-config-data\") pod \"41c644a9-8310-4775-bf77-55e4cf46a907\" (UID: \"41c644a9-8310-4775-bf77-55e4cf46a907\") " Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.357442 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sjlp\" (UniqueName: \"kubernetes.io/projected/94ae3519-ba25-4ec1-a361-1594d879b2c1-kube-api-access-4sjlp\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.357477 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.357583 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-dns-svc\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.357659 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-config\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.357702 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 
07:11:49.362935 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c644a9-8310-4775-bf77-55e4cf46a907-kube-api-access-bpz5b" (OuterVolumeSpecName: "kube-api-access-bpz5b") pod "41c644a9-8310-4775-bf77-55e4cf46a907" (UID: "41c644a9-8310-4775-bf77-55e4cf46a907"). InnerVolumeSpecName "kube-api-access-bpz5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.378908 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41c644a9-8310-4775-bf77-55e4cf46a907" (UID: "41c644a9-8310-4775-bf77-55e4cf46a907"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.395749 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-config-data" (OuterVolumeSpecName: "config-data") pod "41c644a9-8310-4775-bf77-55e4cf46a907" (UID: "41c644a9-8310-4775-bf77-55e4cf46a907"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.458847 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-dns-svc\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.458903 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-config\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.458943 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.459000 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sjlp\" (UniqueName: \"kubernetes.io/projected/94ae3519-ba25-4ec1-a361-1594d879b2c1-kube-api-access-4sjlp\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.459019 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc 
kubenswrapper[4971]: I0320 07:11:49.459073 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.459085 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpz5b\" (UniqueName: \"kubernetes.io/projected/41c644a9-8310-4775-bf77-55e4cf46a907-kube-api-access-bpz5b\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.459095 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c644a9-8310-4775-bf77-55e4cf46a907-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.459956 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-config\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.460185 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.460473 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.460501 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-dns-svc\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.475927 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sjlp\" (UniqueName: \"kubernetes.io/projected/94ae3519-ba25-4ec1-a361-1594d879b2c1-kube-api-access-4sjlp\") pod \"dnsmasq-dns-6cd86fcf7-jfdhx\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.568943 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.835690 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-jfdhx"] Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.934858 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" event={"ID":"94ae3519-ba25-4ec1-a361-1594d879b2c1","Type":"ContainerStarted","Data":"868b772447b9d740b0940430bae406fd5bb719a2e06615ef4eb3c8133576299a"} Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.939571 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0"} Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.941502 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2wsw6" event={"ID":"41c644a9-8310-4775-bf77-55e4cf46a907","Type":"ContainerDied","Data":"d85e02e9f665377611d9e91b94648af83ad8ca719d13192fd75d8f3921ed0f26"} Mar 20 07:11:49 crc kubenswrapper[4971]: 
I0320 07:11:49.941545 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d85e02e9f665377611d9e91b94648af83ad8ca719d13192fd75d8f3921ed0f26" Mar 20 07:11:49 crc kubenswrapper[4971]: I0320 07:11:49.941596 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2wsw6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.120678 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-jfdhx"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.149493 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dhbt6"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.150841 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.154206 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.154408 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.154592 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.154727 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vfsj2" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.154890 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.170393 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dhbt6"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.178830 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-scripts\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.178895 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-config-data\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.178956 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-combined-ca-bundle\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.179183 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scqxn\" (UniqueName: \"kubernetes.io/projected/189bc582-c200-4c20-8a6f-2a1446fd11b5-kube-api-access-scqxn\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.179281 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-credential-keys\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.179310 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-fernet-keys\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.235941 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-c4mf7"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.241895 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.284398 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-scripts\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.284536 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-nb\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.284633 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-dns-svc\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.284713 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-config-data\") pod 
\"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.284749 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-config\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.284861 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-combined-ca-bundle\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.284902 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-sb\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.285024 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn587\" (UniqueName: \"kubernetes.io/projected/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-kube-api-access-hn587\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.285051 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scqxn\" (UniqueName: \"kubernetes.io/projected/189bc582-c200-4c20-8a6f-2a1446fd11b5-kube-api-access-scqxn\") 
pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.285126 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-credential-keys\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.285147 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-fernet-keys\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.300908 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-scripts\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.304216 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-fernet-keys\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.307278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-config-data\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc 
kubenswrapper[4971]: I0320 07:11:50.308355 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-c4mf7"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.308940 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-combined-ca-bundle\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.311135 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-credential-keys\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.327652 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scqxn\" (UniqueName: \"kubernetes.io/projected/189bc582-c200-4c20-8a6f-2a1446fd11b5-kube-api-access-scqxn\") pod \"keystone-bootstrap-dhbt6\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.387039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn587\" (UniqueName: \"kubernetes.io/projected/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-kube-api-access-hn587\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.387125 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-nb\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: 
\"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.387155 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-dns-svc\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.387175 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-config\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.387216 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-sb\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.387997 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-sb\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.388748 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-nb\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 
07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.389231 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-dns-svc\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.389998 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-config\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.417755 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn587\" (UniqueName: \"kubernetes.io/projected/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-kube-api-access-hn587\") pod \"dnsmasq-dns-56dddd9f87-c4mf7\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.477895 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-p9zph"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.479406 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.481150 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-k9d2v" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.481305 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.488787 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.490146 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-config-data\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.490204 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-db-sync-config-data\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.490238 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lcr6\" (UniqueName: \"kubernetes.io/projected/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-kube-api-access-2lcr6\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.490260 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-combined-ca-bundle\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.490289 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-etc-machine-id\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.490305 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-scripts\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.506588 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.515054 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.518002 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-c4mf7"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.518795 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.518920 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.526128 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.526410 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.536177 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7vflp"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.537781 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.550025 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qvhrj" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.550225 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.550433 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.579629 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-sp7rz"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.581266 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592099 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-db-sync-config-data\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592282 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lcr6\" (UniqueName: \"kubernetes.io/projected/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-kube-api-access-2lcr6\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592361 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-dns-svc\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592443 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-combined-ca-bundle\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592522 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-etc-machine-id\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 
crc kubenswrapper[4971]: I0320 07:11:50.592620 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592696 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-scripts\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592783 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-config-data\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592855 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-combined-ca-bundle\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.592931 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-config-data\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.593001 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-run-httpd\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.593070 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-scripts\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.593138 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.594500 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-sb\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.594626 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-scripts\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.594849 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c3565c3-ef3e-4d98-9bca-211a538032c9-logs\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.594942 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7x97\" (UniqueName: \"kubernetes.io/projected/7c3565c3-ef3e-4d98-9bca-211a538032c9-kube-api-access-v7x97\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.595075 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-config\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.595156 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-config-data\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.595223 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-log-httpd\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.595290 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-nb\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.595385 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppdz8\" (UniqueName: \"kubernetes.io/projected/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-kube-api-access-ppdz8\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.595504 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.595592 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg752\" (UniqueName: \"kubernetes.io/projected/03a48c07-f949-4376-874a-8d5758b60a9a-kube-api-access-sg752\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.599369 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-etc-machine-id\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.599819 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-db-sync-config-data\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.610975 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-scripts\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.611765 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-combined-ca-bundle\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.619959 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lcr6\" (UniqueName: \"kubernetes.io/projected/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-kube-api-access-2lcr6\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.624644 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7vflp"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.643678 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-sp7rz"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.649074 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-config-data\") pod \"cinder-db-sync-p9zph\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.673076 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-p9zph"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701085 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-sg752\" (UniqueName: \"kubernetes.io/projected/03a48c07-f949-4376-874a-8d5758b60a9a-kube-api-access-sg752\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701178 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-dns-svc\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701214 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701229 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-config-data\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701247 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-combined-ca-bundle\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701267 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-config-data\") pod \"ceilometer-0\" 
(UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701739 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-run-httpd\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701774 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-scripts\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701801 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701818 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-sb\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.703302 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-run-httpd\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.701846 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-scripts\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.703779 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c3565c3-ef3e-4d98-9bca-211a538032c9-logs\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.703850 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7x97\" (UniqueName: \"kubernetes.io/projected/7c3565c3-ef3e-4d98-9bca-211a538032c9-kube-api-access-v7x97\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.704204 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-config\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.704246 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-nb\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.704265 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-log-httpd\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.704305 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppdz8\" (UniqueName: \"kubernetes.io/projected/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-kube-api-access-ppdz8\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.706257 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-dns-svc\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.707062 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-sb\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.707948 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c3565c3-ef3e-4d98-9bca-211a538032c9-logs\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.708086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-config\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " 
pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.708965 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-config-data\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.711588 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-log-httpd\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.712964 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.714007 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-combined-ca-bundle\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.717681 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-nb\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.721253 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ppdz8\" (UniqueName: \"kubernetes.io/projected/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-kube-api-access-ppdz8\") pod \"dnsmasq-dns-55f778bb97-sp7rz\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") " pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.721908 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.722519 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-scripts\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.722577 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-scripts\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.731022 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7x97\" (UniqueName: \"kubernetes.io/projected/7c3565c3-ef3e-4d98-9bca-211a538032c9-kube-api-access-v7x97\") pod \"placement-db-sync-7vflp\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") " pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.735052 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-config-data\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " 
pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.736971 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg752\" (UniqueName: \"kubernetes.io/projected/03a48c07-f949-4376-874a-8d5758b60a9a-kube-api-access-sg752\") pod \"ceilometer-0\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.816273 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.821890 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-p9zph" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.834877 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fqqjk"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.835972 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.838299 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.839114 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.840351 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2rwpf" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.842128 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fqqjk"] Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.864133 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7vflp" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.920763 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-combined-ca-bundle\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.920843 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-db-sync-config-data\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.923218 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbtl\" (UniqueName: \"kubernetes.io/projected/35492989-bd44-45b4-85a7-1acc9569270e-kube-api-access-fbbtl\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.950698 4971 generic.go:334] "Generic (PLEG): container finished" podID="94ae3519-ba25-4ec1-a361-1594d879b2c1" containerID="f1558a3b7c37a5d1491d5f695d239a1bd2492716b053f3dd4832f9dba49646cd" exitCode=0 Mar 20 07:11:50 crc kubenswrapper[4971]: I0320 07:11:50.950759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" event={"ID":"94ae3519-ba25-4ec1-a361-1594d879b2c1","Type":"ContainerDied","Data":"f1558a3b7c37a5d1491d5f695d239a1bd2492716b053f3dd4832f9dba49646cd"} Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.024355 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-db-sync-config-data\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.024414 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbtl\" (UniqueName: \"kubernetes.io/projected/35492989-bd44-45b4-85a7-1acc9569270e-kube-api-access-fbbtl\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.025438 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-combined-ca-bundle\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.031294 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-db-sync-config-data\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.034312 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-combined-ca-bundle\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.040656 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbbtl\" (UniqueName: 
\"kubernetes.io/projected/35492989-bd44-45b4-85a7-1acc9569270e-kube-api-access-fbbtl\") pod \"barbican-db-sync-fqqjk\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") " pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:51 crc kubenswrapper[4971]: W0320 07:11:51.091292 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cea3a55_06b4_4a0e_ad98_416b3bea7d59.slice/crio-1f313862442b3e34e188e0fff87a3674450d19bd8007a97d1105ed8ca2d842a7 WatchSource:0}: Error finding container 1f313862442b3e34e188e0fff87a3674450d19bd8007a97d1105ed8ca2d842a7: Status 404 returned error can't find the container with id 1f313862442b3e34e188e0fff87a3674450d19bd8007a97d1105ed8ca2d842a7 Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.094721 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-c4mf7"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.145072 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-gvx7m"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.146333 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.150221 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-w2rgc" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.150445 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.151176 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.155374 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gvx7m"] Mar 20 07:11:51 crc kubenswrapper[4971]: W0320 07:11:51.181889 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod189bc582_c200_4c20_8a6f_2a1446fd11b5.slice/crio-1d17c2d22af07ab8c7e52c7a70d0acfa8f46d8df2d61f6d5f302dcd9f48d64e3 WatchSource:0}: Error finding container 1d17c2d22af07ab8c7e52c7a70d0acfa8f46d8df2d61f6d5f302dcd9f48d64e3: Status 404 returned error can't find the container with id 1d17c2d22af07ab8c7e52c7a70d0acfa8f46d8df2d61f6d5f302dcd9f48d64e3 Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.205398 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.211137 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dhbt6"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.232110 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tltth\" (UniqueName: \"kubernetes.io/projected/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-kube-api-access-tltth\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.232415 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-config\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.232443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-combined-ca-bundle\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.247849 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.249137 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.253032 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.253143 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wdnww" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.254086 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.257300 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.337124 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tltth\" (UniqueName: \"kubernetes.io/projected/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-kube-api-access-tltth\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.337694 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.337809 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 
07:11:51.337937 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.338126 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.338343 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6822\" (UniqueName: \"kubernetes.io/projected/fdb3f848-6aef-4c5b-9896-559474cfc73a-kube-api-access-f6822\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.338642 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-config\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.338763 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-combined-ca-bundle\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.339532 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.343187 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-logs\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.345480 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.349028 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.355638 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.358792 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-combined-ca-bundle\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.359110 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.361563 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tltth\" (UniqueName: 
\"kubernetes.io/projected/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-kube-api-access-tltth\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.361743 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-config\") pod \"neutron-db-sync-gvx7m\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.375810 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-sp7rz"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.445742 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.446037 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.446064 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.446124 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.446149 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6822\" (UniqueName: \"kubernetes.io/projected/fdb3f848-6aef-4c5b-9896-559474cfc73a-kube-api-access-f6822\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.446200 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.446225 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-logs\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.447142 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-logs\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.447536 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.447887 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.454988 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.466102 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.467324 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.472008 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.477537 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6822\" (UniqueName: \"kubernetes.io/projected/fdb3f848-6aef-4c5b-9896-559474cfc73a-kube-api-access-f6822\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.500848 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-p9zph"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.541832 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") " pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.549483 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.549581 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.549638 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.549676 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.549753 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5fts\" (UniqueName: \"kubernetes.io/projected/a0427520-e443-4203-8104-3e6ee27f1de2-kube-api-access-p5fts\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.549775 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.549801 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.582394 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.652735 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.652786 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.652828 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.652900 
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5fts\" (UniqueName: \"kubernetes.io/projected/a0427520-e443-4203-8104-3e6ee27f1de2-kube-api-access-p5fts\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.652922 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.652941 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.652979 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.653141 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.653764 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.654850 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.662122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.695705 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.701018 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7vflp"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.702577 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.702875 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.710798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5fts\" (UniqueName: \"kubernetes.io/projected/a0427520-e443-4203-8104-3e6ee27f1de2-kube-api-access-p5fts\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.726511 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.736264 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fqqjk"] Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.784101 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.977229 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dhbt6" event={"ID":"189bc582-c200-4c20-8a6f-2a1446fd11b5","Type":"ContainerStarted","Data":"374e54ab4a048309941118221a07e96b275e295109371c7b2474327f103803a5"} Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.977272 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dhbt6" event={"ID":"189bc582-c200-4c20-8a6f-2a1446fd11b5","Type":"ContainerStarted","Data":"1d17c2d22af07ab8c7e52c7a70d0acfa8f46d8df2d61f6d5f302dcd9f48d64e3"} Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.981264 4971 generic.go:334] "Generic (PLEG): container finished" podID="0cea3a55-06b4-4a0e-ad98-416b3bea7d59" containerID="f2f0f7ec39f817502104c7a9c097cfd3843516ede090995b2bae7b607efc5c1c" exitCode=0 Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.981310 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" event={"ID":"0cea3a55-06b4-4a0e-ad98-416b3bea7d59","Type":"ContainerDied","Data":"f2f0f7ec39f817502104c7a9c097cfd3843516ede090995b2bae7b607efc5c1c"} Mar 20 07:11:51 crc kubenswrapper[4971]: I0320 07:11:51.981337 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" event={"ID":"0cea3a55-06b4-4a0e-ad98-416b3bea7d59","Type":"ContainerStarted","Data":"1f313862442b3e34e188e0fff87a3674450d19bd8007a97d1105ed8ca2d842a7"} Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.006388 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dhbt6" podStartSLOduration=2.006367121 podStartE2EDuration="2.006367121s" podCreationTimestamp="2026-03-20 07:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 07:11:51.998680612 +0000 UTC m=+1333.978554750" watchObservedRunningTime="2026-03-20 07:11:52.006367121 +0000 UTC m=+1333.986241279" Mar 20 07:11:52 crc kubenswrapper[4971]: W0320 07:11:52.042265 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c3565c3_ef3e_4d98_9bca_211a538032c9.slice/crio-5ce204006ba619c97b465076cf2c9087e3249d7b639699c993173e55137443f4 WatchSource:0}: Error finding container 5ce204006ba619c97b465076cf2c9087e3249d7b639699c993173e55137443f4: Status 404 returned error can't find the container with id 5ce204006ba619c97b465076cf2c9087e3249d7b639699c993173e55137443f4 Mar 20 07:11:52 crc kubenswrapper[4971]: W0320 07:11:52.045189 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebe4aafd_3ffe_406f_a0dc_faebf5eeddda.slice/crio-6f4d06b82e67f1d6a8c62d57a281f87352de06bf53fb983b39a3ad95a2ce7099 WatchSource:0}: Error finding container 6f4d06b82e67f1d6a8c62d57a281f87352de06bf53fb983b39a3ad95a2ce7099: Status 404 returned error can't find the container with id 6f4d06b82e67f1d6a8c62d57a281f87352de06bf53fb983b39a3ad95a2ce7099 Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.106234 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.264738 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.268378 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-nb\") pod \"94ae3519-ba25-4ec1-a361-1594d879b2c1\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.268489 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-sb\") pod \"94ae3519-ba25-4ec1-a361-1594d879b2c1\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.268521 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-dns-svc\") pod \"94ae3519-ba25-4ec1-a361-1594d879b2c1\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.268659 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sjlp\" (UniqueName: \"kubernetes.io/projected/94ae3519-ba25-4ec1-a361-1594d879b2c1-kube-api-access-4sjlp\") pod \"94ae3519-ba25-4ec1-a361-1594d879b2c1\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.268725 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-config\") pod \"94ae3519-ba25-4ec1-a361-1594d879b2c1\" (UID: \"94ae3519-ba25-4ec1-a361-1594d879b2c1\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.299310 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/94ae3519-ba25-4ec1-a361-1594d879b2c1-kube-api-access-4sjlp" (OuterVolumeSpecName: "kube-api-access-4sjlp") pod "94ae3519-ba25-4ec1-a361-1594d879b2c1" (UID: "94ae3519-ba25-4ec1-a361-1594d879b2c1"). InnerVolumeSpecName "kube-api-access-4sjlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.331030 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "94ae3519-ba25-4ec1-a361-1594d879b2c1" (UID: "94ae3519-ba25-4ec1-a361-1594d879b2c1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.334268 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "94ae3519-ba25-4ec1-a361-1594d879b2c1" (UID: "94ae3519-ba25-4ec1-a361-1594d879b2c1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.369886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-sb\") pod \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.370003 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-nb\") pod \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.370090 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-config\") pod \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.370123 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn587\" (UniqueName: \"kubernetes.io/projected/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-kube-api-access-hn587\") pod \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.370142 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-dns-svc\") pod \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\" (UID: \"0cea3a55-06b4-4a0e-ad98-416b3bea7d59\") " Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.370482 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.370498 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.370507 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sjlp\" (UniqueName: \"kubernetes.io/projected/94ae3519-ba25-4ec1-a361-1594d879b2c1-kube-api-access-4sjlp\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.380572 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-kube-api-access-hn587" (OuterVolumeSpecName: "kube-api-access-hn587") pod "0cea3a55-06b4-4a0e-ad98-416b3bea7d59" (UID: "0cea3a55-06b4-4a0e-ad98-416b3bea7d59"). InnerVolumeSpecName "kube-api-access-hn587". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.393066 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-config" (OuterVolumeSpecName: "config") pod "94ae3519-ba25-4ec1-a361-1594d879b2c1" (UID: "94ae3519-ba25-4ec1-a361-1594d879b2c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.397298 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "94ae3519-ba25-4ec1-a361-1594d879b2c1" (UID: "94ae3519-ba25-4ec1-a361-1594d879b2c1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.471986 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.472015 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn587\" (UniqueName: \"kubernetes.io/projected/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-kube-api-access-hn587\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.472025 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ae3519-ba25-4ec1-a361-1594d879b2c1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.513359 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0cea3a55-06b4-4a0e-ad98-416b3bea7d59" (UID: "0cea3a55-06b4-4a0e-ad98-416b3bea7d59"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.531401 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0cea3a55-06b4-4a0e-ad98-416b3bea7d59" (UID: "0cea3a55-06b4-4a0e-ad98-416b3bea7d59"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.532348 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0cea3a55-06b4-4a0e-ad98-416b3bea7d59" (UID: "0cea3a55-06b4-4a0e-ad98-416b3bea7d59"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.549793 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-config" (OuterVolumeSpecName: "config") pod "0cea3a55-06b4-4a0e-ad98-416b3bea7d59" (UID: "0cea3a55-06b4-4a0e-ad98-416b3bea7d59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.573485 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.573517 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.573526 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.573533 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cea3a55-06b4-4a0e-ad98-416b3bea7d59-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.721303 
4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.793874 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gvx7m"] Mar 20 07:11:52 crc kubenswrapper[4971]: I0320 07:11:52.888365 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:11:52 crc kubenswrapper[4971]: W0320 07:11:52.905190 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0427520_e443_4203_8104_3e6ee27f1de2.slice/crio-ab1b6472ec69bd1191f614bcfc8201134e5900bdaec24feb31273d5ba4fc40ad WatchSource:0}: Error finding container ab1b6472ec69bd1191f614bcfc8201134e5900bdaec24feb31273d5ba4fc40ad: Status 404 returned error can't find the container with id ab1b6472ec69bd1191f614bcfc8201134e5900bdaec24feb31273d5ba4fc40ad Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.088628 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.088675 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.090436 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0427520-e443-4203-8104-3e6ee27f1de2","Type":"ContainerStarted","Data":"ab1b6472ec69bd1191f614bcfc8201134e5900bdaec24feb31273d5ba4fc40ad"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.101674 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-db-sync-gvx7m" event={"ID":"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c","Type":"ContainerStarted","Data":"a9918192002d0371a6d0f0530e19505b9f9cc145dcee3733f94a19f2905bb307"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.124829 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-gvx7m" podStartSLOduration=2.124795123 podStartE2EDuration="2.124795123s" podCreationTimestamp="2026-03-20 07:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:53.119848461 +0000 UTC m=+1335.099722599" watchObservedRunningTime="2026-03-20 07:11:53.124795123 +0000 UTC m=+1335.104669261" Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.128580 4971 generic.go:334] "Generic (PLEG): container finished" podID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerID="9c101c1db5075d133976c6c08c5bff09fdb630617bab64ea79f5994b47c2626b" exitCode=0 Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.128774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" event={"ID":"cc189b71-a30a-4ae1-9df1-234fc2e85cbc","Type":"ContainerDied","Data":"9c101c1db5075d133976c6c08c5bff09fdb630617bab64ea79f5994b47c2626b"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.128807 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" event={"ID":"cc189b71-a30a-4ae1-9df1-234fc2e85cbc","Type":"ContainerStarted","Data":"f992fa2f60fda4be4c4f6f48c72bd92dbbb5e4fcb375c2ee537fd836007ecbcc"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.138745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03a48c07-f949-4376-874a-8d5758b60a9a","Type":"ContainerStarted","Data":"1e668549de9d6a1794c53064604b079dc82499d00478a42372c389e9b4274445"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 
07:11:53.150856 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7vflp" event={"ID":"7c3565c3-ef3e-4d98-9bca-211a538032c9","Type":"ContainerStarted","Data":"5ce204006ba619c97b465076cf2c9087e3249d7b639699c993173e55137443f4"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.164852 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" event={"ID":"94ae3519-ba25-4ec1-a361-1594d879b2c1","Type":"ContainerDied","Data":"868b772447b9d740b0940430bae406fd5bb719a2e06615ef4eb3c8133576299a"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.164911 4971 scope.go:117] "RemoveContainer" containerID="f1558a3b7c37a5d1491d5f695d239a1bd2492716b053f3dd4832f9dba49646cd" Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.165063 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd86fcf7-jfdhx" Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.170921 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p9zph" event={"ID":"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda","Type":"ContainerStarted","Data":"6f4d06b82e67f1d6a8c62d57a281f87352de06bf53fb983b39a3ad95a2ce7099"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.180493 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdb3f848-6aef-4c5b-9896-559474cfc73a","Type":"ContainerStarted","Data":"6a3a09c948dd039759a4a5944c90dc5ec2295372b7cc04f37d1e98da0626d3ae"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.183480 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fqqjk" event={"ID":"35492989-bd44-45b4-85a7-1acc9569270e","Type":"ContainerStarted","Data":"f833110a111b048fe3a3266332c22dabe98decac6f26c0baf1d7489944b7057e"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.220690 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.221146 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56dddd9f87-c4mf7" event={"ID":"0cea3a55-06b4-4a0e-ad98-416b3bea7d59","Type":"ContainerDied","Data":"1f313862442b3e34e188e0fff87a3674450d19bd8007a97d1105ed8ca2d842a7"} Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.241742 4971 scope.go:117] "RemoveContainer" containerID="f2f0f7ec39f817502104c7a9c097cfd3843516ede090995b2bae7b607efc5c1c" Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.241734 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-jfdhx"] Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.252440 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cd86fcf7-jfdhx"] Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.315683 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-c4mf7"] Mar 20 07:11:53 crc kubenswrapper[4971]: I0320 07:11:53.335719 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56dddd9f87-c4mf7"] Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.170030 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.221199 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.265251 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0427520-e443-4203-8104-3e6ee27f1de2","Type":"ContainerStarted","Data":"16e86dd2f75fe869c0099e5e8f28758c8b0aa8e231f0d52418f308910ad3a71b"} Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.279000 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.292517 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487"} Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.307504 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gvx7m" event={"ID":"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c","Type":"ContainerStarted","Data":"5ac947ff7b65ad4e4ef7fa4b0e909f3f4005fc6b016e4dd5e60be9354d69b94e"} Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.316313 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" event={"ID":"cc189b71-a30a-4ae1-9df1-234fc2e85cbc","Type":"ContainerStarted","Data":"8c2bdbf8384dd4512b9a4df8dcc520bbb3a8d3f712f30b2d3bc0664f96372ba1"} Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.316381 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.350707 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdb3f848-6aef-4c5b-9896-559474cfc73a","Type":"ContainerStarted","Data":"7c9ba1ac583a9ad417e563b8d221c2934a2d87a7edfb0ad30cbb4299b6e9257b"} Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.355585 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" podStartSLOduration=4.353754922 podStartE2EDuration="4.353754922s" podCreationTimestamp="2026-03-20 07:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:54.340342593 +0000 UTC m=+1336.320216741" watchObservedRunningTime="2026-03-20 
07:11:54.353754922 +0000 UTC m=+1336.333629060" Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.755757 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cea3a55-06b4-4a0e-ad98-416b3bea7d59" path="/var/lib/kubelet/pods/0cea3a55-06b4-4a0e-ad98-416b3bea7d59/volumes" Mar 20 07:11:54 crc kubenswrapper[4971]: I0320 07:11:54.756759 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94ae3519-ba25-4ec1-a361-1594d879b2c1" path="/var/lib/kubelet/pods/94ae3519-ba25-4ec1-a361-1594d879b2c1/volumes" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.371844 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0427520-e443-4203-8104-3e6ee27f1de2","Type":"ContainerStarted","Data":"8890fb77d0327a4c268a9debfb5847496f6fda767784c3377b128d49ea6185ff"} Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.371967 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0427520-e443-4203-8104-3e6ee27f1de2" containerName="glance-log" containerID="cri-o://16e86dd2f75fe869c0099e5e8f28758c8b0aa8e231f0d52418f308910ad3a71b" gracePeriod=30 Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.372067 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0427520-e443-4203-8104-3e6ee27f1de2" containerName="glance-httpd" containerID="cri-o://8890fb77d0327a4c268a9debfb5847496f6fda767784c3377b128d49ea6185ff" gracePeriod=30 Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.373754 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdb3f848-6aef-4c5b-9896-559474cfc73a","Type":"ContainerStarted","Data":"c6f8c6bba52814176f2d63fcef53ef5390aac199883bf410759ab23efa5483e6"} Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.373846 4971 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerName="glance-log" containerID="cri-o://7c9ba1ac583a9ad417e563b8d221c2934a2d87a7edfb0ad30cbb4299b6e9257b" gracePeriod=30 Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.373859 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerName="glance-httpd" containerID="cri-o://c6f8c6bba52814176f2d63fcef53ef5390aac199883bf410759ab23efa5483e6" gracePeriod=30 Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.409490 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.409471093 podStartE2EDuration="5.409471093s" podCreationTimestamp="2026-03-20 07:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:55.403562568 +0000 UTC m=+1337.383436706" watchObservedRunningTime="2026-03-20 07:11:55.409471093 +0000 UTC m=+1337.389345231" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.410350 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb"} Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.410753 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a"} Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.410780 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986"} Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.410788 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerStarted","Data":"956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b"} Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.446209 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.446186386 podStartE2EDuration="5.446186386s" podCreationTimestamp="2026-03-20 07:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:11:55.435028402 +0000 UTC m=+1337.414902540" watchObservedRunningTime="2026-03-20 07:11:55.446186386 +0000 UTC m=+1337.426060524" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.493770 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.53409878 podStartE2EDuration="45.493744875s" podCreationTimestamp="2026-03-20 07:11:10 +0000 UTC" firstStartedPulling="2026-03-20 07:11:45.202851154 +0000 UTC m=+1327.182725292" lastFinishedPulling="2026-03-20 07:11:52.162497249 +0000 UTC m=+1334.142371387" observedRunningTime="2026-03-20 07:11:55.477596028 +0000 UTC m=+1337.457470176" watchObservedRunningTime="2026-03-20 07:11:55.493744875 +0000 UTC m=+1337.473619013" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.861520 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-sp7rz"] Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.881746 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-zhrbn"] Mar 20 07:11:55 crc 
kubenswrapper[4971]: E0320 07:11:55.882083 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ae3519-ba25-4ec1-a361-1594d879b2c1" containerName="init" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.882096 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ae3519-ba25-4ec1-a361-1594d879b2c1" containerName="init" Mar 20 07:11:55 crc kubenswrapper[4971]: E0320 07:11:55.882136 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cea3a55-06b4-4a0e-ad98-416b3bea7d59" containerName="init" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.882145 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cea3a55-06b4-4a0e-ad98-416b3bea7d59" containerName="init" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.882294 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ae3519-ba25-4ec1-a361-1594d879b2c1" containerName="init" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.882312 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cea3a55-06b4-4a0e-ad98-416b3bea7d59" containerName="init" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.883134 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.885058 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.907590 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-zhrbn"] Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.956681 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-config\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.956749 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.956834 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4tq4\" (UniqueName: \"kubernetes.io/projected/165557aa-762a-4c22-b3df-030af613518b-kube-api-access-w4tq4\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.956872 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-svc\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " 
pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.956912 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:55 crc kubenswrapper[4971]: I0320 07:11:55.956940 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.061204 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-config\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.061285 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.061354 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4tq4\" (UniqueName: \"kubernetes.io/projected/165557aa-762a-4c22-b3df-030af613518b-kube-api-access-w4tq4\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " 
pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.061403 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-svc\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.061470 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.061520 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.063066 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-config\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.063432 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: 
I0320 07:11:56.063563 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.063791 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-svc\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.064821 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.102086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4tq4\" (UniqueName: \"kubernetes.io/projected/165557aa-762a-4c22-b3df-030af613518b-kube-api-access-w4tq4\") pod \"dnsmasq-dns-759cc7f497-zhrbn\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.220374 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.422720 4971 generic.go:334] "Generic (PLEG): container finished" podID="a0427520-e443-4203-8104-3e6ee27f1de2" containerID="8890fb77d0327a4c268a9debfb5847496f6fda767784c3377b128d49ea6185ff" exitCode=0 Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.422760 4971 generic.go:334] "Generic (PLEG): container finished" podID="a0427520-e443-4203-8104-3e6ee27f1de2" containerID="16e86dd2f75fe869c0099e5e8f28758c8b0aa8e231f0d52418f308910ad3a71b" exitCode=143 Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.422814 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0427520-e443-4203-8104-3e6ee27f1de2","Type":"ContainerDied","Data":"8890fb77d0327a4c268a9debfb5847496f6fda767784c3377b128d49ea6185ff"} Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.422845 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0427520-e443-4203-8104-3e6ee27f1de2","Type":"ContainerDied","Data":"16e86dd2f75fe869c0099e5e8f28758c8b0aa8e231f0d52418f308910ad3a71b"} Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.424645 4971 generic.go:334] "Generic (PLEG): container finished" podID="189bc582-c200-4c20-8a6f-2a1446fd11b5" containerID="374e54ab4a048309941118221a07e96b275e295109371c7b2474327f103803a5" exitCode=0 Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.424755 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dhbt6" event={"ID":"189bc582-c200-4c20-8a6f-2a1446fd11b5","Type":"ContainerDied","Data":"374e54ab4a048309941118221a07e96b275e295109371c7b2474327f103803a5"} Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.435098 4971 generic.go:334] "Generic (PLEG): container finished" podID="fdb3f848-6aef-4c5b-9896-559474cfc73a" 
containerID="c6f8c6bba52814176f2d63fcef53ef5390aac199883bf410759ab23efa5483e6" exitCode=0 Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.435130 4971 generic.go:334] "Generic (PLEG): container finished" podID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerID="7c9ba1ac583a9ad417e563b8d221c2934a2d87a7edfb0ad30cbb4299b6e9257b" exitCode=143 Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.435189 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdb3f848-6aef-4c5b-9896-559474cfc73a","Type":"ContainerDied","Data":"c6f8c6bba52814176f2d63fcef53ef5390aac199883bf410759ab23efa5483e6"} Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.435253 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdb3f848-6aef-4c5b-9896-559474cfc73a","Type":"ContainerDied","Data":"7c9ba1ac583a9ad417e563b8d221c2934a2d87a7edfb0ad30cbb4299b6e9257b"} Mar 20 07:11:56 crc kubenswrapper[4971]: I0320 07:11:56.435821 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" podUID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerName="dnsmasq-dns" containerID="cri-o://8c2bdbf8384dd4512b9a4df8dcc520bbb3a8d3f712f30b2d3bc0664f96372ba1" gracePeriod=10 Mar 20 07:11:57 crc kubenswrapper[4971]: I0320 07:11:57.459252 4971 generic.go:334] "Generic (PLEG): container finished" podID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerID="8c2bdbf8384dd4512b9a4df8dcc520bbb3a8d3f712f30b2d3bc0664f96372ba1" exitCode=0 Mar 20 07:11:57 crc kubenswrapper[4971]: I0320 07:11:57.459301 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" event={"ID":"cc189b71-a30a-4ae1-9df1-234fc2e85cbc","Type":"ContainerDied","Data":"8c2bdbf8384dd4512b9a4df8dcc520bbb3a8d3f712f30b2d3bc0664f96372ba1"} Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.131693 4971 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566512-7bqgd"] Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.133178 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566512-7bqgd" Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.137451 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.138836 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.139504 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.145314 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566512-7bqgd"] Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.253156 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9kb\" (UniqueName: \"kubernetes.io/projected/3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e-kube-api-access-sr9kb\") pod \"auto-csr-approver-29566512-7bqgd\" (UID: \"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e\") " pod="openshift-infra/auto-csr-approver-29566512-7bqgd" Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.355209 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9kb\" (UniqueName: \"kubernetes.io/projected/3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e-kube-api-access-sr9kb\") pod \"auto-csr-approver-29566512-7bqgd\" (UID: \"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e\") " pod="openshift-infra/auto-csr-approver-29566512-7bqgd" Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.381702 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9kb\" 
(UniqueName: \"kubernetes.io/projected/3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e-kube-api-access-sr9kb\") pod \"auto-csr-approver-29566512-7bqgd\" (UID: \"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e\") " pod="openshift-infra/auto-csr-approver-29566512-7bqgd" Mar 20 07:12:00 crc kubenswrapper[4971]: I0320 07:12:00.463417 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566512-7bqgd" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.727761 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.789759 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-credential-keys\") pod \"189bc582-c200-4c20-8a6f-2a1446fd11b5\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.789845 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-fernet-keys\") pod \"189bc582-c200-4c20-8a6f-2a1446fd11b5\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.789916 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scqxn\" (UniqueName: \"kubernetes.io/projected/189bc582-c200-4c20-8a6f-2a1446fd11b5-kube-api-access-scqxn\") pod \"189bc582-c200-4c20-8a6f-2a1446fd11b5\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.789966 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-scripts\") pod \"189bc582-c200-4c20-8a6f-2a1446fd11b5\" (UID: 
\"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.790085 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-combined-ca-bundle\") pod \"189bc582-c200-4c20-8a6f-2a1446fd11b5\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.790179 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-config-data\") pod \"189bc582-c200-4c20-8a6f-2a1446fd11b5\" (UID: \"189bc582-c200-4c20-8a6f-2a1446fd11b5\") " Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.816096 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "189bc582-c200-4c20-8a6f-2a1446fd11b5" (UID: "189bc582-c200-4c20-8a6f-2a1446fd11b5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.817300 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "189bc582-c200-4c20-8a6f-2a1446fd11b5" (UID: "189bc582-c200-4c20-8a6f-2a1446fd11b5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.818085 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-scripts" (OuterVolumeSpecName: "scripts") pod "189bc582-c200-4c20-8a6f-2a1446fd11b5" (UID: "189bc582-c200-4c20-8a6f-2a1446fd11b5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.826674 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "189bc582-c200-4c20-8a6f-2a1446fd11b5" (UID: "189bc582-c200-4c20-8a6f-2a1446fd11b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.842657 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189bc582-c200-4c20-8a6f-2a1446fd11b5-kube-api-access-scqxn" (OuterVolumeSpecName: "kube-api-access-scqxn") pod "189bc582-c200-4c20-8a6f-2a1446fd11b5" (UID: "189bc582-c200-4c20-8a6f-2a1446fd11b5"). InnerVolumeSpecName "kube-api-access-scqxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.850352 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-config-data" (OuterVolumeSpecName: "config-data") pod "189bc582-c200-4c20-8a6f-2a1446fd11b5" (UID: "189bc582-c200-4c20-8a6f-2a1446fd11b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.892817 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.892869 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.892881 4971 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.892890 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.892900 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scqxn\" (UniqueName: \"kubernetes.io/projected/189bc582-c200-4c20-8a6f-2a1446fd11b5-kube-api-access-scqxn\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:01 crc kubenswrapper[4971]: I0320 07:12:01.892913 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bc582-c200-4c20-8a6f-2a1446fd11b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.517277 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dhbt6" event={"ID":"189bc582-c200-4c20-8a6f-2a1446fd11b5","Type":"ContainerDied","Data":"1d17c2d22af07ab8c7e52c7a70d0acfa8f46d8df2d61f6d5f302dcd9f48d64e3"} Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 
07:12:02.517643 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d17c2d22af07ab8c7e52c7a70d0acfa8f46d8df2d61f6d5f302dcd9f48d64e3" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.517355 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dhbt6" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.825333 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dhbt6"] Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.833799 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dhbt6"] Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.910556 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rvpqx"] Mar 20 07:12:02 crc kubenswrapper[4971]: E0320 07:12:02.910912 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189bc582-c200-4c20-8a6f-2a1446fd11b5" containerName="keystone-bootstrap" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.910929 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="189bc582-c200-4c20-8a6f-2a1446fd11b5" containerName="keystone-bootstrap" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.911111 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="189bc582-c200-4c20-8a6f-2a1446fd11b5" containerName="keystone-bootstrap" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.911751 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.918323 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.918426 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.918334 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.918788 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vfsj2" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.919094 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 07:12:02 crc kubenswrapper[4971]: I0320 07:12:02.935253 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rvpqx"] Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.012633 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-credential-keys\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.012750 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjzqz\" (UniqueName: \"kubernetes.io/projected/daa1861f-a138-41b1-b5c4-3d2fa15611ea-kube-api-access-kjzqz\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.012845 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-combined-ca-bundle\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.012960 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-fernet-keys\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.012993 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-scripts\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.013081 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-config-data\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.114645 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-combined-ca-bundle\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.114715 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-fernet-keys\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.114738 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-scripts\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.114781 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-config-data\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.114838 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-credential-keys\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.114884 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjzqz\" (UniqueName: \"kubernetes.io/projected/daa1861f-a138-41b1-b5c4-3d2fa15611ea-kube-api-access-kjzqz\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.121273 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-scripts\") pod \"keystone-bootstrap-rvpqx\" (UID: 
\"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx"
Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.123173 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-credential-keys\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx"
Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.124901 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-combined-ca-bundle\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx"
Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.131672 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-fernet-keys\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx"
Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.132476 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjzqz\" (UniqueName: \"kubernetes.io/projected/daa1861f-a138-41b1-b5c4-3d2fa15611ea-kube-api-access-kjzqz\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx"
Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.136476 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-config-data\") pod \"keystone-bootstrap-rvpqx\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") " pod="openstack/keystone-bootstrap-rvpqx"
Mar 20 07:12:03 crc kubenswrapper[4971]: I0320 07:12:03.239581 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rvpqx"
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.753852 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189bc582-c200-4c20-8a6f-2a1446fd11b5" path="/var/lib/kubelet/pods/189bc582-c200-4c20-8a6f-2a1446fd11b5/volumes"
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.896921 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz"
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.905932 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.969782 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-sb\") pod \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") "
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.969883 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-dns-svc\") pod \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") "
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.969912 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-nb\") pod \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") "
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.969935 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-config\") pod \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") "
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.969977 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppdz8\" (UniqueName: \"kubernetes.io/projected/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-kube-api-access-ppdz8\") pod \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\" (UID: \"cc189b71-a30a-4ae1-9df1-234fc2e85cbc\") "
Mar 20 07:12:04 crc kubenswrapper[4971]: I0320 07:12:04.978031 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-kube-api-access-ppdz8" (OuterVolumeSpecName: "kube-api-access-ppdz8") pod "cc189b71-a30a-4ae1-9df1-234fc2e85cbc" (UID: "cc189b71-a30a-4ae1-9df1-234fc2e85cbc"). InnerVolumeSpecName "kube-api-access-ppdz8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.027480 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc189b71-a30a-4ae1-9df1-234fc2e85cbc" (UID: "cc189b71-a30a-4ae1-9df1-234fc2e85cbc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.038806 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-config" (OuterVolumeSpecName: "config") pod "cc189b71-a30a-4ae1-9df1-234fc2e85cbc" (UID: "cc189b71-a30a-4ae1-9df1-234fc2e85cbc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.051445 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc189b71-a30a-4ae1-9df1-234fc2e85cbc" (UID: "cc189b71-a30a-4ae1-9df1-234fc2e85cbc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.058445 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc189b71-a30a-4ae1-9df1-234fc2e85cbc" (UID: "cc189b71-a30a-4ae1-9df1-234fc2e85cbc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.071372 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-logs\") pod \"fdb3f848-6aef-4c5b-9896-559474cfc73a\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") "
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.071505 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-combined-ca-bundle\") pod \"fdb3f848-6aef-4c5b-9896-559474cfc73a\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") "
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.071586 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6822\" (UniqueName: \"kubernetes.io/projected/fdb3f848-6aef-4c5b-9896-559474cfc73a-kube-api-access-f6822\") pod \"fdb3f848-6aef-4c5b-9896-559474cfc73a\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") "
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.071711 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-config-data\") pod \"fdb3f848-6aef-4c5b-9896-559474cfc73a\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") "
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.071751 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-httpd-run\") pod \"fdb3f848-6aef-4c5b-9896-559474cfc73a\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") "
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.071778 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-scripts\") pod \"fdb3f848-6aef-4c5b-9896-559474cfc73a\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") "
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.071883 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"fdb3f848-6aef-4c5b-9896-559474cfc73a\" (UID: \"fdb3f848-6aef-4c5b-9896-559474cfc73a\") "
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.072086 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-logs" (OuterVolumeSpecName: "logs") pod "fdb3f848-6aef-4c5b-9896-559474cfc73a" (UID: "fdb3f848-6aef-4c5b-9896-559474cfc73a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.072371 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fdb3f848-6aef-4c5b-9896-559474cfc73a" (UID: "fdb3f848-6aef-4c5b-9896-559474cfc73a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.072954 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppdz8\" (UniqueName: \"kubernetes.io/projected/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-kube-api-access-ppdz8\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.072978 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.072995 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb3f848-6aef-4c5b-9896-559474cfc73a-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.073010 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.073021 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.073034 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.073047 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc189b71-a30a-4ae1-9df1-234fc2e85cbc-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.075995 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-scripts" (OuterVolumeSpecName: "scripts") pod "fdb3f848-6aef-4c5b-9896-559474cfc73a" (UID: "fdb3f848-6aef-4c5b-9896-559474cfc73a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.076761 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "fdb3f848-6aef-4c5b-9896-559474cfc73a" (UID: "fdb3f848-6aef-4c5b-9896-559474cfc73a"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.077588 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb3f848-6aef-4c5b-9896-559474cfc73a-kube-api-access-f6822" (OuterVolumeSpecName: "kube-api-access-f6822") pod "fdb3f848-6aef-4c5b-9896-559474cfc73a" (UID: "fdb3f848-6aef-4c5b-9896-559474cfc73a"). InnerVolumeSpecName "kube-api-access-f6822". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.095122 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdb3f848-6aef-4c5b-9896-559474cfc73a" (UID: "fdb3f848-6aef-4c5b-9896-559474cfc73a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.117867 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-config-data" (OuterVolumeSpecName: "config-data") pod "fdb3f848-6aef-4c5b-9896-559474cfc73a" (UID: "fdb3f848-6aef-4c5b-9896-559474cfc73a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.175869 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6822\" (UniqueName: \"kubernetes.io/projected/fdb3f848-6aef-4c5b-9896-559474cfc73a-kube-api-access-f6822\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.175909 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.175922 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.175971 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.175991 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb3f848-6aef-4c5b-9896-559474cfc73a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.194743 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.278427 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.557988 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" event={"ID":"cc189b71-a30a-4ae1-9df1-234fc2e85cbc","Type":"ContainerDied","Data":"f992fa2f60fda4be4c4f6f48c72bd92dbbb5e4fcb375c2ee537fd836007ecbcc"}
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.558084 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.558916 4971 scope.go:117] "RemoveContainer" containerID="8c2bdbf8384dd4512b9a4df8dcc520bbb3a8d3f712f30b2d3bc0664f96372ba1"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.564068 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdb3f848-6aef-4c5b-9896-559474cfc73a","Type":"ContainerDied","Data":"6a3a09c948dd039759a4a5944c90dc5ec2295372b7cc04f37d1e98da0626d3ae"}
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.564181 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.608118 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-sp7rz"]
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.625970 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f778bb97-sp7rz"]
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.633810 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.642683 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.652857 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 07:12:05 crc kubenswrapper[4971]: E0320 07:12:05.653299 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerName="glance-httpd"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.653314 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerName="glance-httpd"
Mar 20 07:12:05 crc kubenswrapper[4971]: E0320 07:12:05.653328 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerName="init"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.653335 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerName="init"
Mar 20 07:12:05 crc kubenswrapper[4971]: E0320 07:12:05.653360 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerName="dnsmasq-dns"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.653368 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerName="dnsmasq-dns"
Mar 20 07:12:05 crc kubenswrapper[4971]: E0320 07:12:05.653379 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerName="glance-log"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.653384 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerName="glance-log"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.653587 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerName="glance-log"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.653661 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" containerName="glance-httpd"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.653674 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerName="dnsmasq-dns"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.654588 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.657916 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.658155 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.673311 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.788869 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.788917 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.788951 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.789100 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-config-data\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.789176 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpq26\" (UniqueName: \"kubernetes.io/projected/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-kube-api-access-zpq26\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.789241 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.789269 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-logs\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.789617 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-scripts\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.820052 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f778bb97-sp7rz" podUID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: i/o timeout"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.891936 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892011 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892071 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892114 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-config-data\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892155 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpq26\" (UniqueName: \"kubernetes.io/projected/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-kube-api-access-zpq26\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892194 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892239 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-logs\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892335 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-scripts\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892485 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.892785 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-logs\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.893983 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.899048 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.900213 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.901185 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-config-data\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.904850 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-scripts\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.919928 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.931158 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpq26\" (UniqueName: \"kubernetes.io/projected/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-kube-api-access-zpq26\") pod \"glance-default-external-api-0\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:12:05 crc kubenswrapper[4971]: I0320 07:12:05.985492 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 07:12:06 crc kubenswrapper[4971]: I0320 07:12:06.746966 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc189b71-a30a-4ae1-9df1-234fc2e85cbc" path="/var/lib/kubelet/pods/cc189b71-a30a-4ae1-9df1-234fc2e85cbc/volumes"
Mar 20 07:12:06 crc kubenswrapper[4971]: I0320 07:12:06.748451 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb3f848-6aef-4c5b-9896-559474cfc73a" path="/var/lib/kubelet/pods/fdb3f848-6aef-4c5b-9896-559474cfc73a/volumes"
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.319468 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.413618 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-logs\") pod \"a0427520-e443-4203-8104-3e6ee27f1de2\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") "
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.413704 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-combined-ca-bundle\") pod \"a0427520-e443-4203-8104-3e6ee27f1de2\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") "
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.413736 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a0427520-e443-4203-8104-3e6ee27f1de2\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") "
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.413765 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5fts\" (UniqueName: \"kubernetes.io/projected/a0427520-e443-4203-8104-3e6ee27f1de2-kube-api-access-p5fts\") pod \"a0427520-e443-4203-8104-3e6ee27f1de2\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") "
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.413925 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-config-data\") pod \"a0427520-e443-4203-8104-3e6ee27f1de2\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") "
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.413954 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-scripts\") pod \"a0427520-e443-4203-8104-3e6ee27f1de2\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") "
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.413985 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-httpd-run\") pod \"a0427520-e443-4203-8104-3e6ee27f1de2\" (UID: \"a0427520-e443-4203-8104-3e6ee27f1de2\") "
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.414346 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-logs" (OuterVolumeSpecName: "logs") pod "a0427520-e443-4203-8104-3e6ee27f1de2" (UID: "a0427520-e443-4203-8104-3e6ee27f1de2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.414963 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0427520-e443-4203-8104-3e6ee27f1de2" (UID: "a0427520-e443-4203-8104-3e6ee27f1de2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.419278 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-scripts" (OuterVolumeSpecName: "scripts") pod "a0427520-e443-4203-8104-3e6ee27f1de2" (UID: "a0427520-e443-4203-8104-3e6ee27f1de2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.425265 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0427520-e443-4203-8104-3e6ee27f1de2-kube-api-access-p5fts" (OuterVolumeSpecName: "kube-api-access-p5fts") pod "a0427520-e443-4203-8104-3e6ee27f1de2" (UID: "a0427520-e443-4203-8104-3e6ee27f1de2"). InnerVolumeSpecName "kube-api-access-p5fts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.425381 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a0427520-e443-4203-8104-3e6ee27f1de2" (UID: "a0427520-e443-4203-8104-3e6ee27f1de2"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.456664 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0427520-e443-4203-8104-3e6ee27f1de2" (UID: "a0427520-e443-4203-8104-3e6ee27f1de2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.464201 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-config-data" (OuterVolumeSpecName: "config-data") pod "a0427520-e443-4203-8104-3e6ee27f1de2" (UID: "a0427520-e443-4203-8104-3e6ee27f1de2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.515803 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.515847 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.515860 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.515873 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0427520-e443-4203-8104-3e6ee27f1de2-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.515884 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0427520-e443-4203-8104-3e6ee27f1de2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.515923 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.515935 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5fts\" (UniqueName: \"kubernetes.io/projected/a0427520-e443-4203-8104-3e6ee27f1de2-kube-api-access-p5fts\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.534809 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.617524 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.633784 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0427520-e443-4203-8104-3e6ee27f1de2","Type":"ContainerDied","Data":"ab1b6472ec69bd1191f614bcfc8201134e5900bdaec24feb31273d5ba4fc40ad"}
Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.634205 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 07:12:12 crc kubenswrapper[4971]: E0320 07:12:12.668734 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:718ce7d3397ab3eef55d3be4add819266aff70eb4c4192b3a9ef4ee37b0e9d2f"
Mar 20 07:12:12 crc kubenswrapper[4971]: E0320 07:12:12.669121 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:718ce7d3397ab3eef55d3be4add819266aff70eb4c4192b3a9ef4ee37b0e9d2f,Command:[/bin/bash],Args:[-c
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59ch589h67dhf7h7ch68ch645h5dfh5f9hf6h5ffh58fh76h55fh5d9h97h64ch5bdh96hf8hf7h74hb9h4h7fh5bdh56ch59fh647h679h7h57dq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sg752,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(03a48c07-f949-4376-874a-8d5758b60a9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.696875 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.711827 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.721457 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:12:12 crc kubenswrapper[4971]: E0320 07:12:12.721938 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0427520-e443-4203-8104-3e6ee27f1de2" containerName="glance-log" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.721956 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0427520-e443-4203-8104-3e6ee27f1de2" containerName="glance-log" Mar 20 07:12:12 crc kubenswrapper[4971]: E0320 07:12:12.721979 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a0427520-e443-4203-8104-3e6ee27f1de2" containerName="glance-httpd" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.721987 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0427520-e443-4203-8104-3e6ee27f1de2" containerName="glance-httpd" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.722183 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0427520-e443-4203-8104-3e6ee27f1de2" containerName="glance-log" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.722204 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0427520-e443-4203-8104-3e6ee27f1de2" containerName="glance-httpd" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.723142 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.725188 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.725541 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.757949 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0427520-e443-4203-8104-3e6ee27f1de2" path="/var/lib/kubelet/pods/a0427520-e443-4203-8104-3e6ee27f1de2/volumes" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.758804 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.821408 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.821468 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.821502 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.821780 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6nvq\" (UniqueName: \"kubernetes.io/projected/8ed5de2f-85e8-45ca-8c0b-0c646167168f-kube-api-access-c6nvq\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.821841 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.821919 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.822058 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.822097 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.924144 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.924190 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.924285 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.924332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.924353 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.924389 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6nvq\" (UniqueName: \"kubernetes.io/projected/8ed5de2f-85e8-45ca-8c0b-0c646167168f-kube-api-access-c6nvq\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.924407 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.924430 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc 
kubenswrapper[4971]: I0320 07:12:12.925218 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.925403 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.925830 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.930504 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.930811 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.935136 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.935570 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.939999 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6nvq\" (UniqueName: \"kubernetes.io/projected/8ed5de2f-85e8-45ca-8c0b-0c646167168f-kube-api-access-c6nvq\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:12 crc kubenswrapper[4971]: I0320 07:12:12.947585 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:12:13 crc kubenswrapper[4971]: I0320 07:12:13.063486 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:13 crc kubenswrapper[4971]: I0320 07:12:13.119648 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-zhrbn"] Mar 20 07:12:13 crc kubenswrapper[4971]: I0320 07:12:13.646794 4971 generic.go:334] "Generic (PLEG): container finished" podID="4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c" containerID="5ac947ff7b65ad4e4ef7fa4b0e909f3f4005fc6b016e4dd5e60be9354d69b94e" exitCode=0 Mar 20 07:12:13 crc kubenswrapper[4971]: I0320 07:12:13.646849 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gvx7m" event={"ID":"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c","Type":"ContainerDied","Data":"5ac947ff7b65ad4e4ef7fa4b0e909f3f4005fc6b016e4dd5e60be9354d69b94e"} Mar 20 07:12:13 crc kubenswrapper[4971]: I0320 07:12:13.822834 4971 scope.go:117] "RemoveContainer" containerID="9c101c1db5075d133976c6c08c5bff09fdb630617bab64ea79f5994b47c2626b" Mar 20 07:12:13 crc kubenswrapper[4971]: E0320 07:12:13.829075 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 20 07:12:13 crc kubenswrapper[4971]: E0320 07:12:13.829332 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2lcr6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-p9zph_openstack(ebe4aafd-3ffe-406f-a0dc-faebf5eeddda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:12:13 crc kubenswrapper[4971]: E0320 07:12:13.830732 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-p9zph" podUID="ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" Mar 20 07:12:13 crc kubenswrapper[4971]: I0320 07:12:13.992519 4971 scope.go:117] "RemoveContainer" containerID="c6f8c6bba52814176f2d63fcef53ef5390aac199883bf410759ab23efa5483e6" Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.054063 4971 scope.go:117] "RemoveContainer" containerID="7c9ba1ac583a9ad417e563b8d221c2934a2d87a7edfb0ad30cbb4299b6e9257b" Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.086816 4971 scope.go:117] "RemoveContainer" containerID="8890fb77d0327a4c268a9debfb5847496f6fda767784c3377b128d49ea6185ff" Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.112294 4971 scope.go:117] "RemoveContainer" containerID="16e86dd2f75fe869c0099e5e8f28758c8b0aa8e231f0d52418f308910ad3a71b" Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.407526 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.445477 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rvpqx"] Mar 20 07:12:14 crc kubenswrapper[4971]: W0320 07:12:14.450842 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaa1861f_a138_41b1_b5c4_3d2fa15611ea.slice/crio-d8a54eaca7e4451001c21e6f49baa7e70d0161d5fc873ff4f04ba81f03ce1335 WatchSource:0}: Error finding container d8a54eaca7e4451001c21e6f49baa7e70d0161d5fc873ff4f04ba81f03ce1335: Status 404 returned error can't find the container with id d8a54eaca7e4451001c21e6f49baa7e70d0161d5fc873ff4f04ba81f03ce1335 Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.452343 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566512-7bqgd"] Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.698963 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rvpqx" event={"ID":"daa1861f-a138-41b1-b5c4-3d2fa15611ea","Type":"ContainerStarted","Data":"821ce76436237ce7e9434e6e2e31435cd0f21237c27264d9054c57be93e6161a"} Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.699314 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rvpqx" event={"ID":"daa1861f-a138-41b1-b5c4-3d2fa15611ea","Type":"ContainerStarted","Data":"d8a54eaca7e4451001c21e6f49baa7e70d0161d5fc873ff4f04ba81f03ce1335"} Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.706392 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7vflp" event={"ID":"7c3565c3-ef3e-4d98-9bca-211a538032c9","Type":"ContainerStarted","Data":"f6ed6c2bd98279cdf6b3f41b7c35c157a45f901ea5da3911570bab6705696124"} Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.719211 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rvpqx" podStartSLOduration=12.719195597 podStartE2EDuration="12.719195597s" podCreationTimestamp="2026-03-20 07:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:14.715815984 +0000 UTC 
m=+1356.695690142" watchObservedRunningTime="2026-03-20 07:12:14.719195597 +0000 UTC m=+1356.699069725" Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.719751 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fqqjk" event={"ID":"35492989-bd44-45b4-85a7-1acc9569270e","Type":"ContainerStarted","Data":"f2c6e66d0b0a91e73a33dda62aeb723853c5bbc9009e00a71519161cd37e2c4a"} Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.722315 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566512-7bqgd" event={"ID":"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e","Type":"ContainerStarted","Data":"c3494901d612a09a842ed8c1a5df0539673ea6073ccf74a18cb2cd4f399f2397"} Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.725571 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed5de2f-85e8-45ca-8c0b-0c646167168f","Type":"ContainerStarted","Data":"7b305347201c520358a095844517b1d49d32248695d3ee60c5c56af1b87dd612"} Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.733895 4971 generic.go:334] "Generic (PLEG): container finished" podID="165557aa-762a-4c22-b3df-030af613518b" containerID="6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741" exitCode=0 Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.735102 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" event={"ID":"165557aa-762a-4c22-b3df-030af613518b","Type":"ContainerDied","Data":"6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741"} Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.735145 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" event={"ID":"165557aa-762a-4c22-b3df-030af613518b","Type":"ContainerStarted","Data":"4d55d38543f2850b3914aa97376cf00cb38d0678f7867063f06ab8ba227ee9de"} Mar 20 07:12:14 crc kubenswrapper[4971]: E0320 07:12:14.741016 
4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-p9zph" podUID="ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.754779 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7vflp" podStartSLOduration=2.965640331 podStartE2EDuration="24.754728091s" podCreationTimestamp="2026-03-20 07:11:50 +0000 UTC" firstStartedPulling="2026-03-20 07:11:52.045970935 +0000 UTC m=+1334.025845073" lastFinishedPulling="2026-03-20 07:12:13.835058665 +0000 UTC m=+1355.814932833" observedRunningTime="2026-03-20 07:12:14.736013121 +0000 UTC m=+1356.715887269" watchObservedRunningTime="2026-03-20 07:12:14.754728091 +0000 UTC m=+1356.734602249" Mar 20 07:12:14 crc kubenswrapper[4971]: I0320 07:12:14.797276 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fqqjk" podStartSLOduration=4.223816548 podStartE2EDuration="24.797249556s" podCreationTimestamp="2026-03-20 07:11:50 +0000 UTC" firstStartedPulling="2026-03-20 07:11:52.099314356 +0000 UTC m=+1334.079188484" lastFinishedPulling="2026-03-20 07:12:12.672747354 +0000 UTC m=+1354.652621492" observedRunningTime="2026-03-20 07:12:14.752436914 +0000 UTC m=+1356.732311062" watchObservedRunningTime="2026-03-20 07:12:14.797249556 +0000 UTC m=+1356.777123704" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.040256 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.071074 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-combined-ca-bundle\") pod \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.071381 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tltth\" (UniqueName: \"kubernetes.io/projected/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-kube-api-access-tltth\") pod \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.071873 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-config\") pod \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\" (UID: \"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c\") " Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.080284 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-kube-api-access-tltth" (OuterVolumeSpecName: "kube-api-access-tltth") pod "4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c" (UID: "4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c"). InnerVolumeSpecName "kube-api-access-tltth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.108709 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-config" (OuterVolumeSpecName: "config") pod "4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c" (UID: "4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.112836 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c" (UID: "4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.144717 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.174979 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tltth\" (UniqueName: \"kubernetes.io/projected/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-kube-api-access-tltth\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.175029 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.175040 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.753088 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" event={"ID":"165557aa-762a-4c22-b3df-030af613518b","Type":"ContainerStarted","Data":"fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336"} Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.753449 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:12:15 crc 
kubenswrapper[4971]: I0320 07:12:15.762643 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566512-7bqgd" event={"ID":"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e","Type":"ContainerStarted","Data":"f81aecf5a1a3f9063ecfd9a721f11e941db08bb5f827ff314bb3c535b94bb02d"} Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.766381 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gvx7m" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.766542 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gvx7m" event={"ID":"4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c","Type":"ContainerDied","Data":"a9918192002d0371a6d0f0530e19505b9f9cc145dcee3733f94a19f2905bb307"} Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.766595 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9918192002d0371a6d0f0530e19505b9f9cc145dcee3733f94a19f2905bb307" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.782333 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" podStartSLOduration=20.78230981 podStartE2EDuration="20.78230981s" podCreationTimestamp="2026-03-20 07:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:15.774889958 +0000 UTC m=+1357.754764086" watchObservedRunningTime="2026-03-20 07:12:15.78230981 +0000 UTC m=+1357.762183948" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.791190 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed5de2f-85e8-45ca-8c0b-0c646167168f","Type":"ContainerStarted","Data":"f20ece9bc6026beb1e2b728e8620ad88489184b8b8369888b69279374937d9eb"} Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.807881 4971 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566512-7bqgd" podStartSLOduration=14.97397372 podStartE2EDuration="15.807858468s" podCreationTimestamp="2026-03-20 07:12:00 +0000 UTC" firstStartedPulling="2026-03-20 07:12:14.465918811 +0000 UTC m=+1356.445792959" lastFinishedPulling="2026-03-20 07:12:15.299803549 +0000 UTC m=+1357.279677707" observedRunningTime="2026-03-20 07:12:15.802342382 +0000 UTC m=+1357.782216520" watchObservedRunningTime="2026-03-20 07:12:15.807858468 +0000 UTC m=+1357.787732616" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.813814 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03a48c07-f949-4376-874a-8d5758b60a9a","Type":"ContainerStarted","Data":"aece527112bdee1e807769fbd04b183b04d566ead4cf6b30a2d975bcd179d608"} Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.822549 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"493ce1ed-38cb-48e6-b8f4-4aaed4934de8","Type":"ContainerStarted","Data":"ba80d7d23d36fb024375e89ebd594cea0ab9040052c3984f0193e33c97b7acb9"} Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.837274 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.837252811 podStartE2EDuration="3.837252811s" podCreationTimestamp="2026-03-20 07:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:15.826729032 +0000 UTC m=+1357.806603190" watchObservedRunningTime="2026-03-20 07:12:15.837252811 +0000 UTC m=+1357.817126949" Mar 20 07:12:15 crc kubenswrapper[4971]: I0320 07:12:15.988666 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-zhrbn"] Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.015741 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-6d67d65cb9-jknv9"] Mar 20 07:12:16 crc kubenswrapper[4971]: E0320 07:12:16.016145 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c" containerName="neutron-db-sync" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.016159 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c" containerName="neutron-db-sync" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.016366 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c" containerName="neutron-db-sync" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.017233 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.025451 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-jknv9"] Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.064373 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66b6544788-qm44x"] Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.066987 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.080531 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.080781 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.081031 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.081222 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-w2rgc" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.095617 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66b6544788-qm44x"] Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.120709 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127201 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-combined-ca-bundle\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127296 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127317 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-httpd-config\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127356 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk7ph\" (UniqueName: \"kubernetes.io/projected/b062fd41-7667-4a30-b4c9-817bfbb80f1b-kube-api-access-mk7ph\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127419 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-config\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127446 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-config\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127492 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-ovndb-tls-certs\") pod 
\"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127551 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127640 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.127681 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdxdr\" (UniqueName: \"kubernetes.io/projected/601f5587-0478-4903-9909-ebe0dee36539-kube-api-access-wdxdr\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.229781 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.229836 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdxdr\" (UniqueName: \"kubernetes.io/projected/601f5587-0478-4903-9909-ebe0dee36539-kube-api-access-wdxdr\") pod 
\"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.229890 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.229923 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-combined-ca-bundle\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.229953 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.229971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-httpd-config\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.229992 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk7ph\" (UniqueName: \"kubernetes.io/projected/b062fd41-7667-4a30-b4c9-817bfbb80f1b-kube-api-access-mk7ph\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: 
\"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.230019 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-config\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.230038 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-config\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.230062 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-ovndb-tls-certs\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.230093 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.232085 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 
crc kubenswrapper[4971]: I0320 07:12:16.232343 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.232396 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-config\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.233207 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-svc\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.233274 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.238576 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-combined-ca-bundle\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.239875 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-ovndb-tls-certs\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.248120 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-config\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.252382 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk7ph\" (UniqueName: \"kubernetes.io/projected/b062fd41-7667-4a30-b4c9-817bfbb80f1b-kube-api-access-mk7ph\") pod \"dnsmasq-dns-6d67d65cb9-jknv9\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.254748 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdxdr\" (UniqueName: \"kubernetes.io/projected/601f5587-0478-4903-9909-ebe0dee36539-kube-api-access-wdxdr\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.266108 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-httpd-config\") pod \"neutron-66b6544788-qm44x\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") " pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.359551 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.533930 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.833035 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed5de2f-85e8-45ca-8c0b-0c646167168f","Type":"ContainerStarted","Data":"a62d88168869a6e92af414f4a661b2d5e26b67f260c2d5ae443b2b09fc993a58"} Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.837393 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"493ce1ed-38cb-48e6-b8f4-4aaed4934de8","Type":"ContainerStarted","Data":"ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b"} Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.837456 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"493ce1ed-38cb-48e6-b8f4-4aaed4934de8","Type":"ContainerStarted","Data":"5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033"} Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.839259 4971 generic.go:334] "Generic (PLEG): container finished" podID="3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e" containerID="f81aecf5a1a3f9063ecfd9a721f11e941db08bb5f827ff314bb3c535b94bb02d" exitCode=0 Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.839578 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566512-7bqgd" event={"ID":"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e","Type":"ContainerDied","Data":"f81aecf5a1a3f9063ecfd9a721f11e941db08bb5f827ff314bb3c535b94bb02d"} Mar 20 07:12:16 crc kubenswrapper[4971]: W0320 07:12:16.912238 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb062fd41_7667_4a30_b4c9_817bfbb80f1b.slice/crio-07fa1bf6bec756664b74ba05483388e474164a9f3a745c2f00e551e19b8dcf3b WatchSource:0}: Error finding container 07fa1bf6bec756664b74ba05483388e474164a9f3a745c2f00e551e19b8dcf3b: Status 404 returned error can't find the container with id 07fa1bf6bec756664b74ba05483388e474164a9f3a745c2f00e551e19b8dcf3b Mar 20 07:12:16 crc kubenswrapper[4971]: I0320 07:12:16.912245 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-jknv9"] Mar 20 07:12:17 crc kubenswrapper[4971]: I0320 07:12:17.147062 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66b6544788-qm44x"] Mar 20 07:12:17 crc kubenswrapper[4971]: I0320 07:12:17.850968 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" event={"ID":"b062fd41-7667-4a30-b4c9-817bfbb80f1b","Type":"ContainerStarted","Data":"07fa1bf6bec756664b74ba05483388e474164a9f3a745c2f00e551e19b8dcf3b"} Mar 20 07:12:17 crc kubenswrapper[4971]: I0320 07:12:17.853087 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b6544788-qm44x" event={"ID":"601f5587-0478-4903-9909-ebe0dee36539","Type":"ContainerStarted","Data":"3aa0526d8fa86bd67d0ddc3a6c7bbc4eae2965f6ef5dabba4e1875ccedcd80b6"} Mar 20 07:12:17 crc kubenswrapper[4971]: I0320 07:12:17.853321 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" podUID="165557aa-762a-4c22-b3df-030af613518b" containerName="dnsmasq-dns" containerID="cri-o://fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336" gracePeriod=10 Mar 20 07:12:17 crc kubenswrapper[4971]: I0320 07:12:17.879645 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.879628464 podStartE2EDuration="12.879628464s" 
podCreationTimestamp="2026-03-20 07:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:17.876564729 +0000 UTC m=+1359.856438867" watchObservedRunningTime="2026-03-20 07:12:17.879628464 +0000 UTC m=+1359.859502602" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.307033 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566512-7bqgd" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.371823 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5ddc6d7dd9-qnsrj"] Mar 20 07:12:18 crc kubenswrapper[4971]: E0320 07:12:18.372336 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e" containerName="oc" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.372359 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e" containerName="oc" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.372652 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e" containerName="oc" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.374279 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr9kb\" (UniqueName: \"kubernetes.io/projected/3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e-kube-api-access-sr9kb\") pod \"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e\" (UID: \"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e\") " Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.374777 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5ddc6d7dd9-qnsrj" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.378924 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.379147 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.394881 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e-kube-api-access-sr9kb" (OuterVolumeSpecName: "kube-api-access-sr9kb") pod "3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e" (UID: "3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e"). InnerVolumeSpecName "kube-api-access-sr9kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.423659 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ddc6d7dd9-qnsrj"] Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.447894 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.478446 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-config\") pod \"165557aa-762a-4c22-b3df-030af613518b\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.478497 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-sb\") pod \"165557aa-762a-4c22-b3df-030af613518b\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.478525 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-swift-storage-0\") pod \"165557aa-762a-4c22-b3df-030af613518b\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.478694 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-svc\") pod \"165557aa-762a-4c22-b3df-030af613518b\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.478727 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4tq4\" (UniqueName: \"kubernetes.io/projected/165557aa-762a-4c22-b3df-030af613518b-kube-api-access-w4tq4\") pod \"165557aa-762a-4c22-b3df-030af613518b\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.478809 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-nb\") pod \"165557aa-762a-4c22-b3df-030af613518b\" (UID: \"165557aa-762a-4c22-b3df-030af613518b\") " Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.479016 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-httpd-config\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.479035 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-combined-ca-bundle\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.479068 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-config\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.479136 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-internal-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj" Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.479160 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77n5s\" (UniqueName: 
\"kubernetes.io/projected/2c4b8281-2638-413d-b91a-ee6acb59693d-kube-api-access-77n5s\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.479175 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-ovndb-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.479192 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-public-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.479241 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr9kb\" (UniqueName: \"kubernetes.io/projected/3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e-kube-api-access-sr9kb\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.511739 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165557aa-762a-4c22-b3df-030af613518b-kube-api-access-w4tq4" (OuterVolumeSpecName: "kube-api-access-w4tq4") pod "165557aa-762a-4c22-b3df-030af613518b" (UID: "165557aa-762a-4c22-b3df-030af613518b"). InnerVolumeSpecName "kube-api-access-w4tq4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.569538 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-config" (OuterVolumeSpecName: "config") pod "165557aa-762a-4c22-b3df-030af613518b" (UID: "165557aa-762a-4c22-b3df-030af613518b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.569675 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "165557aa-762a-4c22-b3df-030af613518b" (UID: "165557aa-762a-4c22-b3df-030af613518b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.574927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "165557aa-762a-4c22-b3df-030af613518b" (UID: "165557aa-762a-4c22-b3df-030af613518b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581165 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "165557aa-762a-4c22-b3df-030af613518b" (UID: "165557aa-762a-4c22-b3df-030af613518b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581356 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-internal-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581420 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77n5s\" (UniqueName: \"kubernetes.io/projected/2c4b8281-2638-413d-b91a-ee6acb59693d-kube-api-access-77n5s\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581451 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-ovndb-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581484 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-public-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581558 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-httpd-config\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581585 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-combined-ca-bundle\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581664 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-config\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581779 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581799 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4tq4\" (UniqueName: \"kubernetes.io/projected/165557aa-762a-4c22-b3df-030af613518b-kube-api-access-w4tq4\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581814 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581827 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.581839 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.586790 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-internal-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.587020 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-public-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.588737 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "165557aa-762a-4c22-b3df-030af613518b" (UID: "165557aa-762a-4c22-b3df-030af613518b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.589321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-httpd-config\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.591221 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-ovndb-tls-certs\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.592228 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-config\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.592293 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-combined-ca-bundle\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.605597 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77n5s\" (UniqueName: \"kubernetes.io/projected/2c4b8281-2638-413d-b91a-ee6acb59693d-kube-api-access-77n5s\") pod \"neutron-5ddc6d7dd9-qnsrj\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") " pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.683430 4971 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165557aa-762a-4c22-b3df-030af613518b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.768401 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.869050 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566512-7bqgd" event={"ID":"3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e","Type":"ContainerDied","Data":"c3494901d612a09a842ed8c1a5df0539673ea6073ccf74a18cb2cd4f399f2397"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.869246 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3494901d612a09a842ed8c1a5df0539673ea6073ccf74a18cb2cd4f399f2397"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.869324 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566512-7bqgd"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.877138 4971 generic.go:334] "Generic (PLEG): container finished" podID="daa1861f-a138-41b1-b5c4-3d2fa15611ea" containerID="821ce76436237ce7e9434e6e2e31435cd0f21237c27264d9054c57be93e6161a" exitCode=0
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.877200 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rvpqx" event={"ID":"daa1861f-a138-41b1-b5c4-3d2fa15611ea","Type":"ContainerDied","Data":"821ce76436237ce7e9434e6e2e31435cd0f21237c27264d9054c57be93e6161a"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.879408 4971 generic.go:334] "Generic (PLEG): container finished" podID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" containerID="b3b8a1b34d0168c170da49d2d6cb8b4c128acb2d7875417dfcd0a1ccb510034b" exitCode=0
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.879581 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" event={"ID":"b062fd41-7667-4a30-b4c9-817bfbb80f1b","Type":"ContainerDied","Data":"b3b8a1b34d0168c170da49d2d6cb8b4c128acb2d7875417dfcd0a1ccb510034b"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.884298 4971 generic.go:334] "Generic (PLEG): container finished" podID="7c3565c3-ef3e-4d98-9bca-211a538032c9" containerID="f6ed6c2bd98279cdf6b3f41b7c35c157a45f901ea5da3911570bab6705696124" exitCode=0
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.884359 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7vflp" event={"ID":"7c3565c3-ef3e-4d98-9bca-211a538032c9","Type":"ContainerDied","Data":"f6ed6c2bd98279cdf6b3f41b7c35c157a45f901ea5da3911570bab6705696124"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.892035 4971 generic.go:334] "Generic (PLEG): container finished" podID="165557aa-762a-4c22-b3df-030af613518b" containerID="fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336" exitCode=0
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.892078 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" event={"ID":"165557aa-762a-4c22-b3df-030af613518b","Type":"ContainerDied","Data":"fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.892098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn" event={"ID":"165557aa-762a-4c22-b3df-030af613518b","Type":"ContainerDied","Data":"4d55d38543f2850b3914aa97376cf00cb38d0678f7867063f06ab8ba227ee9de"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.892114 4971 scope.go:117] "RemoveContainer" containerID="fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.892225 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-zhrbn"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.906162 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566506-wm6kx"]
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.912977 4971 generic.go:334] "Generic (PLEG): container finished" podID="35492989-bd44-45b4-85a7-1acc9569270e" containerID="f2c6e66d0b0a91e73a33dda62aeb723853c5bbc9009e00a71519161cd37e2c4a" exitCode=0
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.913182 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fqqjk" event={"ID":"35492989-bd44-45b4-85a7-1acc9569270e","Type":"ContainerDied","Data":"f2c6e66d0b0a91e73a33dda62aeb723853c5bbc9009e00a71519161cd37e2c4a"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.915781 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b6544788-qm44x" event={"ID":"601f5587-0478-4903-9909-ebe0dee36539","Type":"ContainerStarted","Data":"dff293a7209e9d5eb574d247516eb149910bd77be84c64f2a666082d2c149d12"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.915908 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b6544788-qm44x" event={"ID":"601f5587-0478-4903-9909-ebe0dee36539","Type":"ContainerStarted","Data":"76223294f76d4e966d36915f2cf219b65fa94718b6cbae2ce9fb396e2800dc1a"}
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.915997 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66b6544788-qm44x"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.917728 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566506-wm6kx"]
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.926660 4971 scope.go:117] "RemoveContainer" containerID="6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.957720 4971 scope.go:117] "RemoveContainer" containerID="fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336"
Mar 20 07:12:18 crc kubenswrapper[4971]: E0320 07:12:18.961164 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336\": container with ID starting with fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336 not found: ID does not exist" containerID="fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.961204 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336"} err="failed to get container status \"fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336\": rpc error: code = NotFound desc = could not find container \"fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336\": container with ID starting with fe2cd538af167ebe7756add415c43a2a91a6b98373c962ca5e2edbfc935d1336 not found: ID does not exist"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.961229 4971 scope.go:117] "RemoveContainer" containerID="6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741"
Mar 20 07:12:18 crc kubenswrapper[4971]: E0320 07:12:18.962390 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741\": container with ID starting with 6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741 not found: ID does not exist" containerID="6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741"
Mar 20 07:12:18 crc kubenswrapper[4971]: I0320 07:12:18.962424 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741"} err="failed to get container status \"6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741\": rpc error: code = NotFound desc = could not find container \"6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741\": container with ID starting with 6a0cfd35ae031b20932c4994326ceaed13d6a51186a431f86faa77a09604c741 not found: ID does not exist"
Mar 20 07:12:19 crc kubenswrapper[4971]: I0320 07:12:19.001587 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-zhrbn"]
Mar 20 07:12:19 crc kubenswrapper[4971]: I0320 07:12:19.014659 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-zhrbn"]
Mar 20 07:12:19 crc kubenswrapper[4971]: I0320 07:12:19.016318 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66b6544788-qm44x" podStartSLOduration=3.016292105 podStartE2EDuration="3.016292105s" podCreationTimestamp="2026-03-20 07:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:19.005275864 +0000 UTC m=+1360.985150012" watchObservedRunningTime="2026-03-20 07:12:19.016292105 +0000 UTC m=+1360.996166243"
Mar 20 07:12:19 crc kubenswrapper[4971]: I0320 07:12:19.404648 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ddc6d7dd9-qnsrj"]
Mar 20 07:12:19 crc kubenswrapper[4971]: I0320 07:12:19.944150 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" event={"ID":"b062fd41-7667-4a30-b4c9-817bfbb80f1b","Type":"ContainerStarted","Data":"f94f5c131900e9142db384e685b8b6e3db64b43435aff6a1f804077e0c0eeafb"}
Mar 20 07:12:19 crc kubenswrapper[4971]: I0320 07:12:19.944645 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9"
Mar 20 07:12:19 crc kubenswrapper[4971]: I0320 07:12:19.974301 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" podStartSLOduration=4.974279133 podStartE2EDuration="4.974279133s" podCreationTimestamp="2026-03-20 07:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:19.968213954 +0000 UTC m=+1361.948088102" watchObservedRunningTime="2026-03-20 07:12:19.974279133 +0000 UTC m=+1361.954153271"
Mar 20 07:12:20 crc kubenswrapper[4971]: I0320 07:12:20.162447 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:12:20 crc kubenswrapper[4971]: I0320 07:12:20.162540 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:12:20 crc kubenswrapper[4971]: I0320 07:12:20.745044 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165557aa-762a-4c22-b3df-030af613518b" path="/var/lib/kubelet/pods/165557aa-762a-4c22-b3df-030af613518b/volumes"
Mar 20 07:12:20 crc kubenswrapper[4971]: I0320 07:12:20.745590 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794b92ff-cff7-4246-8b6f-fbda2dd717c0" path="/var/lib/kubelet/pods/794b92ff-cff7-4246-8b6f-fbda2dd717c0/volumes"
Mar 20 07:12:21 crc kubenswrapper[4971]: I0320 07:12:21.982709 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rvpqx" event={"ID":"daa1861f-a138-41b1-b5c4-3d2fa15611ea","Type":"ContainerDied","Data":"d8a54eaca7e4451001c21e6f49baa7e70d0161d5fc873ff4f04ba81f03ce1335"}
Mar 20 07:12:21 crc kubenswrapper[4971]: I0320 07:12:21.985204 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8a54eaca7e4451001c21e6f49baa7e70d0161d5fc873ff4f04ba81f03ce1335"
Mar 20 07:12:21 crc kubenswrapper[4971]: I0320 07:12:21.994054 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7vflp" event={"ID":"7c3565c3-ef3e-4d98-9bca-211a538032c9","Type":"ContainerDied","Data":"5ce204006ba619c97b465076cf2c9087e3249d7b639699c993173e55137443f4"}
Mar 20 07:12:21 crc kubenswrapper[4971]: I0320 07:12:21.994169 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ce204006ba619c97b465076cf2c9087e3249d7b639699c993173e55137443f4"
Mar 20 07:12:21 crc kubenswrapper[4971]: I0320 07:12:21.994394 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7vflp"
Mar 20 07:12:21 crc kubenswrapper[4971]: I0320 07:12:21.997706 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddc6d7dd9-qnsrj" event={"ID":"2c4b8281-2638-413d-b91a-ee6acb59693d","Type":"ContainerStarted","Data":"f8d5ed6d4783df684dde2dd4dc90d0866d16d8b27b7a293c3a7ed9022ad58643"}
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.003743 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fqqjk" event={"ID":"35492989-bd44-45b4-85a7-1acc9569270e","Type":"ContainerDied","Data":"f833110a111b048fe3a3266332c22dabe98decac6f26c0baf1d7489944b7057e"}
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.003816 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f833110a111b048fe3a3266332c22dabe98decac6f26c0baf1d7489944b7057e"
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.006122 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fqqjk"
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.083346 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-config-data\") pod \"7c3565c3-ef3e-4d98-9bca-211a538032c9\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.083685 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7x97\" (UniqueName: \"kubernetes.io/projected/7c3565c3-ef3e-4d98-9bca-211a538032c9-kube-api-access-v7x97\") pod \"7c3565c3-ef3e-4d98-9bca-211a538032c9\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.083725 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-combined-ca-bundle\") pod \"35492989-bd44-45b4-85a7-1acc9569270e\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.083760 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbbtl\" (UniqueName: \"kubernetes.io/projected/35492989-bd44-45b4-85a7-1acc9569270e-kube-api-access-fbbtl\") pod \"35492989-bd44-45b4-85a7-1acc9569270e\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.088891 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35492989-bd44-45b4-85a7-1acc9569270e-kube-api-access-fbbtl" (OuterVolumeSpecName: "kube-api-access-fbbtl") pod "35492989-bd44-45b4-85a7-1acc9569270e" (UID: "35492989-bd44-45b4-85a7-1acc9569270e"). InnerVolumeSpecName "kube-api-access-fbbtl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.090132 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3565c3-ef3e-4d98-9bca-211a538032c9-kube-api-access-v7x97" (OuterVolumeSpecName: "kube-api-access-v7x97") pod "7c3565c3-ef3e-4d98-9bca-211a538032c9" (UID: "7c3565c3-ef3e-4d98-9bca-211a538032c9"). InnerVolumeSpecName "kube-api-access-v7x97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.102452 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rvpqx"
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.145807 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-config-data" (OuterVolumeSpecName: "config-data") pod "7c3565c3-ef3e-4d98-9bca-211a538032c9" (UID: "7c3565c3-ef3e-4d98-9bca-211a538032c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.175573 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35492989-bd44-45b4-85a7-1acc9569270e" (UID: "35492989-bd44-45b4-85a7-1acc9569270e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.184806 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-scripts\") pod \"7c3565c3-ef3e-4d98-9bca-211a538032c9\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.184856 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-combined-ca-bundle\") pod \"7c3565c3-ef3e-4d98-9bca-211a538032c9\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.184877 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-config-data\") pod \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.184903 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-db-sync-config-data\") pod \"35492989-bd44-45b4-85a7-1acc9569270e\" (UID: \"35492989-bd44-45b4-85a7-1acc9569270e\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.184923 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-credential-keys\") pod \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.184968 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-fernet-keys\") pod \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.185019 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c3565c3-ef3e-4d98-9bca-211a538032c9-logs\") pod \"7c3565c3-ef3e-4d98-9bca-211a538032c9\" (UID: \"7c3565c3-ef3e-4d98-9bca-211a538032c9\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.185066 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-scripts\") pod \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.185088 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjzqz\" (UniqueName: \"kubernetes.io/projected/daa1861f-a138-41b1-b5c4-3d2fa15611ea-kube-api-access-kjzqz\") pod \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.185116 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-combined-ca-bundle\") pod \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\" (UID: \"daa1861f-a138-41b1-b5c4-3d2fa15611ea\") "
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.185323 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.185340 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7x97\" (UniqueName: \"kubernetes.io/projected/7c3565c3-ef3e-4d98-9bca-211a538032c9-kube-api-access-v7x97\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.185351 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.185362 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbbtl\" (UniqueName: \"kubernetes.io/projected/35492989-bd44-45b4-85a7-1acc9569270e-kube-api-access-fbbtl\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.186675 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3565c3-ef3e-4d98-9bca-211a538032c9-logs" (OuterVolumeSpecName: "logs") pod "7c3565c3-ef3e-4d98-9bca-211a538032c9" (UID: "7c3565c3-ef3e-4d98-9bca-211a538032c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.189143 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-scripts" (OuterVolumeSpecName: "scripts") pod "7c3565c3-ef3e-4d98-9bca-211a538032c9" (UID: "7c3565c3-ef3e-4d98-9bca-211a538032c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.189838 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-scripts" (OuterVolumeSpecName: "scripts") pod "daa1861f-a138-41b1-b5c4-3d2fa15611ea" (UID: "daa1861f-a138-41b1-b5c4-3d2fa15611ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.193322 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "daa1861f-a138-41b1-b5c4-3d2fa15611ea" (UID: "daa1861f-a138-41b1-b5c4-3d2fa15611ea"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.193363 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa1861f-a138-41b1-b5c4-3d2fa15611ea-kube-api-access-kjzqz" (OuterVolumeSpecName: "kube-api-access-kjzqz") pod "daa1861f-a138-41b1-b5c4-3d2fa15611ea" (UID: "daa1861f-a138-41b1-b5c4-3d2fa15611ea"). InnerVolumeSpecName "kube-api-access-kjzqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.193437 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "35492989-bd44-45b4-85a7-1acc9569270e" (UID: "35492989-bd44-45b4-85a7-1acc9569270e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.196401 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "daa1861f-a138-41b1-b5c4-3d2fa15611ea" (UID: "daa1861f-a138-41b1-b5c4-3d2fa15611ea"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.214257 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-config-data" (OuterVolumeSpecName: "config-data") pod "daa1861f-a138-41b1-b5c4-3d2fa15611ea" (UID: "daa1861f-a138-41b1-b5c4-3d2fa15611ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.218836 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "daa1861f-a138-41b1-b5c4-3d2fa15611ea" (UID: "daa1861f-a138-41b1-b5c4-3d2fa15611ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.226935 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c3565c3-ef3e-4d98-9bca-211a538032c9" (UID: "7c3565c3-ef3e-4d98-9bca-211a538032c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286562 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286595 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjzqz\" (UniqueName: \"kubernetes.io/projected/daa1861f-a138-41b1-b5c4-3d2fa15611ea-kube-api-access-kjzqz\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286627 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286642 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286653 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3565c3-ef3e-4d98-9bca-211a538032c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286664 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286676 4971 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/35492989-bd44-45b4-85a7-1acc9569270e-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286684 4971
reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286692 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/daa1861f-a138-41b1-b5c4-3d2fa15611ea-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:22 crc kubenswrapper[4971]: I0320 07:12:22.286701 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c3565c3-ef3e-4d98-9bca-211a538032c9-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.012314 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03a48c07-f949-4376-874a-8d5758b60a9a","Type":"ContainerStarted","Data":"2d0d52a9adadceb321d13de081d4c987d1f4efb51e325355c60dddcc42cde7c0"} Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.013711 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddc6d7dd9-qnsrj" event={"ID":"2c4b8281-2638-413d-b91a-ee6acb59693d","Type":"ContainerStarted","Data":"1f62d6fff0e58ee91d48b2a1865144fa8a186a552d3f87401a1c86fbf8d631de"} Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.013741 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7vflp" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.013792 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddc6d7dd9-qnsrj" event={"ID":"2c4b8281-2638-413d-b91a-ee6acb59693d","Type":"ContainerStarted","Data":"ec43c060399fdac7b8010fc1e004e9c5f9a92fdcf5f59bf5b7751d728e2f3fc2"} Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.013793 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fqqjk" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.013906 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rvpqx" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.036427 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5ddc6d7dd9-qnsrj" podStartSLOduration=5.036362102 podStartE2EDuration="5.036362102s" podCreationTimestamp="2026-03-20 07:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:23.035527752 +0000 UTC m=+1365.015401890" watchObservedRunningTime="2026-03-20 07:12:23.036362102 +0000 UTC m=+1365.016236240" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.063905 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.063956 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.096184 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.119007 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.135733 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-758cd76f6b-7db5d"] Mar 20 07:12:23 crc kubenswrapper[4971]: E0320 07:12:23.136122 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165557aa-762a-4c22-b3df-030af613518b" containerName="dnsmasq-dns" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136137 4971 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="165557aa-762a-4c22-b3df-030af613518b" containerName="dnsmasq-dns" Mar 20 07:12:23 crc kubenswrapper[4971]: E0320 07:12:23.136151 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa1861f-a138-41b1-b5c4-3d2fa15611ea" containerName="keystone-bootstrap" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136157 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa1861f-a138-41b1-b5c4-3d2fa15611ea" containerName="keystone-bootstrap" Mar 20 07:12:23 crc kubenswrapper[4971]: E0320 07:12:23.136169 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35492989-bd44-45b4-85a7-1acc9569270e" containerName="barbican-db-sync" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136176 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="35492989-bd44-45b4-85a7-1acc9569270e" containerName="barbican-db-sync" Mar 20 07:12:23 crc kubenswrapper[4971]: E0320 07:12:23.136199 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3565c3-ef3e-4d98-9bca-211a538032c9" containerName="placement-db-sync" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136204 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3565c3-ef3e-4d98-9bca-211a538032c9" containerName="placement-db-sync" Mar 20 07:12:23 crc kubenswrapper[4971]: E0320 07:12:23.136233 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165557aa-762a-4c22-b3df-030af613518b" containerName="init" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136239 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="165557aa-762a-4c22-b3df-030af613518b" containerName="init" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136387 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="35492989-bd44-45b4-85a7-1acc9569270e" containerName="barbican-db-sync" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136396 4971 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7c3565c3-ef3e-4d98-9bca-211a538032c9" containerName="placement-db-sync" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136417 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="165557aa-762a-4c22-b3df-030af613518b" containerName="dnsmasq-dns" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.136431 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa1861f-a138-41b1-b5c4-3d2fa15611ea" containerName="keystone-bootstrap" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.137377 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.141954 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.142536 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qvhrj" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.142770 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.143327 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.143558 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.181825 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-758cd76f6b-7db5d"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.302971 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-internal-tls-certs\") pod 
\"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.303024 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-combined-ca-bundle\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.303057 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-config-data\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.303092 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-scripts\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.303110 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-public-tls-certs\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.303187 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b124085-be1b-4abd-acec-d5e30f119d75-logs\") pod 
\"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.303205 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rtjm\" (UniqueName: \"kubernetes.io/projected/1b124085-be1b-4abd-acec-d5e30f119d75-kube-api-access-4rtjm\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.353034 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5645d5b87f-2lrzm"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.355804 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.379174 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.379644 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.379771 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.379989 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.380086 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vfsj2" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.380176 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.383730 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-keystone-listener-577f9d69bb-nlxbh"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.385264 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.389571 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5645d5b87f-2lrzm"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.412291 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.412464 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.412654 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2rwpf" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.412915 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-577f9d69bb-nlxbh"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.419531 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-internal-tls-certs\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.419573 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-combined-ca-bundle\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.419623 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-config-data\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.419662 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-scripts\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.419685 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-public-tls-certs\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.419771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b124085-be1b-4abd-acec-d5e30f119d75-logs\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.419789 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rtjm\" (UniqueName: \"kubernetes.io/projected/1b124085-be1b-4abd-acec-d5e30f119d75-kube-api-access-4rtjm\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.423168 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1b124085-be1b-4abd-acec-d5e30f119d75-logs\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.431424 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-public-tls-certs\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.433666 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-combined-ca-bundle\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.456384 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-config-data\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.457199 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-internal-tls-certs\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.463991 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-scripts\") pod \"placement-758cd76f6b-7db5d\" (UID: 
\"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.475385 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rtjm\" (UniqueName: \"kubernetes.io/projected/1b124085-be1b-4abd-acec-d5e30f119d75-kube-api-access-4rtjm\") pod \"placement-758cd76f6b-7db5d\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") " pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.484089 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.498653 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-76b4bdf575-ggc5d"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.500081 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.506411 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531014 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-combined-ca-bundle\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531090 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-fernet-keys\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " 
pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531114 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng8mf\" (UniqueName: \"kubernetes.io/projected/1172670c-6168-4461-a866-3bd8b410c332-kube-api-access-ng8mf\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531133 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data-custom\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531152 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1172670c-6168-4461-a866-3bd8b410c332-logs\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531178 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-combined-ca-bundle\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531205 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-config-data\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531219 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-public-tls-certs\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531234 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-internal-tls-certs\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531252 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-scripts\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531329 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531392 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76b4bdf575-ggc5d"] Mar 20 
07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531712 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-credential-keys\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.531819 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7924j\" (UniqueName: \"kubernetes.io/projected/cf071dc8-0146-4b1d-a644-02224870fcba-kube-api-access-7924j\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639060 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-jknv9"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639371 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" podUID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" containerName="dnsmasq-dns" containerID="cri-o://f94f5c131900e9142db384e685b8b6e3db64b43435aff6a1f804077e0c0eeafb" gracePeriod=10 Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639428 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7924j\" (UniqueName: \"kubernetes.io/projected/cf071dc8-0146-4b1d-a644-02224870fcba-kube-api-access-7924j\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639482 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-combined-ca-bundle\") pod 
\"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639522 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data-custom\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639659 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-fernet-keys\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639698 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng8mf\" (UniqueName: \"kubernetes.io/projected/1172670c-6168-4461-a866-3bd8b410c332-kube-api-access-ng8mf\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639739 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data-custom\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639772 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1172670c-6168-4461-a866-3bd8b410c332-logs\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639817 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-combined-ca-bundle\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639858 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-combined-ca-bundle\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639910 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26142983-f987-457d-956c-0d63345bdf1c-logs\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639948 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-config-data\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.639977 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-public-tls-certs\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.640006 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-internal-tls-certs\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.640036 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-scripts\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.640132 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.640202 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.640246 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2gvk\" (UniqueName: 
\"kubernetes.io/projected/26142983-f987-457d-956c-0d63345bdf1c-kube-api-access-f2gvk\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.640283 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-credential-keys\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.661314 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-combined-ca-bundle\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.665943 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-config-data\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.675219 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng8mf\" (UniqueName: \"kubernetes.io/projected/1172670c-6168-4461-a866-3bd8b410c332-kube-api-access-ng8mf\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.675740 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-internal-tls-certs\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.675881 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1172670c-6168-4461-a866-3bd8b410c332-logs\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.676495 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-credential-keys\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.678152 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.684493 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-fernet-keys\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.689920 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-468lg"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.703418 4971 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.703892 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-scripts\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.703973 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-combined-ca-bundle\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.703523 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-public-tls-certs\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.704659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data-custom\") pod \"barbican-keystone-listener-577f9d69bb-nlxbh\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.731891 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7924j\" (UniqueName: \"kubernetes.io/projected/cf071dc8-0146-4b1d-a644-02224870fcba-kube-api-access-7924j\") pod \"keystone-5645d5b87f-2lrzm\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") 
" pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742164 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742234 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2gvk\" (UniqueName: \"kubernetes.io/projected/26142983-f987-457d-956c-0d63345bdf1c-kube-api-access-f2gvk\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742257 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742300 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data-custom\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742376 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-config\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: 
\"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742425 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-combined-ca-bundle\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742457 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjxgj\" (UniqueName: \"kubernetes.io/projected/c44d5869-6a57-435c-873e-a5aa02b43b3c-kube-api-access-vjxgj\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742489 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26142983-f987-457d-956c-0d63345bdf1c-logs\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742539 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742583 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.742624 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.753111 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26142983-f987-457d-956c-0d63345bdf1c-logs\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.756626 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data-custom\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.757382 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.763385 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-combined-ca-bundle\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: 
\"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.779064 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-956ff8bdd-gdvvh"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.780587 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2gvk\" (UniqueName: \"kubernetes.io/projected/26142983-f987-457d-956c-0d63345bdf1c-kube-api-access-f2gvk\") pod \"barbican-worker-76b4bdf575-ggc5d\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") " pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.781358 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.790940 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.831673 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-468lg"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.844241 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-config\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.844306 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjxgj\" (UniqueName: \"kubernetes.io/projected/c44d5869-6a57-435c-873e-a5aa02b43b3c-kube-api-access-vjxgj\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 
07:12:23.844355 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.844378 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.844413 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.844436 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.852373 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.856152 4971 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.857154 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-config\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.857684 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.861005 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.875062 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-956ff8bdd-gdvvh"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.930087 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.945807 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8hvr\" (UniqueName: \"kubernetes.io/projected/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-kube-api-access-d8hvr\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.945932 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-combined-ca-bundle\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.946006 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.946071 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data-custom\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.946121 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-logs\") pod 
\"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.947652 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjxgj\" (UniqueName: \"kubernetes.io/projected/c44d5869-6a57-435c-873e-a5aa02b43b3c-kube-api-access-vjxgj\") pod \"dnsmasq-dns-7fc46d7df7-468lg\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.959903 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-594fd7bbdb-tf55d"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.965195 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.967329 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5999d797d7-b2qlx"] Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.975747 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:23 crc kubenswrapper[4971]: I0320 07:12:23.987248 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:23.994022 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-594fd7bbdb-tf55d"] Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.011941 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5999d797d7-b2qlx"] Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.012471 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.042406 4971 generic.go:334] "Generic (PLEG): container finished" podID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" containerID="f94f5c131900e9142db384e685b8b6e3db64b43435aff6a1f804077e0c0eeafb" exitCode=0 Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.043807 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" event={"ID":"b062fd41-7667-4a30-b4c9-817bfbb80f1b","Type":"ContainerDied","Data":"f94f5c131900e9142db384e685b8b6e3db64b43435aff6a1f804077e0c0eeafb"} Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.043851 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.045455 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047065 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047105 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-combined-ca-bundle\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data-custom\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047168 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8hvr\" (UniqueName: \"kubernetes.io/projected/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-kube-api-access-d8hvr\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047185 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4290efff-c110-4085-b0a7-3e402507b5a0-logs\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047220 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data-custom\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047239 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-combined-ca-bundle\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047269 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047294 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047317 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6149d926-b8a4-4d76-998c-0f721a1a2ef1-logs\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047345 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data-custom\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047365 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrx75\" (UniqueName: \"kubernetes.io/projected/6149d926-b8a4-4d76-998c-0f721a1a2ef1-kube-api-access-wrx75\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc 
kubenswrapper[4971]: I0320 07:12:24.047386 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7kn5\" (UniqueName: \"kubernetes.io/projected/4290efff-c110-4085-b0a7-3e402507b5a0-kube-api-access-q7kn5\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047406 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-combined-ca-bundle\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047410 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5ddc6d7dd9-qnsrj" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.047427 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-logs\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.048054 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-logs\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.053442 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-combined-ca-bundle\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.054309 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.057145 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data-custom\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.074870 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8hvr\" (UniqueName: \"kubernetes.io/projected/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-kube-api-access-d8hvr\") pod \"barbican-api-956ff8bdd-gdvvh\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.093239 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cccb97bc6-t7qwm"] Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.095650 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.097360 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.108790 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.157706 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cccb97bc6-t7qwm"] Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.164524 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-combined-ca-bundle\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.164575 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data-custom\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.164714 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4290efff-c110-4085-b0a7-3e402507b5a0-logs\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.164795 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data-custom\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " 
pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.164859 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.165047 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6149d926-b8a4-4d76-998c-0f721a1a2ef1-logs\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.165114 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrx75\" (UniqueName: \"kubernetes.io/projected/6149d926-b8a4-4d76-998c-0f721a1a2ef1-kube-api-access-wrx75\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.165141 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7kn5\" (UniqueName: \"kubernetes.io/projected/4290efff-c110-4085-b0a7-3e402507b5a0-kube-api-access-q7kn5\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.165164 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-combined-ca-bundle\") pod 
\"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.165199 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.166420 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6149d926-b8a4-4d76-998c-0f721a1a2ef1-logs\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.172563 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4290efff-c110-4085-b0a7-3e402507b5a0-logs\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.173291 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data-custom\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.174924 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-combined-ca-bundle\") pod \"barbican-worker-5999d797d7-b2qlx\" 
(UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.205349 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.206227 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-combined-ca-bundle\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.206790 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data-custom\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.207238 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrx75\" (UniqueName: \"kubernetes.io/projected/6149d926-b8a4-4d76-998c-0f721a1a2ef1-kube-api-access-wrx75\") pod \"barbican-keystone-listener-594fd7bbdb-tf55d\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") " pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.215925 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data\") 
pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.218447 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7kn5\" (UniqueName: \"kubernetes.io/projected/4290efff-c110-4085-b0a7-3e402507b5a0-kube-api-access-q7kn5\") pod \"barbican-worker-5999d797d7-b2qlx\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") " pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.272571 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data-custom\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.274523 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-combined-ca-bundle\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.274663 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1edc3606-34b7-4c2a-b534-2efa0ad00e95-logs\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.274805 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzfzg\" (UniqueName: 
\"kubernetes.io/projected/1edc3606-34b7-4c2a-b534-2efa0ad00e95-kube-api-access-xzfzg\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.274892 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.295624 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-758cd76f6b-7db5d"] Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.377732 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data-custom\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.377807 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-combined-ca-bundle\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.377824 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1edc3606-34b7-4c2a-b534-2efa0ad00e95-logs\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 
07:12:24.377864 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzfzg\" (UniqueName: \"kubernetes.io/projected/1edc3606-34b7-4c2a-b534-2efa0ad00e95-kube-api-access-xzfzg\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.377893 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.382937 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1edc3606-34b7-4c2a-b534-2efa0ad00e95-logs\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.388223 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.388904 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-combined-ca-bundle\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.389130 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data-custom\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.414131 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzfzg\" (UniqueName: \"kubernetes.io/projected/1edc3606-34b7-4c2a-b534-2efa0ad00e95-kube-api-access-xzfzg\") pod \"barbican-api-7cccb97bc6-t7qwm\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") " pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.423073 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.466921 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5999d797d7-b2qlx" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.515158 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-577f9d69bb-nlxbh"] Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.559110 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:24 crc kubenswrapper[4971]: I0320 07:12:24.802072 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76b4bdf575-ggc5d"] Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.056520 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-758cd76f6b-7db5d" event={"ID":"1b124085-be1b-4abd-acec-d5e30f119d75","Type":"ContainerStarted","Data":"5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450"} Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.056837 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-758cd76f6b-7db5d" event={"ID":"1b124085-be1b-4abd-acec-d5e30f119d75","Type":"ContainerStarted","Data":"1e5d80e9d40e2e3de322ae4ec238de7fedb78a13b576f952275588885f45a373"} Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.061000 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" event={"ID":"b062fd41-7667-4a30-b4c9-817bfbb80f1b","Type":"ContainerDied","Data":"07fa1bf6bec756664b74ba05483388e474164a9f3a745c2f00e551e19b8dcf3b"} Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.061068 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07fa1bf6bec756664b74ba05483388e474164a9f3a745c2f00e551e19b8dcf3b" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.069548 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b4bdf575-ggc5d" event={"ID":"26142983-f987-457d-956c-0d63345bdf1c","Type":"ContainerStarted","Data":"681512901db4da2f2caaa1df131bdbd441e1a3c65dc4a77d1c3ff0fa29157dd2"} Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.073137 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" 
event={"ID":"1172670c-6168-4461-a866-3bd8b410c332","Type":"ContainerStarted","Data":"b6827c161536d094b1ed2ac230d2ff9b8990f787e8944f7ba8903993ca4d6bbd"} Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.112056 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.199689 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-594fd7bbdb-tf55d"] Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.205332 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-nb\") pod \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.205437 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk7ph\" (UniqueName: \"kubernetes.io/projected/b062fd41-7667-4a30-b4c9-817bfbb80f1b-kube-api-access-mk7ph\") pod \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.205473 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-swift-storage-0\") pod \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.205504 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-svc\") pod \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 
07:12:25.205547 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-sb\") pod \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.205664 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-config\") pod \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\" (UID: \"b062fd41-7667-4a30-b4c9-817bfbb80f1b\") " Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.234278 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-956ff8bdd-gdvvh"] Mar 20 07:12:25 crc kubenswrapper[4971]: W0320 07:12:25.243465 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91600ccb_b017_4f9f_b348_be6fe0fbe7d7.slice/crio-3492b17263bc17ba6863717994695b6ecf47e3cb0db2cc9c08c11bcb9d88c63c WatchSource:0}: Error finding container 3492b17263bc17ba6863717994695b6ecf47e3cb0db2cc9c08c11bcb9d88c63c: Status 404 returned error can't find the container with id 3492b17263bc17ba6863717994695b6ecf47e3cb0db2cc9c08c11bcb9d88c63c Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.244224 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b062fd41-7667-4a30-b4c9-817bfbb80f1b-kube-api-access-mk7ph" (OuterVolumeSpecName: "kube-api-access-mk7ph") pod "b062fd41-7667-4a30-b4c9-817bfbb80f1b" (UID: "b062fd41-7667-4a30-b4c9-817bfbb80f1b"). InnerVolumeSpecName "kube-api-access-mk7ph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.253267 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5645d5b87f-2lrzm"] Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.269214 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-468lg"] Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.310022 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk7ph\" (UniqueName: \"kubernetes.io/projected/b062fd41-7667-4a30-b4c9-817bfbb80f1b-kube-api-access-mk7ph\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.322042 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-config" (OuterVolumeSpecName: "config") pod "b062fd41-7667-4a30-b4c9-817bfbb80f1b" (UID: "b062fd41-7667-4a30-b4c9-817bfbb80f1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.342074 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b062fd41-7667-4a30-b4c9-817bfbb80f1b" (UID: "b062fd41-7667-4a30-b4c9-817bfbb80f1b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.370799 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b062fd41-7667-4a30-b4c9-817bfbb80f1b" (UID: "b062fd41-7667-4a30-b4c9-817bfbb80f1b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.378780 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b062fd41-7667-4a30-b4c9-817bfbb80f1b" (UID: "b062fd41-7667-4a30-b4c9-817bfbb80f1b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.384745 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b062fd41-7667-4a30-b4c9-817bfbb80f1b" (UID: "b062fd41-7667-4a30-b4c9-817bfbb80f1b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.415802 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.415834 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.415847 4971 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.415855 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 
07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.415865 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b062fd41-7667-4a30-b4c9-817bfbb80f1b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.430096 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cccb97bc6-t7qwm"] Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.437985 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5999d797d7-b2qlx"] Mar 20 07:12:25 crc kubenswrapper[4971]: W0320 07:12:25.503496 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4290efff_c110_4085_b0a7_3e402507b5a0.slice/crio-dfabb6a3263186a5587477a3179ed3db85ef57db198ebbab24ffc2b8e3d1827d WatchSource:0}: Error finding container dfabb6a3263186a5587477a3179ed3db85ef57db198ebbab24ffc2b8e3d1827d: Status 404 returned error can't find the container with id dfabb6a3263186a5587477a3179ed3db85ef57db198ebbab24ffc2b8e3d1827d Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.923719 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68ff6b454b-v87kx"] Mar 20 07:12:25 crc kubenswrapper[4971]: E0320 07:12:25.924445 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" containerName="dnsmasq-dns" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.924459 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" containerName="dnsmasq-dns" Mar 20 07:12:25 crc kubenswrapper[4971]: E0320 07:12:25.924479 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" containerName="init" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.924486 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" containerName="init" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.924700 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" containerName="dnsmasq-dns" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.925748 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.936272 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68ff6b454b-v87kx"] Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.986344 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 07:12:25 crc kubenswrapper[4971]: I0320 07:12:25.986388 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.030570 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-scripts\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.030688 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-config-data\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.030750 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btsqw\" (UniqueName: 
\"kubernetes.io/projected/e641b3d5-ee18-41b6-b38c-098564591a1b-kube-api-access-btsqw\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.030777 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-combined-ca-bundle\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.030798 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-public-tls-certs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.030825 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e641b3d5-ee18-41b6-b38c-098564591a1b-logs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.030845 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-internal-tls-certs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.043253 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.077853 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.135784 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" event={"ID":"6149d926-b8a4-4d76-998c-0f721a1a2ef1","Type":"ContainerStarted","Data":"058b77e26cfc29aa04870b5b56a750d441b3a92f8e1fc2c0662d7141df4a9ea7"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154327 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e641b3d5-ee18-41b6-b38c-098564591a1b-logs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154425 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-internal-tls-certs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154513 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-scripts\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154675 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-config-data\") pod \"placement-68ff6b454b-v87kx\" (UID: 
\"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154783 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btsqw\" (UniqueName: \"kubernetes.io/projected/e641b3d5-ee18-41b6-b38c-098564591a1b-kube-api-access-btsqw\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154810 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-combined-ca-bundle\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154832 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-public-tls-certs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154816 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-956ff8bdd-gdvvh" event={"ID":"91600ccb-b017-4f9f-b348-be6fe0fbe7d7","Type":"ContainerStarted","Data":"868a0345e6ce93eb431ecb9b8eb8b1b3ab7e0a9560ebaebe5302144c4885308f"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.154964 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-956ff8bdd-gdvvh" event={"ID":"91600ccb-b017-4f9f-b348-be6fe0fbe7d7","Type":"ContainerStarted","Data":"3492b17263bc17ba6863717994695b6ecf47e3cb0db2cc9c08c11bcb9d88c63c"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.156416 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e641b3d5-ee18-41b6-b38c-098564591a1b-logs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.162798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-internal-tls-certs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.165934 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-combined-ca-bundle\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.172864 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-public-tls-certs\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.201509 4971 generic.go:334] "Generic (PLEG): container finished" podID="c44d5869-6a57-435c-873e-a5aa02b43b3c" containerID="e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495" exitCode=0 Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.202029 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" event={"ID":"c44d5869-6a57-435c-873e-a5aa02b43b3c","Type":"ContainerDied","Data":"e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495"} Mar 20 07:12:26 crc 
kubenswrapper[4971]: I0320 07:12:26.202083 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" event={"ID":"c44d5869-6a57-435c-873e-a5aa02b43b3c","Type":"ContainerStarted","Data":"03b0a72a64e95931835a40cd0b704d11c5e4e695e6f966363e9a0246044f2f73"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.207294 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btsqw\" (UniqueName: \"kubernetes.io/projected/e641b3d5-ee18-41b6-b38c-098564591a1b-kube-api-access-btsqw\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.213426 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cccb97bc6-t7qwm" event={"ID":"1edc3606-34b7-4c2a-b534-2efa0ad00e95","Type":"ContainerStarted","Data":"c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.213468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cccb97bc6-t7qwm" event={"ID":"1edc3606-34b7-4c2a-b534-2efa0ad00e95","Type":"ContainerStarted","Data":"4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.213479 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cccb97bc6-t7qwm" event={"ID":"1edc3606-34b7-4c2a-b534-2efa0ad00e95","Type":"ContainerStarted","Data":"7082b05a7e200cb4119830375001964b165a470004543ceb8d57b43c520b14dc"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.213755 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.214214 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-scripts\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.214621 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.216397 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-758cd76f6b-7db5d" event={"ID":"1b124085-be1b-4abd-acec-d5e30f119d75","Type":"ContainerStarted","Data":"048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.217230 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.217268 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.223215 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-config-data\") pod \"placement-68ff6b454b-v87kx\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.262782 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5645d5b87f-2lrzm" event={"ID":"cf071dc8-0146-4b1d-a644-02224870fcba","Type":"ContainerStarted","Data":"41739891398c4430c4f83f4ce50b2d0a22faf0762c4dc2b95014e0a8cb9f4cb0"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.262836 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5645d5b87f-2lrzm" 
event={"ID":"cf071dc8-0146-4b1d-a644-02224870fcba","Type":"ContainerStarted","Data":"7026d070462602394c954aff94390865eadeb5ccb6fd7fb20c68ed2ee95ce3e6"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.263512 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.281004 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.284922 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5999d797d7-b2qlx" event={"ID":"4290efff-c110-4085-b0a7-3e402507b5a0","Type":"ContainerStarted","Data":"dfabb6a3263186a5587477a3179ed3db85ef57db198ebbab24ffc2b8e3d1827d"} Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.286809 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.286827 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.286879 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d67d65cb9-jknv9" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.287934 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.287962 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.313260 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cccb97bc6-t7qwm" podStartSLOduration=2.313242402 podStartE2EDuration="2.313242402s" podCreationTimestamp="2026-03-20 07:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:26.270909151 +0000 UTC m=+1368.250783289" watchObservedRunningTime="2026-03-20 07:12:26.313242402 +0000 UTC m=+1368.293116540" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.320353 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-758cd76f6b-7db5d" podStartSLOduration=3.320341616 podStartE2EDuration="3.320341616s" podCreationTimestamp="2026-03-20 07:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:26.292398819 +0000 UTC m=+1368.272272957" watchObservedRunningTime="2026-03-20 07:12:26.320341616 +0000 UTC m=+1368.300215754" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.343110 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5645d5b87f-2lrzm" podStartSLOduration=3.343094345 podStartE2EDuration="3.343094345s" podCreationTimestamp="2026-03-20 07:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
07:12:26.34247278 +0000 UTC m=+1368.322346918" watchObservedRunningTime="2026-03-20 07:12:26.343094345 +0000 UTC m=+1368.322968483" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.378140 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-jknv9"] Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.385687 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d67d65cb9-jknv9"] Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.489163 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:26 crc kubenswrapper[4971]: I0320 07:12:26.762903 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b062fd41-7667-4a30-b4c9-817bfbb80f1b" path="/var/lib/kubelet/pods/b062fd41-7667-4a30-b4c9-817bfbb80f1b/volumes" Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.057779 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68ff6b454b-v87kx"] Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.298178 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-956ff8bdd-gdvvh" event={"ID":"91600ccb-b017-4f9f-b348-be6fe0fbe7d7","Type":"ContainerStarted","Data":"29fc7fb049a093de8b3c1677edadf777b465a293f069bba4186972bb4980a219"} Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.299303 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.299327 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.304100 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" 
event={"ID":"c44d5869-6a57-435c-873e-a5aa02b43b3c","Type":"ContainerStarted","Data":"112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f"} Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.304138 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.304625 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.355227 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" podStartSLOduration=4.355212435 podStartE2EDuration="4.355212435s" podCreationTimestamp="2026-03-20 07:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:27.35175367 +0000 UTC m=+1369.331627808" watchObservedRunningTime="2026-03-20 07:12:27.355212435 +0000 UTC m=+1369.335086573" Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.355866 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-956ff8bdd-gdvvh" podStartSLOduration=4.355861111 podStartE2EDuration="4.355861111s" podCreationTimestamp="2026-03-20 07:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:27.334435664 +0000 UTC m=+1369.314309802" watchObservedRunningTime="2026-03-20 07:12:27.355861111 +0000 UTC m=+1369.335735249" Mar 20 07:12:27 crc kubenswrapper[4971]: I0320 07:12:27.377269 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.315257 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.315671 
4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.561924 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-956ff8bdd-gdvvh"] Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.617110 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c864cc4fb-bwl6r"] Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.619966 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.624735 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c864cc4fb-bwl6r"] Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.625124 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.625319 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.695475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-public-tls-certs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.695553 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-combined-ca-bundle\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.695634 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.695680 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb32512-db70-4db8-a867-0bda48f318eb-logs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.695761 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjlv8\" (UniqueName: \"kubernetes.io/projected/2bb32512-db70-4db8-a867-0bda48f318eb-kube-api-access-sjlv8\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.695848 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data-custom\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.695918 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-internal-tls-certs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc 
kubenswrapper[4971]: I0320 07:12:28.797458 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-combined-ca-bundle\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.797741 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.797854 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb32512-db70-4db8-a867-0bda48f318eb-logs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.797961 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjlv8\" (UniqueName: \"kubernetes.io/projected/2bb32512-db70-4db8-a867-0bda48f318eb-kube-api-access-sjlv8\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.798095 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data-custom\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.798222 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-internal-tls-certs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.798365 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-public-tls-certs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.801928 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb32512-db70-4db8-a867-0bda48f318eb-logs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.822789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjlv8\" (UniqueName: \"kubernetes.io/projected/2bb32512-db70-4db8-a867-0bda48f318eb-kube-api-access-sjlv8\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.848342 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-internal-tls-certs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.848560 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-public-tls-certs\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.850313 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.854241 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-combined-ca-bundle\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:28 crc kubenswrapper[4971]: I0320 07:12:28.854945 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data-custom\") pod \"barbican-api-7c864cc4fb-bwl6r\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.083904 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.330802 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b4bdf575-ggc5d" event={"ID":"26142983-f987-457d-956c-0d63345bdf1c","Type":"ContainerStarted","Data":"a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026"} Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.335339 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff6b454b-v87kx" event={"ID":"e641b3d5-ee18-41b6-b38c-098564591a1b","Type":"ContainerStarted","Data":"a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119"} Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.335369 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff6b454b-v87kx" event={"ID":"e641b3d5-ee18-41b6-b38c-098564591a1b","Type":"ContainerStarted","Data":"fdace0241f51e92ca49dfa7a8f4f6005c4bcc152805a4d6e595e7169ed3b81cf"} Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.339978 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" event={"ID":"1172670c-6168-4461-a866-3bd8b410c332","Type":"ContainerStarted","Data":"9c521266f66cc9aef83ab65f813dab96df76b6a6f11d1caf127cb3655bae4d29"} Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.342931 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5999d797d7-b2qlx" event={"ID":"4290efff-c110-4085-b0a7-3e402507b5a0","Type":"ContainerStarted","Data":"4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c"} Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.342961 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5999d797d7-b2qlx" event={"ID":"4290efff-c110-4085-b0a7-3e402507b5a0","Type":"ContainerStarted","Data":"bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f"} Mar 20 07:12:29 
crc kubenswrapper[4971]: I0320 07:12:29.355618 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" event={"ID":"6149d926-b8a4-4d76-998c-0f721a1a2ef1","Type":"ContainerStarted","Data":"65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791"} Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.355685 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" event={"ID":"6149d926-b8a4-4d76-998c-0f721a1a2ef1","Type":"ContainerStarted","Data":"6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507"} Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.369058 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.369281 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.370316 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.382840 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5999d797d7-b2qlx" podStartSLOduration=3.321424523 podStartE2EDuration="6.382818215s" podCreationTimestamp="2026-03-20 07:12:23 +0000 UTC" firstStartedPulling="2026-03-20 07:12:25.530257635 +0000 UTC m=+1367.510131773" lastFinishedPulling="2026-03-20 07:12:28.591651337 +0000 UTC m=+1370.571525465" observedRunningTime="2026-03-20 07:12:29.368978805 +0000 UTC m=+1371.348852943" watchObservedRunningTime="2026-03-20 07:12:29.382818215 +0000 UTC m=+1371.362692353" Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 07:12:29.426939 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-76b4bdf575-ggc5d"] Mar 20 07:12:29 crc kubenswrapper[4971]: I0320 
07:12:29.664772 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c864cc4fb-bwl6r"] Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.371943 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" event={"ID":"1172670c-6168-4461-a866-3bd8b410c332","Type":"ContainerStarted","Data":"9d0643b455264ddaef884bb8446d7ec483ce2d1e044df4d207d016d8d26feae2"} Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.376646 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b4bdf575-ggc5d" event={"ID":"26142983-f987-457d-956c-0d63345bdf1c","Type":"ContainerStarted","Data":"c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed"} Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.376738 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-76b4bdf575-ggc5d" podUID="26142983-f987-457d-956c-0d63345bdf1c" containerName="barbican-worker-log" containerID="cri-o://a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026" gracePeriod=30 Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.376786 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-76b4bdf575-ggc5d" podUID="26142983-f987-457d-956c-0d63345bdf1c" containerName="barbican-worker" containerID="cri-o://c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed" gracePeriod=30 Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.395112 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c864cc4fb-bwl6r" event={"ID":"2bb32512-db70-4db8-a867-0bda48f318eb","Type":"ContainerStarted","Data":"f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5"} Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.395209 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c864cc4fb-bwl6r" 
event={"ID":"2bb32512-db70-4db8-a867-0bda48f318eb","Type":"ContainerStarted","Data":"9e8d68592c9188c08cf1bce3707ea600cd0b97f9b8a2acac21623a93d67816c5"} Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.400298 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" podStartSLOduration=3.409205942 podStartE2EDuration="7.400259025s" podCreationTimestamp="2026-03-20 07:12:23 +0000 UTC" firstStartedPulling="2026-03-20 07:12:24.60085732 +0000 UTC m=+1366.580731458" lastFinishedPulling="2026-03-20 07:12:28.591910403 +0000 UTC m=+1370.571784541" observedRunningTime="2026-03-20 07:12:30.387722977 +0000 UTC m=+1372.367597135" watchObservedRunningTime="2026-03-20 07:12:30.400259025 +0000 UTC m=+1372.380133183" Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.406047 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-956ff8bdd-gdvvh" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api-log" containerID="cri-o://868a0345e6ce93eb431ecb9b8eb8b1b3ab7e0a9560ebaebe5302144c4885308f" gracePeriod=30 Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.407431 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff6b454b-v87kx" event={"ID":"e641b3d5-ee18-41b6-b38c-098564591a1b","Type":"ContainerStarted","Data":"a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb"} Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.407500 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.407532 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.407633 4971 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-956ff8bdd-gdvvh" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api" containerID="cri-o://29fc7fb049a093de8b3c1677edadf777b465a293f069bba4186972bb4980a219" gracePeriod=30 Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.422393 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-956ff8bdd-gdvvh" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF" Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.422682 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-956ff8bdd-gdvvh" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF" Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.424858 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-76b4bdf575-ggc5d" podStartSLOduration=3.649442237 podStartE2EDuration="7.424845419s" podCreationTimestamp="2026-03-20 07:12:23 +0000 UTC" firstStartedPulling="2026-03-20 07:12:24.819190667 +0000 UTC m=+1366.799064805" lastFinishedPulling="2026-03-20 07:12:28.594593849 +0000 UTC m=+1370.574467987" observedRunningTime="2026-03-20 07:12:30.410154298 +0000 UTC m=+1372.390028446" watchObservedRunningTime="2026-03-20 07:12:30.424845419 +0000 UTC m=+1372.404719557" Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.443355 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-68ff6b454b-v87kx" podStartSLOduration=5.443326624 podStartE2EDuration="5.443326624s" podCreationTimestamp="2026-03-20 07:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:30.433227645 +0000 UTC m=+1372.413101783" 
watchObservedRunningTime="2026-03-20 07:12:30.443326624 +0000 UTC m=+1372.423200762" Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.490017 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" podStartSLOduration=4.186383814 podStartE2EDuration="7.489984781s" podCreationTimestamp="2026-03-20 07:12:23 +0000 UTC" firstStartedPulling="2026-03-20 07:12:25.243555387 +0000 UTC m=+1367.223429525" lastFinishedPulling="2026-03-20 07:12:28.547156354 +0000 UTC m=+1370.527030492" observedRunningTime="2026-03-20 07:12:30.46962119 +0000 UTC m=+1372.449495328" watchObservedRunningTime="2026-03-20 07:12:30.489984781 +0000 UTC m=+1372.469858919" Mar 20 07:12:30 crc kubenswrapper[4971]: I0320 07:12:30.529446 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-577f9d69bb-nlxbh"] Mar 20 07:12:31 crc kubenswrapper[4971]: I0320 07:12:31.433718 4971 generic.go:334] "Generic (PLEG): container finished" podID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerID="868a0345e6ce93eb431ecb9b8eb8b1b3ab7e0a9560ebaebe5302144c4885308f" exitCode=143 Mar 20 07:12:31 crc kubenswrapper[4971]: I0320 07:12:31.434027 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-956ff8bdd-gdvvh" event={"ID":"91600ccb-b017-4f9f-b348-be6fe0fbe7d7","Type":"ContainerDied","Data":"868a0345e6ce93eb431ecb9b8eb8b1b3ab7e0a9560ebaebe5302144c4885308f"} Mar 20 07:12:31 crc kubenswrapper[4971]: I0320 07:12:31.448292 4971 generic.go:334] "Generic (PLEG): container finished" podID="26142983-f987-457d-956c-0d63345bdf1c" containerID="a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026" exitCode=143 Mar 20 07:12:31 crc kubenswrapper[4971]: I0320 07:12:31.448400 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b4bdf575-ggc5d" 
event={"ID":"26142983-f987-457d-956c-0d63345bdf1c","Type":"ContainerDied","Data":"a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026"} Mar 20 07:12:31 crc kubenswrapper[4971]: I0320 07:12:31.464410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p9zph" event={"ID":"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda","Type":"ContainerStarted","Data":"e16f3a7524c0effa050fe10707eb8e5fb4083331573d897b85372c20bcb66dcc"} Mar 20 07:12:31 crc kubenswrapper[4971]: I0320 07:12:31.469532 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c864cc4fb-bwl6r" event={"ID":"2bb32512-db70-4db8-a867-0bda48f318eb","Type":"ContainerStarted","Data":"f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12"} Mar 20 07:12:31 crc kubenswrapper[4971]: I0320 07:12:31.481677 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-p9zph" podStartSLOduration=4.017874037 podStartE2EDuration="41.481651847s" podCreationTimestamp="2026-03-20 07:11:50 +0000 UTC" firstStartedPulling="2026-03-20 07:11:52.051429889 +0000 UTC m=+1334.031304017" lastFinishedPulling="2026-03-20 07:12:29.515207689 +0000 UTC m=+1371.495081827" observedRunningTime="2026-03-20 07:12:31.480248523 +0000 UTC m=+1373.460122661" watchObservedRunningTime="2026-03-20 07:12:31.481651847 +0000 UTC m=+1373.461525985" Mar 20 07:12:31 crc kubenswrapper[4971]: I0320 07:12:31.506395 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c864cc4fb-bwl6r" podStartSLOduration=3.506369555 podStartE2EDuration="3.506369555s" podCreationTimestamp="2026-03-20 07:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:31.50212764 +0000 UTC m=+1373.482001798" watchObservedRunningTime="2026-03-20 07:12:31.506369555 +0000 UTC m=+1373.486243683" Mar 20 07:12:32 crc 
kubenswrapper[4971]: I0320 07:12:32.511843 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:32 crc kubenswrapper[4971]: I0320 07:12:32.512307 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:32 crc kubenswrapper[4971]: I0320 07:12:32.512114 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" podUID="1172670c-6168-4461-a866-3bd8b410c332" containerName="barbican-keystone-listener-log" containerID="cri-o://9c521266f66cc9aef83ab65f813dab96df76b6a6f11d1caf127cb3655bae4d29" gracePeriod=30 Mar 20 07:12:32 crc kubenswrapper[4971]: I0320 07:12:32.512500 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" podUID="1172670c-6168-4461-a866-3bd8b410c332" containerName="barbican-keystone-listener" containerID="cri-o://9d0643b455264ddaef884bb8446d7ec483ce2d1e044df4d207d016d8d26feae2" gracePeriod=30 Mar 20 07:12:33 crc kubenswrapper[4971]: I0320 07:12:33.525110 4971 generic.go:334] "Generic (PLEG): container finished" podID="1172670c-6168-4461-a866-3bd8b410c332" containerID="9d0643b455264ddaef884bb8446d7ec483ce2d1e044df4d207d016d8d26feae2" exitCode=0 Mar 20 07:12:33 crc kubenswrapper[4971]: I0320 07:12:33.525703 4971 generic.go:334] "Generic (PLEG): container finished" podID="1172670c-6168-4461-a866-3bd8b410c332" containerID="9c521266f66cc9aef83ab65f813dab96df76b6a6f11d1caf127cb3655bae4d29" exitCode=143 Mar 20 07:12:33 crc kubenswrapper[4971]: I0320 07:12:33.525199 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" event={"ID":"1172670c-6168-4461-a866-3bd8b410c332","Type":"ContainerDied","Data":"9d0643b455264ddaef884bb8446d7ec483ce2d1e044df4d207d016d8d26feae2"} Mar 20 07:12:33 crc kubenswrapper[4971]: 
I0320 07:12:33.525842 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" event={"ID":"1172670c-6168-4461-a866-3bd8b410c332","Type":"ContainerDied","Data":"9c521266f66cc9aef83ab65f813dab96df76b6a6f11d1caf127cb3655bae4d29"} Mar 20 07:12:34 crc kubenswrapper[4971]: I0320 07:12:34.099907 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:34 crc kubenswrapper[4971]: I0320 07:12:34.170096 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-bwqjt"] Mar 20 07:12:34 crc kubenswrapper[4971]: I0320 07:12:34.170442 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" podUID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerName="dnsmasq-dns" containerID="cri-o://84e37de1dca3b13d7df575e0ba046c0f63b4cad119703ca80a882711711fe434" gracePeriod=10 Mar 20 07:12:34 crc kubenswrapper[4971]: I0320 07:12:34.552436 4971 generic.go:334] "Generic (PLEG): container finished" podID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerID="84e37de1dca3b13d7df575e0ba046c0f63b4cad119703ca80a882711711fe434" exitCode=0 Mar 20 07:12:34 crc kubenswrapper[4971]: I0320 07:12:34.552645 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" event={"ID":"1f687e76-a919-4ed1-921b-ab9a7085eb30","Type":"ContainerDied","Data":"84e37de1dca3b13d7df575e0ba046c0f63b4cad119703ca80a882711711fe434"} Mar 20 07:12:35 crc kubenswrapper[4971]: I0320 07:12:35.291315 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" podUID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Mar 20 07:12:35 crc kubenswrapper[4971]: I0320 07:12:35.710426 4971 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:35 crc kubenswrapper[4971]: I0320 07:12:35.812619 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-956ff8bdd-gdvvh" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:43602->10.217.0.164:9311: read: connection reset by peer" Mar 20 07:12:35 crc kubenswrapper[4971]: I0320 07:12:35.812671 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-956ff8bdd-gdvvh" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:43618->10.217.0.164:9311: read: connection reset by peer" Mar 20 07:12:36 crc kubenswrapper[4971]: I0320 07:12:36.041400 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:36 crc kubenswrapper[4971]: I0320 07:12:36.457987 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cccb97bc6-t7qwm" Mar 20 07:12:36 crc kubenswrapper[4971]: I0320 07:12:36.583941 4971 generic.go:334] "Generic (PLEG): container finished" podID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerID="29fc7fb049a093de8b3c1677edadf777b465a293f069bba4186972bb4980a219" exitCode=0 Mar 20 07:12:36 crc kubenswrapper[4971]: I0320 07:12:36.584012 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-956ff8bdd-gdvvh" event={"ID":"91600ccb-b017-4f9f-b348-be6fe0fbe7d7","Type":"ContainerDied","Data":"29fc7fb049a093de8b3c1677edadf777b465a293f069bba4186972bb4980a219"} Mar 20 07:12:36 crc kubenswrapper[4971]: I0320 07:12:36.591749 4971 generic.go:334] "Generic (PLEG): container finished" podID="ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" 
containerID="e16f3a7524c0effa050fe10707eb8e5fb4083331573d897b85372c20bcb66dcc" exitCode=0 Mar 20 07:12:36 crc kubenswrapper[4971]: I0320 07:12:36.592063 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p9zph" event={"ID":"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda","Type":"ContainerDied","Data":"e16f3a7524c0effa050fe10707eb8e5fb4083331573d897b85372c20bcb66dcc"} Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.168510 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.270844 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7cccb97bc6-t7qwm"] Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.448479 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.473751 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.497986 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.550686 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-nb\") pod \"1f687e76-a919-4ed1-921b-ab9a7085eb30\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.550751 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdcmf\" (UniqueName: \"kubernetes.io/projected/1f687e76-a919-4ed1-921b-ab9a7085eb30-kube-api-access-gdcmf\") pod \"1f687e76-a919-4ed1-921b-ab9a7085eb30\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.550791 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data-custom\") pod \"1172670c-6168-4461-a866-3bd8b410c332\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.550864 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-combined-ca-bundle\") pod \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.550892 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-logs\") pod \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.550915 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data\") pod \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551014 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8hvr\" (UniqueName: \"kubernetes.io/projected/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-kube-api-access-d8hvr\") pod \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551151 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-sb\") pod \"1f687e76-a919-4ed1-921b-ab9a7085eb30\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551196 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng8mf\" (UniqueName: \"kubernetes.io/projected/1172670c-6168-4461-a866-3bd8b410c332-kube-api-access-ng8mf\") pod \"1172670c-6168-4461-a866-3bd8b410c332\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551222 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data-custom\") pod \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\" (UID: \"91600ccb-b017-4f9f-b348-be6fe0fbe7d7\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551252 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-combined-ca-bundle\") pod \"1172670c-6168-4461-a866-3bd8b410c332\" (UID: 
\"1172670c-6168-4461-a866-3bd8b410c332\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551334 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-config\") pod \"1f687e76-a919-4ed1-921b-ab9a7085eb30\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551372 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data\") pod \"1172670c-6168-4461-a866-3bd8b410c332\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551399 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1172670c-6168-4461-a866-3bd8b410c332-logs\") pod \"1172670c-6168-4461-a866-3bd8b410c332\" (UID: \"1172670c-6168-4461-a866-3bd8b410c332\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551403 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-logs" (OuterVolumeSpecName: "logs") pod "91600ccb-b017-4f9f-b348-be6fe0fbe7d7" (UID: "91600ccb-b017-4f9f-b348-be6fe0fbe7d7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551433 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-dns-svc\") pod \"1f687e76-a919-4ed1-921b-ab9a7085eb30\" (UID: \"1f687e76-a919-4ed1-921b-ab9a7085eb30\") " Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.551917 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.566519 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1172670c-6168-4461-a866-3bd8b410c332" (UID: "1172670c-6168-4461-a866-3bd8b410c332"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.568217 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91600ccb-b017-4f9f-b348-be6fe0fbe7d7" (UID: "91600ccb-b017-4f9f-b348-be6fe0fbe7d7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.568582 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1172670c-6168-4461-a866-3bd8b410c332-logs" (OuterVolumeSpecName: "logs") pod "1172670c-6168-4461-a866-3bd8b410c332" (UID: "1172670c-6168-4461-a866-3bd8b410c332"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.569857 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-kube-api-access-d8hvr" (OuterVolumeSpecName: "kube-api-access-d8hvr") pod "91600ccb-b017-4f9f-b348-be6fe0fbe7d7" (UID: "91600ccb-b017-4f9f-b348-be6fe0fbe7d7"). InnerVolumeSpecName "kube-api-access-d8hvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.574101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f687e76-a919-4ed1-921b-ab9a7085eb30-kube-api-access-gdcmf" (OuterVolumeSpecName: "kube-api-access-gdcmf") pod "1f687e76-a919-4ed1-921b-ab9a7085eb30" (UID: "1f687e76-a919-4ed1-921b-ab9a7085eb30"). InnerVolumeSpecName "kube-api-access-gdcmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.581731 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1172670c-6168-4461-a866-3bd8b410c332-kube-api-access-ng8mf" (OuterVolumeSpecName: "kube-api-access-ng8mf") pod "1172670c-6168-4461-a866-3bd8b410c332" (UID: "1172670c-6168-4461-a866-3bd8b410c332"). InnerVolumeSpecName "kube-api-access-ng8mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: E0320 07:12:37.593999 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.609765 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91600ccb-b017-4f9f-b348-be6fe0fbe7d7" (UID: "91600ccb-b017-4f9f-b348-be6fe0fbe7d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.615561 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1172670c-6168-4461-a866-3bd8b410c332" (UID: "1172670c-6168-4461-a866-3bd8b410c332"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.620811 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" event={"ID":"1172670c-6168-4461-a866-3bd8b410c332","Type":"ContainerDied","Data":"b6827c161536d094b1ed2ac230d2ff9b8990f787e8944f7ba8903993ca4d6bbd"} Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.620871 4971 scope.go:117] "RemoveContainer" containerID="9d0643b455264ddaef884bb8446d7ec483ce2d1e044df4d207d016d8d26feae2" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.621022 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-577f9d69bb-nlxbh" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.639543 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f687e76-a919-4ed1-921b-ab9a7085eb30" (UID: "1f687e76-a919-4ed1-921b-ab9a7085eb30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.641502 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f687e76-a919-4ed1-921b-ab9a7085eb30" (UID: "1f687e76-a919-4ed1-921b-ab9a7085eb30"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.657179 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-config" (OuterVolumeSpecName: "config") pod "1f687e76-a919-4ed1-921b-ab9a7085eb30" (UID: "1f687e76-a919-4ed1-921b-ab9a7085eb30"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.657472 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03a48c07-f949-4376-874a-8d5758b60a9a","Type":"ContainerStarted","Data":"da457a1601a60d9cbdbfe4e2ac48670fa1b80ef2a54f12a8442eac7a97202c46"} Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.657739 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="ceilometer-notification-agent" containerID="cri-o://aece527112bdee1e807769fbd04b183b04d566ead4cf6b30a2d975bcd179d608" gracePeriod=30 Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.657844 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658201 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658263 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="proxy-httpd" containerID="cri-o://da457a1601a60d9cbdbfe4e2ac48670fa1b80ef2a54f12a8442eac7a97202c46" gracePeriod=30 Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658278 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng8mf\" (UniqueName: \"kubernetes.io/projected/1172670c-6168-4461-a866-3bd8b410c332-kube-api-access-ng8mf\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658442 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658497 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658572 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658657 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1172670c-6168-4461-a866-3bd8b410c332-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658723 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658797 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdcmf\" (UniqueName: \"kubernetes.io/projected/1f687e76-a919-4ed1-921b-ab9a7085eb30-kube-api-access-gdcmf\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658861 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658917 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc 
kubenswrapper[4971]: I0320 07:12:37.658973 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8hvr\" (UniqueName: \"kubernetes.io/projected/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-kube-api-access-d8hvr\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.658246 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="sg-core" containerID="cri-o://2d0d52a9adadceb321d13de081d4c987d1f4efb51e325355c60dddcc42cde7c0" gracePeriod=30 Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.667035 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f687e76-a919-4ed1-921b-ab9a7085eb30" (UID: "1f687e76-a919-4ed1-921b-ab9a7085eb30"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.678031 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-956ff8bdd-gdvvh" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.678323 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-956ff8bdd-gdvvh" event={"ID":"91600ccb-b017-4f9f-b348-be6fe0fbe7d7","Type":"ContainerDied","Data":"3492b17263bc17ba6863717994695b6ecf47e3cb0db2cc9c08c11bcb9d88c63c"} Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.689946 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data" (OuterVolumeSpecName: "config-data") pod "91600ccb-b017-4f9f-b348-be6fe0fbe7d7" (UID: "91600ccb-b017-4f9f-b348-be6fe0fbe7d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.694733 4971 scope.go:117] "RemoveContainer" containerID="9c521266f66cc9aef83ab65f813dab96df76b6a6f11d1caf127cb3655bae4d29" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.696185 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.698675 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-bwqjt" event={"ID":"1f687e76-a919-4ed1-921b-ab9a7085eb30","Type":"ContainerDied","Data":"18869052bd6862d80c0a1aa87ea65abe8532eb2b9f15ecd5793500413305c2f3"} Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.698898 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7cccb97bc6-t7qwm" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api-log" containerID="cri-o://4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007" gracePeriod=30 Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.699064 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7cccb97bc6-t7qwm" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api" containerID="cri-o://c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de" gracePeriod=30 Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.708659 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cccb97bc6-t7qwm" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": EOF" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.712787 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data" 
(OuterVolumeSpecName: "config-data") pod "1172670c-6168-4461-a866-3bd8b410c332" (UID: "1172670c-6168-4461-a866-3bd8b410c332"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.752527 4971 scope.go:117] "RemoveContainer" containerID="29fc7fb049a093de8b3c1677edadf777b465a293f069bba4186972bb4980a219" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.757189 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-bwqjt"] Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.761869 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1172670c-6168-4461-a866-3bd8b410c332-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.761910 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f687e76-a919-4ed1-921b-ab9a7085eb30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.761923 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91600ccb-b017-4f9f-b348-be6fe0fbe7d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.769364 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-bwqjt"] Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.817986 4971 scope.go:117] "RemoveContainer" containerID="868a0345e6ce93eb431ecb9b8eb8b1b3ab7e0a9560ebaebe5302144c4885308f" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.847675 4971 scope.go:117] "RemoveContainer" containerID="84e37de1dca3b13d7df575e0ba046c0f63b4cad119703ca80a882711711fe434" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.882782 4971 scope.go:117] "RemoveContainer" 
containerID="5fbf1df8dda99bb2bbd2557d66fd0f7cd73de19f39a470db149709ec3733445a" Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.963590 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-577f9d69bb-nlxbh"] Mar 20 07:12:37 crc kubenswrapper[4971]: I0320 07:12:37.972231 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-577f9d69bb-nlxbh"] Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.018777 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-956ff8bdd-gdvvh"] Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.027371 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-956ff8bdd-gdvvh"] Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.061118 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-p9zph" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.169641 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lcr6\" (UniqueName: \"kubernetes.io/projected/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-kube-api-access-2lcr6\") pod \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.169733 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-etc-machine-id\") pod \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.169836 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-config-data\") pod \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\" (UID: 
\"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.170041 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-combined-ca-bundle\") pod \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.170077 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-scripts\") pod \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.170189 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-db-sync-config-data\") pod \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\" (UID: \"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda\") " Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.172726 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" (UID: "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.174815 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-kube-api-access-2lcr6" (OuterVolumeSpecName: "kube-api-access-2lcr6") pod "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" (UID: "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda"). InnerVolumeSpecName "kube-api-access-2lcr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.175162 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" (UID: "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.179768 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-scripts" (OuterVolumeSpecName: "scripts") pod "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" (UID: "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.194763 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" (UID: "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.227844 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-config-data" (OuterVolumeSpecName: "config-data") pod "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" (UID: "ebe4aafd-3ffe-406f-a0dc-faebf5eeddda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.271865 4971 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.271905 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lcr6\" (UniqueName: \"kubernetes.io/projected/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-kube-api-access-2lcr6\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.271921 4971 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.271934 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.271944 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.271953 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.707421 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p9zph" event={"ID":"ebe4aafd-3ffe-406f-a0dc-faebf5eeddda","Type":"ContainerDied","Data":"6f4d06b82e67f1d6a8c62d57a281f87352de06bf53fb983b39a3ad95a2ce7099"} Mar 20 07:12:38 crc 
kubenswrapper[4971]: I0320 07:12:38.707827 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f4d06b82e67f1d6a8c62d57a281f87352de06bf53fb983b39a3ad95a2ce7099" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.707795 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-p9zph" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.711119 4971 generic.go:334] "Generic (PLEG): container finished" podID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerID="4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007" exitCode=143 Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.711202 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cccb97bc6-t7qwm" event={"ID":"1edc3606-34b7-4c2a-b534-2efa0ad00e95","Type":"ContainerDied","Data":"4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007"} Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.714868 4971 generic.go:334] "Generic (PLEG): container finished" podID="03a48c07-f949-4376-874a-8d5758b60a9a" containerID="2d0d52a9adadceb321d13de081d4c987d1f4efb51e325355c60dddcc42cde7c0" exitCode=2 Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.714922 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03a48c07-f949-4376-874a-8d5758b60a9a","Type":"ContainerDied","Data":"2d0d52a9adadceb321d13de081d4c987d1f4efb51e325355c60dddcc42cde7c0"} Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.750290 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1172670c-6168-4461-a866-3bd8b410c332" path="/var/lib/kubelet/pods/1172670c-6168-4461-a866-3bd8b410c332/volumes" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.751087 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f687e76-a919-4ed1-921b-ab9a7085eb30" 
path="/var/lib/kubelet/pods/1f687e76-a919-4ed1-921b-ab9a7085eb30/volumes" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.751703 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" path="/var/lib/kubelet/pods/91600ccb-b017-4f9f-b348-be6fe0fbe7d7/volumes" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.911941 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:12:38 crc kubenswrapper[4971]: E0320 07:12:38.912441 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" containerName="cinder-db-sync" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.912478 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" containerName="cinder-db-sync" Mar 20 07:12:38 crc kubenswrapper[4971]: E0320 07:12:38.912513 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1172670c-6168-4461-a866-3bd8b410c332" containerName="barbican-keystone-listener" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.912530 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1172670c-6168-4461-a866-3bd8b410c332" containerName="barbican-keystone-listener" Mar 20 07:12:38 crc kubenswrapper[4971]: E0320 07:12:38.912552 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1172670c-6168-4461-a866-3bd8b410c332" containerName="barbican-keystone-listener-log" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.912561 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1172670c-6168-4461-a866-3bd8b410c332" containerName="barbican-keystone-listener-log" Mar 20 07:12:38 crc kubenswrapper[4971]: E0320 07:12:38.912573 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api-log" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.912580 4971 
state_mem.go:107] "Deleted CPUSet assignment" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api-log" Mar 20 07:12:38 crc kubenswrapper[4971]: E0320 07:12:38.912639 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerName="dnsmasq-dns" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.912658 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerName="dnsmasq-dns" Mar 20 07:12:38 crc kubenswrapper[4971]: E0320 07:12:38.912671 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.912682 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api" Mar 20 07:12:38 crc kubenswrapper[4971]: E0320 07:12:38.912691 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerName="init" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.912699 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerName="init" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.913093 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" containerName="cinder-db-sync" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.913120 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f687e76-a919-4ed1-921b-ab9a7085eb30" containerName="dnsmasq-dns" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.913134 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1172670c-6168-4461-a866-3bd8b410c332" containerName="barbican-keystone-listener" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.913149 4971 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api-log" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.913161 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1172670c-6168-4461-a866-3bd8b410c332" containerName="barbican-keystone-listener-log" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.913180 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="91600ccb-b017-4f9f-b348-be6fe0fbe7d7" containerName="barbican-api" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.914402 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.918884 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.919113 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-k9d2v" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.919288 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.924089 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.950562 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.978791 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-jklwc"] Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.980482 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.989392 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.989487 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-scripts\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.989551 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.989620 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.989715 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9q42\" (UniqueName: \"kubernetes.io/projected/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-kube-api-access-c9q42\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " 
pod="openstack/cinder-scheduler-0" Mar 20 07:12:38 crc kubenswrapper[4971]: I0320 07:12:38.989754 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.029502 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-jklwc"] Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.091386 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.091684 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.091850 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.091983 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z96xm\" (UniqueName: 
\"kubernetes.io/projected/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-kube-api-access-z96xm\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.092103 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9q42\" (UniqueName: \"kubernetes.io/projected/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-kube-api-access-c9q42\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.092202 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.092285 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.092375 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.092494 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-config\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.092585 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.092729 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.092817 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-scripts\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.094682 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.101477 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.102077 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.110451 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-scripts\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.110660 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.121079 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9q42\" (UniqueName: \"kubernetes.io/projected/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-kube-api-access-c9q42\") pod \"cinder-scheduler-0\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " pod="openstack/cinder-scheduler-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.163294 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.164844 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.166802 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.177808 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.194357 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.194426 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z96xm\" (UniqueName: \"kubernetes.io/projected/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-kube-api-access-z96xm\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.194481 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.194507 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.194536 
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-config\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.194567 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.195266 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.195398 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.196053 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.196215 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-config\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.196316 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.212047 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z96xm\" (UniqueName: \"kubernetes.io/projected/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-kube-api-access-z96xm\") pod \"dnsmasq-dns-58b85ccffc-jklwc\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.237343 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.295734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data-custom\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.297799 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-scripts\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.297907 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-logs\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.297935 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.297986 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.298014 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlm79\" (UniqueName: \"kubernetes.io/projected/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-kube-api-access-vlm79\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.298029 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.303573 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.400416 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.400458 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-logs\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.400488 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.400516 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlm79\" (UniqueName: \"kubernetes.io/projected/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-kube-api-access-vlm79\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.400537 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.400629 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data-custom\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.400676 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-scripts\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.408406 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.408717 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.409390 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-scripts\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.409659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-logs\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.426553 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.434110 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data-custom\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.438284 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlm79\" (UniqueName: \"kubernetes.io/projected/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-kube-api-access-vlm79\") pod \"cinder-api-0\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") " pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.705894 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.757813 4971 generic.go:334] "Generic (PLEG): container finished" podID="03a48c07-f949-4376-874a-8d5758b60a9a" containerID="aece527112bdee1e807769fbd04b183b04d566ead4cf6b30a2d975bcd179d608" exitCode=0
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.757863 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03a48c07-f949-4376-874a-8d5758b60a9a","Type":"ContainerDied","Data":"aece527112bdee1e807769fbd04b183b04d566ead4cf6b30a2d975bcd179d608"}
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.795188 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 07:12:39 crc kubenswrapper[4971]: I0320 07:12:39.879543 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-jklwc"]
Mar 20 07:12:39 crc kubenswrapper[4971]: W0320 07:12:39.896857 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbc35f7f_6e4d_492d_b82a_81c5ba5e3712.slice/crio-f197ab2a9394a98c0bb54f40df38a86fdabbe8ecbc5f680ae8db657141d358a0 WatchSource:0}: Error finding container f197ab2a9394a98c0bb54f40df38a86fdabbe8ecbc5f680ae8db657141d358a0: Status 404 returned error can't find the container with id f197ab2a9394a98c0bb54f40df38a86fdabbe8ecbc5f680ae8db657141d358a0
Mar 20 07:12:40 crc kubenswrapper[4971]: I0320 07:12:40.350699 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 07:12:40 crc kubenswrapper[4971]: I0320 07:12:40.772671 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85b1a9b-55d3-4e2f-bc37-726688f9ab49","Type":"ContainerStarted","Data":"2ceaad863f6de32f201a1d1200a4e110219c4f32bcef41bef656f04d1308ed49"}
Mar 20 07:12:40 crc kubenswrapper[4971]: I0320 07:12:40.786657 4971 generic.go:334] "Generic (PLEG): container finished" podID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" containerID="ea02c0a0041a03dd80a0b5d7c4b95d6e792ce5703b4753dab7ae3049e7733f06" exitCode=0
Mar 20 07:12:40 crc kubenswrapper[4971]: I0320 07:12:40.786734 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" event={"ID":"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712","Type":"ContainerDied","Data":"ea02c0a0041a03dd80a0b5d7c4b95d6e792ce5703b4753dab7ae3049e7733f06"}
Mar 20 07:12:40 crc kubenswrapper[4971]: I0320 07:12:40.786762 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" event={"ID":"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712","Type":"ContainerStarted","Data":"f197ab2a9394a98c0bb54f40df38a86fdabbe8ecbc5f680ae8db657141d358a0"}
Mar 20 07:12:40 crc kubenswrapper[4971]: I0320 07:12:40.805164 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f","Type":"ContainerStarted","Data":"207ecc47c69f52915aeea85c93a683ad1a71db9d1cb05081a590ffb80755823f"}
Mar 20 07:12:41 crc kubenswrapper[4971]: I0320 07:12:41.818476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85b1a9b-55d3-4e2f-bc37-726688f9ab49","Type":"ContainerStarted","Data":"cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25"}
Mar 20 07:12:41 crc kubenswrapper[4971]: I0320 07:12:41.823366 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" event={"ID":"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712","Type":"ContainerStarted","Data":"f2e9b46ae28fea0f5df486fd7db6b7a3f09d30d7d0d8a788febe86c6c026da09"}
Mar 20 07:12:41 crc kubenswrapper[4971]: I0320 07:12:41.824635 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc"
Mar 20 07:12:41 crc kubenswrapper[4971]: I0320 07:12:41.832036 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f","Type":"ContainerStarted","Data":"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8"}
Mar 20 07:12:41 crc kubenswrapper[4971]: I0320 07:12:41.859731 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" podStartSLOduration=3.859707251 podStartE2EDuration="3.859707251s" podCreationTimestamp="2026-03-20 07:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:41.85278906 +0000 UTC m=+1383.832663198" watchObservedRunningTime="2026-03-20 07:12:41.859707251 +0000 UTC m=+1383.839581389"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.154374 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cccb97bc6-t7qwm" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:40008->10.217.0.167:9311: read: connection reset by peer"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.154697 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cccb97bc6-t7qwm" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:40000->10.217.0.167:9311: read: connection reset by peer"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.481920 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.805320 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cccb97bc6-t7qwm"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.882184 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85b1a9b-55d3-4e2f-bc37-726688f9ab49","Type":"ContainerStarted","Data":"1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd"}
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.891825 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f","Type":"ContainerStarted","Data":"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac"}
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.891991 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.918165 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.651335468 podStartE2EDuration="4.918142437s" podCreationTimestamp="2026-03-20 07:12:38 +0000 UTC" firstStartedPulling="2026-03-20 07:12:39.829423254 +0000 UTC m=+1381.809297392" lastFinishedPulling="2026-03-20 07:12:41.096230233 +0000 UTC m=+1383.076104361" observedRunningTime="2026-03-20 07:12:42.911254628 +0000 UTC m=+1384.891128766" watchObservedRunningTime="2026-03-20 07:12:42.918142437 +0000 UTC m=+1384.898016575"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.924254 4971 generic.go:334] "Generic (PLEG): container finished" podID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerID="c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de" exitCode=0
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.924347 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cccb97bc6-t7qwm"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.924367 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cccb97bc6-t7qwm" event={"ID":"1edc3606-34b7-4c2a-b534-2efa0ad00e95","Type":"ContainerDied","Data":"c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de"}
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.924727 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cccb97bc6-t7qwm" event={"ID":"1edc3606-34b7-4c2a-b534-2efa0ad00e95","Type":"ContainerDied","Data":"7082b05a7e200cb4119830375001964b165a470004543ceb8d57b43c520b14dc"}
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.924823 4971 scope.go:117] "RemoveContainer" containerID="c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.941585 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.941564763 podStartE2EDuration="3.941564763s" podCreationTimestamp="2026-03-20 07:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:42.933970936 +0000 UTC m=+1384.913845074" watchObservedRunningTime="2026-03-20 07:12:42.941564763 +0000 UTC m=+1384.921438901"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.955949 4971 scope.go:117] "RemoveContainer" containerID="4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.979933 4971 scope.go:117] "RemoveContainer" containerID="c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de"
Mar 20 07:12:42 crc kubenswrapper[4971]: E0320 07:12:42.980581 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de\": container with ID starting with c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de not found: ID does not exist" containerID="c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.980642 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de"} err="failed to get container status \"c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de\": rpc error: code = NotFound desc = could not find container \"c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de\": container with ID starting with c5688d50e743f0033211b1fa577437147fe0fe343d1c61c4a9aa25d7583af3de not found: ID does not exist"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.980671 4971 scope.go:117] "RemoveContainer" containerID="4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007"
Mar 20 07:12:42 crc kubenswrapper[4971]: E0320 07:12:42.981140 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007\": container with ID starting with 4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007 not found: ID does not exist" containerID="4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.981169 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007"} err="failed to get container status \"4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007\": rpc error: code = NotFound desc = could not find container \"4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007\": container with ID starting with 4cb8a35f072d44c331ecc56d1821e504cec508e1172de452d8974d9419f87007 not found: ID does not exist"
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.986388 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1edc3606-34b7-4c2a-b534-2efa0ad00e95-logs\") pod \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") "
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.986444 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data-custom\") pod \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") "
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.986508 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data\") pod \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") "
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.986524 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-combined-ca-bundle\") pod \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") "
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.986706 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzfzg\" (UniqueName: \"kubernetes.io/projected/1edc3606-34b7-4c2a-b534-2efa0ad00e95-kube-api-access-xzfzg\") pod \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\" (UID: \"1edc3606-34b7-4c2a-b534-2efa0ad00e95\") "
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.987038 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1edc3606-34b7-4c2a-b534-2efa0ad00e95-logs" (OuterVolumeSpecName: "logs") pod "1edc3606-34b7-4c2a-b534-2efa0ad00e95" (UID: "1edc3606-34b7-4c2a-b534-2efa0ad00e95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:12:42 crc kubenswrapper[4971]: I0320 07:12:42.987262 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1edc3606-34b7-4c2a-b534-2efa0ad00e95-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:42.992980 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1edc3606-34b7-4c2a-b534-2efa0ad00e95-kube-api-access-xzfzg" (OuterVolumeSpecName: "kube-api-access-xzfzg") pod "1edc3606-34b7-4c2a-b534-2efa0ad00e95" (UID: "1edc3606-34b7-4c2a-b534-2efa0ad00e95"). InnerVolumeSpecName "kube-api-access-xzfzg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:42.994144 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1edc3606-34b7-4c2a-b534-2efa0ad00e95" (UID: "1edc3606-34b7-4c2a-b534-2efa0ad00e95"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.035309 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1edc3606-34b7-4c2a-b534-2efa0ad00e95" (UID: "1edc3606-34b7-4c2a-b534-2efa0ad00e95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.053511 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data" (OuterVolumeSpecName: "config-data") pod "1edc3606-34b7-4c2a-b534-2efa0ad00e95" (UID: "1edc3606-34b7-4c2a-b534-2efa0ad00e95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.089592 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzfzg\" (UniqueName: \"kubernetes.io/projected/1edc3606-34b7-4c2a-b534-2efa0ad00e95-kube-api-access-xzfzg\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.089651 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.089662 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.089672 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1edc3606-34b7-4c2a-b534-2efa0ad00e95-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.257093 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7cccb97bc6-t7qwm"]
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.268018 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7cccb97bc6-t7qwm"]
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.937703 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerName="cinder-api-log" containerID="cri-o://15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8" gracePeriod=30
Mar 20 07:12:43 crc kubenswrapper[4971]: I0320 07:12:43.937966 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerName="cinder-api" containerID="cri-o://f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac" gracePeriod=30
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.237771 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.618531 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.721384 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data-custom\") pod \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") "
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.721442 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-scripts\") pod \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") "
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.721551 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data\") pod \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") "
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.721621 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-logs\") pod \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") "
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.721689 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlm79\" (UniqueName: \"kubernetes.io/projected/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-kube-api-access-vlm79\") pod \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") "
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.721726 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-combined-ca-bundle\") pod \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") "
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.721746 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-etc-machine-id\") pod \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\" (UID: \"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f\") "
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.722101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-logs" (OuterVolumeSpecName: "logs") pod "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" (UID: "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.722172 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" (UID: "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.742169 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-scripts" (OuterVolumeSpecName: "scripts") pod "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" (UID: "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.744261 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" (UID: "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.747475 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-kube-api-access-vlm79" (OuterVolumeSpecName: "kube-api-access-vlm79") pod "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" (UID: "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f"). InnerVolumeSpecName "kube-api-access-vlm79". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.751147 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" path="/var/lib/kubelet/pods/1edc3606-34b7-4c2a-b534-2efa0ad00e95/volumes"
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.759720 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" (UID: "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.788603 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data" (OuterVolumeSpecName: "config-data") pod "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" (UID: "b02a1786-53e7-4cb4-8543-1dafa6ec1d7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.823913 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.823945 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlm79\" (UniqueName: \"kubernetes.io/projected/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-kube-api-access-vlm79\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.823955 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.823966 4971 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.823976 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.823984 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.823992 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.946552 4971 generic.go:334] "Generic (PLEG): container finished" podID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerID="f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac" exitCode=0
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.946583 4971 generic.go:334] "Generic (PLEG): container finished" podID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerID="15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8" exitCode=143
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.946638 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f","Type":"ContainerDied","Data":"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac"}
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.946704 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.946726 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f","Type":"ContainerDied","Data":"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8"}
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.946741 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b02a1786-53e7-4cb4-8543-1dafa6ec1d7f","Type":"ContainerDied","Data":"207ecc47c69f52915aeea85c93a683ad1a71db9d1cb05081a590ffb80755823f"}
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.946758 4971 scope.go:117] "RemoveContainer" containerID="f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac"
Mar 20 07:12:44 crc kubenswrapper[4971]: I0320 07:12:44.978404 4971 scope.go:117] "RemoveContainer" containerID="15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8"
Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.006161 4971 scope.go:117] "RemoveContainer" containerID="f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac"
Mar 20 07:12:45 crc kubenswrapper[4971]: E0320 07:12:45.006630 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac\": container with ID starting with f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac not found: ID does not exist" containerID="f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac"
Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.006763 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac"} err="failed to get container status \"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac\": rpc error: code = NotFound desc = could not find container \"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac\": container with ID starting with f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac not found: ID does not exist"
Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.006791 4971 scope.go:117] "RemoveContainer" containerID="15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8"
Mar 20 07:12:45 crc kubenswrapper[4971]: E0320 07:12:45.007171 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8\": container with ID starting with 15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8 not found: ID does not exist" containerID="15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8"
Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.007192 4971 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8"} err="failed to get container status \"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8\": rpc error: code = NotFound desc = could not find container \"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8\": container with ID starting with 15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8 not found: ID does not exist" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.007206 4971 scope.go:117] "RemoveContainer" containerID="f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.007817 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac"} err="failed to get container status \"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac\": rpc error: code = NotFound desc = could not find container \"f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac\": container with ID starting with f97a407e76435735033f4d01109a8d949f98eec05d1369d2ccbc7bb2b0be79ac not found: ID does not exist" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.007862 4971 scope.go:117] "RemoveContainer" containerID="15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.008446 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8"} err="failed to get container status \"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8\": rpc error: code = NotFound desc = could not find container \"15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8\": container with ID starting with 15985550ea9424c34adae4327cfdd6d755ee61392a948a25219d1b33ea09d5c8 not found: ID does not 
exist" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.012109 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.027487 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.043493 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 07:12:45 crc kubenswrapper[4971]: E0320 07:12:45.044005 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerName="cinder-api-log" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.044030 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerName="cinder-api-log" Mar 20 07:12:45 crc kubenswrapper[4971]: E0320 07:12:45.044046 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api-log" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.044055 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api-log" Mar 20 07:12:45 crc kubenswrapper[4971]: E0320 07:12:45.044071 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerName="cinder-api" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.044079 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerName="cinder-api" Mar 20 07:12:45 crc kubenswrapper[4971]: E0320 07:12:45.044095 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.044103 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" 
containerName="barbican-api" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.044342 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerName="cinder-api" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.044381 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.044403 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" containerName="cinder-api-log" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.044418 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1edc3606-34b7-4c2a-b534-2efa0ad00e95" containerName="barbican-api-log" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.046393 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.051400 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.051606 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.051775 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.061021 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.130800 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b125a7-da7e-4ab1-b594-2f45f0b5c529-logs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " 
pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.130936 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x6tw\" (UniqueName: \"kubernetes.io/projected/08b125a7-da7e-4ab1-b594-2f45f0b5c529-kube-api-access-9x6tw\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.131140 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08b125a7-da7e-4ab1-b594-2f45f0b5c529-etc-machine-id\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.131363 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data-custom\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.131454 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-public-tls-certs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.131493 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.131748 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-scripts\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.131773 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.131795 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233311 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data-custom\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-public-tls-certs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233429 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-scripts\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233525 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233545 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233574 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b125a7-da7e-4ab1-b594-2f45f0b5c529-logs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233618 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x6tw\" (UniqueName: \"kubernetes.io/projected/08b125a7-da7e-4ab1-b594-2f45f0b5c529-kube-api-access-9x6tw\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 
07:12:45.233690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08b125a7-da7e-4ab1-b594-2f45f0b5c529-etc-machine-id\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.233893 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08b125a7-da7e-4ab1-b594-2f45f0b5c529-etc-machine-id\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.234122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b125a7-da7e-4ab1-b594-2f45f0b5c529-logs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.238370 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-public-tls-certs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.238397 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.238681 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.239546 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.239751 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.240416 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-scripts\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.260892 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x6tw\" (UniqueName: \"kubernetes.io/projected/08b125a7-da7e-4ab1-b594-2f45f0b5c529-kube-api-access-9x6tw\") pod \"cinder-api-0\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.378771 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.890703 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 07:12:45 crc kubenswrapper[4971]: I0320 07:12:45.962168 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08b125a7-da7e-4ab1-b594-2f45f0b5c529","Type":"ContainerStarted","Data":"66d96c63b8a14b9071f1a161f51fb4a60e54efe0e3ffa98a0d3051d17d49b1dc"} Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.546952 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.765493 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b02a1786-53e7-4cb4-8543-1dafa6ec1d7f" path="/var/lib/kubelet/pods/b02a1786-53e7-4cb4-8543-1dafa6ec1d7f/volumes" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.786591 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ddc6d7dd9-qnsrj"] Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.787560 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5ddc6d7dd9-qnsrj" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-api" containerID="cri-o://ec43c060399fdac7b8010fc1e004e9c5f9a92fdcf5f59bf5b7751d728e2f3fc2" gracePeriod=30 Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.787796 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5ddc6d7dd9-qnsrj" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-httpd" containerID="cri-o://1f62d6fff0e58ee91d48b2a1865144fa8a186a552d3f87401a1c86fbf8d631de" gracePeriod=30 Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.801828 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5ddc6d7dd9-qnsrj" 
podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": read tcp 10.217.0.2:57050->10.217.0.158:9696: read: connection reset by peer" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.831251 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d45446b77-6x8tp"] Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.833474 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.858866 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d45446b77-6x8tp"] Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.987137 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08b125a7-da7e-4ab1-b594-2f45f0b5c529","Type":"ContainerStarted","Data":"8862358b81fede9115c50c47e167a72f0bd5aec1ea665447f7122a28b9e9ceea"} Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.987582 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-httpd-config\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.987616 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-public-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.987652 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-combined-ca-bundle\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.987944 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-ovndb-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.988279 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mcb\" (UniqueName: \"kubernetes.io/projected/a83b09bb-e1ad-4b10-9b14-538998197621-kube-api-access-x2mcb\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.988330 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-config\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.988380 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-internal-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.989755 4971 generic.go:334] "Generic (PLEG): container finished" podID="2c4b8281-2638-413d-b91a-ee6acb59693d" 
containerID="1f62d6fff0e58ee91d48b2a1865144fa8a186a552d3f87401a1c86fbf8d631de" exitCode=0 Mar 20 07:12:46 crc kubenswrapper[4971]: I0320 07:12:46.989793 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddc6d7dd9-qnsrj" event={"ID":"2c4b8281-2638-413d-b91a-ee6acb59693d","Type":"ContainerDied","Data":"1f62d6fff0e58ee91d48b2a1865144fa8a186a552d3f87401a1c86fbf8d631de"} Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.090737 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mcb\" (UniqueName: \"kubernetes.io/projected/a83b09bb-e1ad-4b10-9b14-538998197621-kube-api-access-x2mcb\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.090804 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-config\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.090845 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-internal-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.090901 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-httpd-config\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.090959 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-public-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.090991 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-combined-ca-bundle\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.091115 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-ovndb-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.096317 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-public-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.096529 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-config\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.096720 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-httpd-config\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.097161 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-combined-ca-bundle\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.098332 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-ovndb-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.098843 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-internal-tls-certs\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.106205 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mcb\" (UniqueName: \"kubernetes.io/projected/a83b09bb-e1ad-4b10-9b14-538998197621-kube-api-access-x2mcb\") pod \"neutron-5d45446b77-6x8tp\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.164751 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:47 crc kubenswrapper[4971]: I0320 07:12:47.690027 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d45446b77-6x8tp"] Mar 20 07:12:48 crc kubenswrapper[4971]: I0320 07:12:48.005130 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d45446b77-6x8tp" event={"ID":"a83b09bb-e1ad-4b10-9b14-538998197621","Type":"ContainerStarted","Data":"35447db00a76cd29e230251c5953d04cbe80e78a51340cb69f31c155161f5b6a"} Mar 20 07:12:48 crc kubenswrapper[4971]: I0320 07:12:48.005447 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d45446b77-6x8tp" event={"ID":"a83b09bb-e1ad-4b10-9b14-538998197621","Type":"ContainerStarted","Data":"808062fa5fea3183d861db5a10e3106b35f589e9d19e52fd4f9a2dd708fa842c"} Mar 20 07:12:48 crc kubenswrapper[4971]: I0320 07:12:48.007076 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08b125a7-da7e-4ab1-b594-2f45f0b5c529","Type":"ContainerStarted","Data":"2d28427254b93eecd5e3f48d2294727ab000a8f924161e1e67d604b6de308a34"} Mar 20 07:12:48 crc kubenswrapper[4971]: I0320 07:12:48.007999 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 07:12:48 crc kubenswrapper[4971]: I0320 07:12:48.052254 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.05223204 podStartE2EDuration="3.05223204s" podCreationTimestamp="2026-03-20 07:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:48.035383766 +0000 UTC m=+1390.015257904" watchObservedRunningTime="2026-03-20 07:12:48.05223204 +0000 UTC m=+1390.032106178" Mar 20 07:12:48 crc kubenswrapper[4971]: I0320 07:12:48.771359 4971 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/neutron-5ddc6d7dd9-qnsrj" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.018156 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d45446b77-6x8tp" event={"ID":"a83b09bb-e1ad-4b10-9b14-538998197621","Type":"ContainerStarted","Data":"de8268f31ad44f7855e9d376812885c5bc010017a21fab351f990ce733d64a51"} Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.018698 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.044731 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d45446b77-6x8tp" podStartSLOduration=3.044708227 podStartE2EDuration="3.044708227s" podCreationTimestamp="2026-03-20 07:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:49.040486923 +0000 UTC m=+1391.020361061" watchObservedRunningTime="2026-03-20 07:12:49.044708227 +0000 UTC m=+1391.024582405" Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.306466 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.386112 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-468lg"] Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.386395 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" podUID="c44d5869-6a57-435c-873e-a5aa02b43b3c" containerName="dnsmasq-dns" containerID="cri-o://112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f" gracePeriod=10 Mar 
20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.632120 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 07:12:49 crc kubenswrapper[4971]: E0320 07:12:49.695811 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc44d5869_6a57_435c_873e_a5aa02b43b3c.slice/crio-conmon-112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f.scope\": RecentStats: unable to find data in memory cache]" Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.696487 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.921715 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:49 crc kubenswrapper[4971]: I0320 07:12:49.972468 4971 scope.go:117] "RemoveContainer" containerID="924c1f9323defcb56cdff8e32cbb9cfd2255fb2410476e1891409701b4e0869e" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.037029 4971 generic.go:334] "Generic (PLEG): container finished" podID="c44d5869-6a57-435c-873e-a5aa02b43b3c" containerID="112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f" exitCode=0 Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.037098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" event={"ID":"c44d5869-6a57-435c-873e-a5aa02b43b3c","Type":"ContainerDied","Data":"112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f"} Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.037127 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" event={"ID":"c44d5869-6a57-435c-873e-a5aa02b43b3c","Type":"ContainerDied","Data":"03b0a72a64e95931835a40cd0b704d11c5e4e695e6f966363e9a0246044f2f73"} Mar 20 
07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.037145 4971 scope.go:117] "RemoveContainer" containerID="112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.037276 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-468lg" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.041877 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerName="cinder-scheduler" containerID="cri-o://cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25" gracePeriod=30 Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.041917 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerName="probe" containerID="cri-o://1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd" gracePeriod=30 Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.045501 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjxgj\" (UniqueName: \"kubernetes.io/projected/c44d5869-6a57-435c-873e-a5aa02b43b3c-kube-api-access-vjxgj\") pod \"c44d5869-6a57-435c-873e-a5aa02b43b3c\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.045549 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-svc\") pod \"c44d5869-6a57-435c-873e-a5aa02b43b3c\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.045566 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-nb\") pod \"c44d5869-6a57-435c-873e-a5aa02b43b3c\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.045600 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-config\") pod \"c44d5869-6a57-435c-873e-a5aa02b43b3c\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.045659 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-swift-storage-0\") pod \"c44d5869-6a57-435c-873e-a5aa02b43b3c\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.045725 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-sb\") pod \"c44d5869-6a57-435c-873e-a5aa02b43b3c\" (UID: \"c44d5869-6a57-435c-873e-a5aa02b43b3c\") " Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.060611 4971 scope.go:117] "RemoveContainer" containerID="e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.062965 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44d5869-6a57-435c-873e-a5aa02b43b3c-kube-api-access-vjxgj" (OuterVolumeSpecName: "kube-api-access-vjxgj") pod "c44d5869-6a57-435c-873e-a5aa02b43b3c" (UID: "c44d5869-6a57-435c-873e-a5aa02b43b3c"). InnerVolumeSpecName "kube-api-access-vjxgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.102162 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c44d5869-6a57-435c-873e-a5aa02b43b3c" (UID: "c44d5869-6a57-435c-873e-a5aa02b43b3c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.111492 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c44d5869-6a57-435c-873e-a5aa02b43b3c" (UID: "c44d5869-6a57-435c-873e-a5aa02b43b3c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.113169 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-config" (OuterVolumeSpecName: "config") pod "c44d5869-6a57-435c-873e-a5aa02b43b3c" (UID: "c44d5869-6a57-435c-873e-a5aa02b43b3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.115507 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c44d5869-6a57-435c-873e-a5aa02b43b3c" (UID: "c44d5869-6a57-435c-873e-a5aa02b43b3c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.127424 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c44d5869-6a57-435c-873e-a5aa02b43b3c" (UID: "c44d5869-6a57-435c-873e-a5aa02b43b3c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.148187 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjxgj\" (UniqueName: \"kubernetes.io/projected/c44d5869-6a57-435c-873e-a5aa02b43b3c-kube-api-access-vjxgj\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.148220 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.148230 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.148239 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.148248 4971 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.148306 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c44d5869-6a57-435c-873e-a5aa02b43b3c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.162747 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.162803 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.176279 4971 scope.go:117] "RemoveContainer" containerID="112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f" Mar 20 07:12:50 crc kubenswrapper[4971]: E0320 07:12:50.180981 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f\": container with ID starting with 112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f not found: ID does not exist" containerID="112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.181032 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f"} err="failed to get container status \"112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f\": rpc error: code = NotFound desc = could not find container \"112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f\": container with ID starting 
with 112f076e80060835ec2117278c822bb2d8aa71100255cf31dc876b936c782f6f not found: ID does not exist" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.181062 4971 scope.go:117] "RemoveContainer" containerID="e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495" Mar 20 07:12:50 crc kubenswrapper[4971]: E0320 07:12:50.181437 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495\": container with ID starting with e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495 not found: ID does not exist" containerID="e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.181486 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495"} err="failed to get container status \"e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495\": rpc error: code = NotFound desc = could not find container \"e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495\": container with ID starting with e4b465828f1eee4253df14f943c5fdc68c1908b41e07ca13377f5ed834610495 not found: ID does not exist" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.389550 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-468lg"] Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.398296 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-468lg"] Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.743434 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44d5869-6a57-435c-873e-a5aa02b43b3c" path="/var/lib/kubelet/pods/c44d5869-6a57-435c-873e-a5aa02b43b3c/volumes" Mar 20 07:12:50 crc kubenswrapper[4971]: I0320 07:12:50.844110 4971 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 07:12:51 crc kubenswrapper[4971]: I0320 07:12:51.056283 4971 generic.go:334] "Generic (PLEG): container finished" podID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerID="1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd" exitCode=0 Mar 20 07:12:51 crc kubenswrapper[4971]: I0320 07:12:51.056352 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85b1a9b-55d3-4e2f-bc37-726688f9ab49","Type":"ContainerDied","Data":"1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd"} Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.060192 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.094712 4971 generic.go:334] "Generic (PLEG): container finished" podID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerID="cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25" exitCode=0 Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.094764 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85b1a9b-55d3-4e2f-bc37-726688f9ab49","Type":"ContainerDied","Data":"cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25"} Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.094798 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85b1a9b-55d3-4e2f-bc37-726688f9ab49","Type":"ContainerDied","Data":"2ceaad863f6de32f201a1d1200a4e110219c4f32bcef41bef656f04d1308ed49"} Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.094819 4971 scope.go:117] "RemoveContainer" containerID="1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd" Mar 20 
07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.094878 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.125160 4971 scope.go:117] "RemoveContainer" containerID="cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.142469 4971 scope.go:117] "RemoveContainer" containerID="1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd" Mar 20 07:12:54 crc kubenswrapper[4971]: E0320 07:12:54.142939 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd\": container with ID starting with 1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd not found: ID does not exist" containerID="1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.143096 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd"} err="failed to get container status \"1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd\": rpc error: code = NotFound desc = could not find container \"1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd\": container with ID starting with 1e3f49bd357c4c772a5a3660e021ae19b88b3e4636ef9c26168d72d4957a90cd not found: ID does not exist" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.143189 4971 scope.go:117] "RemoveContainer" containerID="cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25" Mar 20 07:12:54 crc kubenswrapper[4971]: E0320 07:12:54.143597 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25\": container with ID starting with cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25 not found: ID does not exist" containerID="cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.143657 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25"} err="failed to get container status \"cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25\": rpc error: code = NotFound desc = could not find container \"cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25\": container with ID starting with cfc03ceb35986a85f6a9361634ccbc7e649ab27662c4580c0540117337632d25 not found: ID does not exist" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.250267 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data-custom\") pod \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.250351 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9q42\" (UniqueName: \"kubernetes.io/projected/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-kube-api-access-c9q42\") pod \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.250525 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-combined-ca-bundle\") pod \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " Mar 20 07:12:54 crc kubenswrapper[4971]: 
I0320 07:12:54.250582 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-scripts\") pod \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.250638 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data\") pod \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.250742 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-etc-machine-id\") pod \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\" (UID: \"f85b1a9b-55d3-4e2f-bc37-726688f9ab49\") " Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.251210 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f85b1a9b-55d3-4e2f-bc37-726688f9ab49" (UID: "f85b1a9b-55d3-4e2f-bc37-726688f9ab49"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.275304 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-scripts" (OuterVolumeSpecName: "scripts") pod "f85b1a9b-55d3-4e2f-bc37-726688f9ab49" (UID: "f85b1a9b-55d3-4e2f-bc37-726688f9ab49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.275427 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f85b1a9b-55d3-4e2f-bc37-726688f9ab49" (UID: "f85b1a9b-55d3-4e2f-bc37-726688f9ab49"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.276841 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-kube-api-access-c9q42" (OuterVolumeSpecName: "kube-api-access-c9q42") pod "f85b1a9b-55d3-4e2f-bc37-726688f9ab49" (UID: "f85b1a9b-55d3-4e2f-bc37-726688f9ab49"). InnerVolumeSpecName "kube-api-access-c9q42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.316762 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f85b1a9b-55d3-4e2f-bc37-726688f9ab49" (UID: "f85b1a9b-55d3-4e2f-bc37-726688f9ab49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.353908 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.353955 4971 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.353966 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.353975 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9q42\" (UniqueName: \"kubernetes.io/projected/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-kube-api-access-c9q42\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.353989 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.371633 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data" (OuterVolumeSpecName: "config-data") pod "f85b1a9b-55d3-4e2f-bc37-726688f9ab49" (UID: "f85b1a9b-55d3-4e2f-bc37-726688f9ab49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.455174 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85b1a9b-55d3-4e2f-bc37-726688f9ab49-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.491677 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.513266 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.521784 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:12:54 crc kubenswrapper[4971]: E0320 07:12:54.522262 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44d5869-6a57-435c-873e-a5aa02b43b3c" containerName="init" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.522280 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44d5869-6a57-435c-873e-a5aa02b43b3c" containerName="init" Mar 20 07:12:54 crc kubenswrapper[4971]: E0320 07:12:54.522303 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerName="probe" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.522319 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerName="probe" Mar 20 07:12:54 crc kubenswrapper[4971]: E0320 07:12:54.522331 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerName="cinder-scheduler" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.522340 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerName="cinder-scheduler" Mar 20 07:12:54 crc kubenswrapper[4971]: E0320 07:12:54.522351 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44d5869-6a57-435c-873e-a5aa02b43b3c" containerName="dnsmasq-dns" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.522357 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44d5869-6a57-435c-873e-a5aa02b43b3c" containerName="dnsmasq-dns" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.522530 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44d5869-6a57-435c-873e-a5aa02b43b3c" containerName="dnsmasq-dns" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.522542 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerName="probe" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.522552 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" containerName="cinder-scheduler" Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.523729 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.528719 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.530763 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.659014 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2656777c-a362-4e3e-92aa-98a8f9d2cf34-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.659086 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-scripts\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.659137 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.659264 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.659369 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lvnt\" (UniqueName: \"kubernetes.io/projected/2656777c-a362-4e3e-92aa-98a8f9d2cf34-kube-api-access-8lvnt\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.659406 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.661263 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-758cd76f6b-7db5d"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.664015 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-758cd76f6b-7db5d"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.760673 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85b1a9b-55d3-4e2f-bc37-726688f9ab49" path="/var/lib/kubelet/pods/f85b1a9b-55d3-4e2f-bc37-726688f9ab49/volumes"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.760847 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.760952 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lvnt\" (UniqueName: \"kubernetes.io/projected/2656777c-a362-4e3e-92aa-98a8f9d2cf34-kube-api-access-8lvnt\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.760974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.761054 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2656777c-a362-4e3e-92aa-98a8f9d2cf34-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.761104 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-scripts\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.761128 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.761649 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2656777c-a362-4e3e-92aa-98a8f9d2cf34-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.766758 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.768143 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.784181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.784988 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-scripts\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.803252 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lvnt\" (UniqueName: \"kubernetes.io/projected/2656777c-a362-4e3e-92aa-98a8f9d2cf34-kube-api-access-8lvnt\") pod \"cinder-scheduler-0\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") " pod="openstack/cinder-scheduler-0"
Mar 20 07:12:54 crc kubenswrapper[4971]: I0320 07:12:54.862497 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 07:12:55 crc kubenswrapper[4971]: I0320 07:12:55.318439 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 07:12:55 crc kubenswrapper[4971]: I0320 07:12:55.598193 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5645d5b87f-2lrzm"
Mar 20 07:12:56 crc kubenswrapper[4971]: I0320 07:12:56.138460 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2656777c-a362-4e3e-92aa-98a8f9d2cf34","Type":"ContainerStarted","Data":"f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874"}
Mar 20 07:12:56 crc kubenswrapper[4971]: I0320 07:12:56.138746 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2656777c-a362-4e3e-92aa-98a8f9d2cf34","Type":"ContainerStarted","Data":"c97d6516b3f500a3fa3a21c6620ddd9d9a2a09019ff5d26144c90340566236f1"}
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.169267 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2656777c-a362-4e3e-92aa-98a8f9d2cf34","Type":"ContainerStarted","Data":"31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c"}
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.187555 4971 generic.go:334] "Generic (PLEG): container finished" podID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerID="ec43c060399fdac7b8010fc1e004e9c5f9a92fdcf5f59bf5b7751d728e2f3fc2" exitCode=0
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.187641 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddc6d7dd9-qnsrj" event={"ID":"2c4b8281-2638-413d-b91a-ee6acb59693d","Type":"ContainerDied","Data":"ec43c060399fdac7b8010fc1e004e9c5f9a92fdcf5f59bf5b7751d728e2f3fc2"}
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.193318 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.193260206 podStartE2EDuration="3.193260206s" podCreationTimestamp="2026-03-20 07:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:12:57.187940405 +0000 UTC m=+1399.167814533" watchObservedRunningTime="2026-03-20 07:12:57.193260206 +0000 UTC m=+1399.173134344"
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.499884 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68ff6b454b-v87kx"
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.502259 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68ff6b454b-v87kx"
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.538297 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.599954 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-758cd76f6b-7db5d"]
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.600216 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-758cd76f6b-7db5d" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" containerName="placement-log" containerID="cri-o://5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450" gracePeriod=30
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.600592 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-758cd76f6b-7db5d" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" containerName="placement-api" containerID="cri-o://048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200" gracePeriod=30
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.616509 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-internal-tls-certs\") pod \"2c4b8281-2638-413d-b91a-ee6acb59693d\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") "
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.616540 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-httpd-config\") pod \"2c4b8281-2638-413d-b91a-ee6acb59693d\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") "
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.616720 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-config\") pod \"2c4b8281-2638-413d-b91a-ee6acb59693d\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") "
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.616760 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-public-tls-certs\") pod \"2c4b8281-2638-413d-b91a-ee6acb59693d\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") "
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.617053 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77n5s\" (UniqueName: \"kubernetes.io/projected/2c4b8281-2638-413d-b91a-ee6acb59693d-kube-api-access-77n5s\") pod \"2c4b8281-2638-413d-b91a-ee6acb59693d\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") "
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.617122 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-combined-ca-bundle\") pod \"2c4b8281-2638-413d-b91a-ee6acb59693d\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") "
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.617769 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-ovndb-tls-certs\") pod \"2c4b8281-2638-413d-b91a-ee6acb59693d\" (UID: \"2c4b8281-2638-413d-b91a-ee6acb59693d\") "
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.626215 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2c4b8281-2638-413d-b91a-ee6acb59693d" (UID: "2c4b8281-2638-413d-b91a-ee6acb59693d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.626481 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4b8281-2638-413d-b91a-ee6acb59693d-kube-api-access-77n5s" (OuterVolumeSpecName: "kube-api-access-77n5s") pod "2c4b8281-2638-413d-b91a-ee6acb59693d" (UID: "2c4b8281-2638-413d-b91a-ee6acb59693d"). InnerVolumeSpecName "kube-api-access-77n5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.721973 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.722766 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77n5s\" (UniqueName: \"kubernetes.io/projected/2c4b8281-2638-413d-b91a-ee6acb59693d-kube-api-access-77n5s\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.730369 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.736728 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2c4b8281-2638-413d-b91a-ee6acb59693d" (UID: "2c4b8281-2638-413d-b91a-ee6acb59693d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.780972 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c4b8281-2638-413d-b91a-ee6acb59693d" (UID: "2c4b8281-2638-413d-b91a-ee6acb59693d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.812570 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-config" (OuterVolumeSpecName: "config") pod "2c4b8281-2638-413d-b91a-ee6acb59693d" (UID: "2c4b8281-2638-413d-b91a-ee6acb59693d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.829006 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2c4b8281-2638-413d-b91a-ee6acb59693d" (UID: "2c4b8281-2638-413d-b91a-ee6acb59693d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.830078 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.830132 4971 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.830144 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.830152 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.849489 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2c4b8281-2638-413d-b91a-ee6acb59693d" (UID: "2c4b8281-2638-413d-b91a-ee6acb59693d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:12:57 crc kubenswrapper[4971]: I0320 07:12:57.932682 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4b8281-2638-413d-b91a-ee6acb59693d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.198163 4971 generic.go:334] "Generic (PLEG): container finished" podID="1b124085-be1b-4abd-acec-d5e30f119d75" containerID="5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450" exitCode=143
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.198220 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-758cd76f6b-7db5d" event={"ID":"1b124085-be1b-4abd-acec-d5e30f119d75","Type":"ContainerDied","Data":"5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450"}
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.201011 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ddc6d7dd9-qnsrj"
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.206658 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddc6d7dd9-qnsrj" event={"ID":"2c4b8281-2638-413d-b91a-ee6acb59693d","Type":"ContainerDied","Data":"f8d5ed6d4783df684dde2dd4dc90d0866d16d8b27b7a293c3a7ed9022ad58643"}
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.206694 4971 scope.go:117] "RemoveContainer" containerID="1f62d6fff0e58ee91d48b2a1865144fa8a186a552d3f87401a1c86fbf8d631de"
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.234798 4971 scope.go:117] "RemoveContainer" containerID="ec43c060399fdac7b8010fc1e004e9c5f9a92fdcf5f59bf5b7751d728e2f3fc2"
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.238672 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ddc6d7dd9-qnsrj"]
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.248304 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5ddc6d7dd9-qnsrj"]
Mar 20 07:12:58 crc kubenswrapper[4971]: I0320 07:12:58.752917 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" path="/var/lib/kubelet/pods/2c4b8281-2638-413d-b91a-ee6acb59693d/volumes"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.268512 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 20 07:12:59 crc kubenswrapper[4971]: E0320 07:12:59.269354 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-httpd"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.269372 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-httpd"
Mar 20 07:12:59 crc kubenswrapper[4971]: E0320 07:12:59.269414 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-api"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.269425 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-api"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.269662 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-api"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.269686 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4b8281-2638-413d-b91a-ee6acb59693d" containerName="neutron-httpd"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.270463 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.276483 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.313066 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.313240 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.313474 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-486qb"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.365507 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7vl\" (UniqueName: \"kubernetes.io/projected/b7c8d202-039a-426e-b5e5-916bb257e441-kube-api-access-6q7vl\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.365621 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.365643 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.365713 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.467346 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.467711 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7vl\" (UniqueName: \"kubernetes.io/projected/b7c8d202-039a-426e-b5e5-916bb257e441-kube-api-access-6q7vl\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.467839 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.467965 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.470253 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.472264 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.472717 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.487480 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7vl\" (UniqueName: \"kubernetes.io/projected/b7c8d202-039a-426e-b5e5-916bb257e441-kube-api-access-6q7vl\") pod \"openstackclient\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.643400 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 07:12:59 crc kubenswrapper[4971]: I0320 07:12:59.863420 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 20 07:13:00 crc kubenswrapper[4971]: I0320 07:13:00.121644 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 07:13:00 crc kubenswrapper[4971]: I0320 07:13:00.234366 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b7c8d202-039a-426e-b5e5-916bb257e441","Type":"ContainerStarted","Data":"90b4c55654ecb112ebae596889a185b88a38e47cc432794223fe21703a41f9bb"}
Mar 20 07:13:00 crc kubenswrapper[4971]: I0320 07:13:00.863835 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76b4bdf575-ggc5d"
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.004387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data-custom\") pod \"26142983-f987-457d-956c-0d63345bdf1c\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.004495 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data\") pod \"26142983-f987-457d-956c-0d63345bdf1c\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.004711 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2gvk\" (UniqueName: \"kubernetes.io/projected/26142983-f987-457d-956c-0d63345bdf1c-kube-api-access-f2gvk\") pod \"26142983-f987-457d-956c-0d63345bdf1c\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.004770 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26142983-f987-457d-956c-0d63345bdf1c-logs\") pod \"26142983-f987-457d-956c-0d63345bdf1c\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.004805 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-combined-ca-bundle\") pod \"26142983-f987-457d-956c-0d63345bdf1c\" (UID: \"26142983-f987-457d-956c-0d63345bdf1c\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.010825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26142983-f987-457d-956c-0d63345bdf1c-logs" (OuterVolumeSpecName: "logs") pod "26142983-f987-457d-956c-0d63345bdf1c" (UID: "26142983-f987-457d-956c-0d63345bdf1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.012337 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26142983-f987-457d-956c-0d63345bdf1c-kube-api-access-f2gvk" (OuterVolumeSpecName: "kube-api-access-f2gvk") pod "26142983-f987-457d-956c-0d63345bdf1c" (UID: "26142983-f987-457d-956c-0d63345bdf1c"). InnerVolumeSpecName "kube-api-access-f2gvk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.017565 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "26142983-f987-457d-956c-0d63345bdf1c" (UID: "26142983-f987-457d-956c-0d63345bdf1c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.053711 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26142983-f987-457d-956c-0d63345bdf1c" (UID: "26142983-f987-457d-956c-0d63345bdf1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.088760 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data" (OuterVolumeSpecName: "config-data") pod "26142983-f987-457d-956c-0d63345bdf1c" (UID: "26142983-f987-457d-956c-0d63345bdf1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.107183 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.107217 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.107228 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2gvk\" (UniqueName: \"kubernetes.io/projected/26142983-f987-457d-956c-0d63345bdf1c-kube-api-access-f2gvk\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.107241 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26142983-f987-457d-956c-0d63345bdf1c-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.107250 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26142983-f987-457d-956c-0d63345bdf1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.141328 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-758cd76f6b-7db5d"
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.208595 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-scripts\") pod \"1b124085-be1b-4abd-acec-d5e30f119d75\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.208831 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b124085-be1b-4abd-acec-d5e30f119d75-logs\") pod \"1b124085-be1b-4abd-acec-d5e30f119d75\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.208902 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-public-tls-certs\") pod \"1b124085-be1b-4abd-acec-d5e30f119d75\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.209541 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-config-data\") pod \"1b124085-be1b-4abd-acec-d5e30f119d75\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.209666 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-internal-tls-certs\") pod \"1b124085-be1b-4abd-acec-d5e30f119d75\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.209712 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-combined-ca-bundle\") pod \"1b124085-be1b-4abd-acec-d5e30f119d75\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.209770 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rtjm\" (UniqueName: \"kubernetes.io/projected/1b124085-be1b-4abd-acec-d5e30f119d75-kube-api-access-4rtjm\") pod \"1b124085-be1b-4abd-acec-d5e30f119d75\" (UID: \"1b124085-be1b-4abd-acec-d5e30f119d75\") "
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.213186 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b124085-be1b-4abd-acec-d5e30f119d75-kube-api-access-4rtjm" (OuterVolumeSpecName: "kube-api-access-4rtjm") pod "1b124085-be1b-4abd-acec-d5e30f119d75" (UID: "1b124085-be1b-4abd-acec-d5e30f119d75"). InnerVolumeSpecName "kube-api-access-4rtjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.213661 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b124085-be1b-4abd-acec-d5e30f119d75-logs" (OuterVolumeSpecName: "logs") pod "1b124085-be1b-4abd-acec-d5e30f119d75" (UID: "1b124085-be1b-4abd-acec-d5e30f119d75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.220724 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-scripts" (OuterVolumeSpecName: "scripts") pod "1b124085-be1b-4abd-acec-d5e30f119d75" (UID: "1b124085-be1b-4abd-acec-d5e30f119d75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.249704 4971 generic.go:334] "Generic (PLEG): container finished" podID="26142983-f987-457d-956c-0d63345bdf1c" containerID="c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed" exitCode=137
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.249778 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b4bdf575-ggc5d" event={"ID":"26142983-f987-457d-956c-0d63345bdf1c","Type":"ContainerDied","Data":"c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed"}
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.249815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b4bdf575-ggc5d" event={"ID":"26142983-f987-457d-956c-0d63345bdf1c","Type":"ContainerDied","Data":"681512901db4da2f2caaa1df131bdbd441e1a3c65dc4a77d1c3ff0fa29157dd2"}
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.249843 4971 scope.go:117] "RemoveContainer" containerID="c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed"
Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.250028 4971 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-worker-76b4bdf575-ggc5d" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.262937 4971 generic.go:334] "Generic (PLEG): container finished" podID="1b124085-be1b-4abd-acec-d5e30f119d75" containerID="048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200" exitCode=0 Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.263013 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-758cd76f6b-7db5d" event={"ID":"1b124085-be1b-4abd-acec-d5e30f119d75","Type":"ContainerDied","Data":"048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200"} Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.263057 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-758cd76f6b-7db5d" event={"ID":"1b124085-be1b-4abd-acec-d5e30f119d75","Type":"ContainerDied","Data":"1e5d80e9d40e2e3de322ae4ec238de7fedb78a13b576f952275588885f45a373"} Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.263169 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-758cd76f6b-7db5d" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.281369 4971 scope.go:117] "RemoveContainer" containerID="a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.290969 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-76b4bdf575-ggc5d"] Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.300775 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-76b4bdf575-ggc5d"] Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.301839 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b124085-be1b-4abd-acec-d5e30f119d75" (UID: "1b124085-be1b-4abd-acec-d5e30f119d75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.302451 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-config-data" (OuterVolumeSpecName: "config-data") pod "1b124085-be1b-4abd-acec-d5e30f119d75" (UID: "1b124085-be1b-4abd-acec-d5e30f119d75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.313846 4971 scope.go:117] "RemoveContainer" containerID="c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed" Mar 20 07:13:01 crc kubenswrapper[4971]: E0320 07:13:01.315203 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed\": container with ID starting with c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed not found: ID does not exist" containerID="c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.315250 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed"} err="failed to get container status \"c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed\": rpc error: code = NotFound desc = could not find container \"c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed\": container with ID starting with c74a8520f0be1ff8549ae137e8b5ed0f71a1eeae105b1e990e6c5ec2ec0289ed not found: ID does not exist" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.315272 4971 scope.go:117] "RemoveContainer" containerID="a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026" Mar 20 07:13:01 crc kubenswrapper[4971]: E0320 07:13:01.315682 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026\": container with ID starting with a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026 not found: ID does not exist" containerID="a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.315707 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026"} err="failed to get container status \"a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026\": rpc error: code = NotFound desc = could not find container \"a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026\": container with ID starting with a4ee488277826c6e3a3aeaad470e448df50549a3a3c8d76ed94a4c244aa33026 not found: ID does not exist" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.315722 4971 scope.go:117] "RemoveContainer" containerID="048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.316678 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b124085-be1b-4abd-acec-d5e30f119d75-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.316731 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.316742 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.316752 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rtjm\" (UniqueName: \"kubernetes.io/projected/1b124085-be1b-4abd-acec-d5e30f119d75-kube-api-access-4rtjm\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.316761 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-scripts\") on node 
\"crc\" DevicePath \"\"" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.326885 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1b124085-be1b-4abd-acec-d5e30f119d75" (UID: "1b124085-be1b-4abd-acec-d5e30f119d75"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.333600 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1b124085-be1b-4abd-acec-d5e30f119d75" (UID: "1b124085-be1b-4abd-acec-d5e30f119d75"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.341525 4971 scope.go:117] "RemoveContainer" containerID="5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.369487 4971 scope.go:117] "RemoveContainer" containerID="048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200" Mar 20 07:13:01 crc kubenswrapper[4971]: E0320 07:13:01.371150 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200\": container with ID starting with 048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200 not found: ID does not exist" containerID="048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.371184 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200"} err="failed to get container 
status \"048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200\": rpc error: code = NotFound desc = could not find container \"048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200\": container with ID starting with 048309776ba249bbbeab48b005866ec949046c27981f6419194c820f0c411200 not found: ID does not exist" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.371209 4971 scope.go:117] "RemoveContainer" containerID="5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450" Mar 20 07:13:01 crc kubenswrapper[4971]: E0320 07:13:01.371537 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450\": container with ID starting with 5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450 not found: ID does not exist" containerID="5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.371553 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450"} err="failed to get container status \"5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450\": rpc error: code = NotFound desc = could not find container \"5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450\": container with ID starting with 5492bdfd3df55a8f2124fb54ea15f9b3826e3793717086ce2b066c3ae9965450 not found: ID does not exist" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.418920 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.418952 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1b124085-be1b-4abd-acec-d5e30f119d75-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.610748 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-758cd76f6b-7db5d"] Mar 20 07:13:01 crc kubenswrapper[4971]: I0320 07:13:01.624937 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-758cd76f6b-7db5d"] Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.557760 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-dff7bb765-jsscs"] Mar 20 07:13:02 crc kubenswrapper[4971]: E0320 07:13:02.558118 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26142983-f987-457d-956c-0d63345bdf1c" containerName="barbican-worker" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.558130 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="26142983-f987-457d-956c-0d63345bdf1c" containerName="barbican-worker" Mar 20 07:13:02 crc kubenswrapper[4971]: E0320 07:13:02.558145 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" containerName="placement-log" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.558151 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" containerName="placement-log" Mar 20 07:13:02 crc kubenswrapper[4971]: E0320 07:13:02.558169 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26142983-f987-457d-956c-0d63345bdf1c" containerName="barbican-worker-log" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.558177 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="26142983-f987-457d-956c-0d63345bdf1c" containerName="barbican-worker-log" Mar 20 07:13:02 crc kubenswrapper[4971]: E0320 07:13:02.558207 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" containerName="placement-api" 
Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.558212 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" containerName="placement-api" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.558364 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" containerName="placement-log" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.559638 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="26142983-f987-457d-956c-0d63345bdf1c" containerName="barbican-worker-log" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.559665 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" containerName="placement-api" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.559682 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="26142983-f987-457d-956c-0d63345bdf1c" containerName="barbican-worker" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.560565 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.563226 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.563399 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.563517 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.612580 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-dff7bb765-jsscs"] Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.639665 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd7bn\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-kube-api-access-hd7bn\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.639732 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-public-tls-certs\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.639770 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-run-httpd\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: 
I0320 07:13:02.639857 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-internal-tls-certs\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.639909 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-config-data\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.639956 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-etc-swift\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.639985 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-log-httpd\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.640028 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-combined-ca-bundle\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc 
kubenswrapper[4971]: I0320 07:13:02.742061 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-config-data\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.742133 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-etc-swift\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.742159 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-log-httpd\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.742193 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-combined-ca-bundle\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.742221 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd7bn\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-kube-api-access-hd7bn\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.742241 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-public-tls-certs\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.742264 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-run-httpd\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.742319 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-internal-tls-certs\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.743239 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-log-httpd\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.743504 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-run-httpd\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.747103 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-combined-ca-bundle\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.762927 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-public-tls-certs\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.762940 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-internal-tls-certs\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.763296 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-config-data\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.763455 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-etc-swift\") pod \"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.767741 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd7bn\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-kube-api-access-hd7bn\") pod 
\"swift-proxy-dff7bb765-jsscs\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.769448 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b124085-be1b-4abd-acec-d5e30f119d75" path="/var/lib/kubelet/pods/1b124085-be1b-4abd-acec-d5e30f119d75/volumes" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.770306 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26142983-f987-457d-956c-0d63345bdf1c" path="/var/lib/kubelet/pods/26142983-f987-457d-956c-0d63345bdf1c/volumes" Mar 20 07:13:02 crc kubenswrapper[4971]: I0320 07:13:02.885282 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:03 crc kubenswrapper[4971]: I0320 07:13:03.486308 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-dff7bb765-jsscs"] Mar 20 07:13:03 crc kubenswrapper[4971]: W0320 07:13:03.499018 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57f3be84_ad73_4f92_9f75_b5e864f20d65.slice/crio-fd72eff2fe0e2f14518a38eab156efc86d5966fb861a0699d7a314796d773071 WatchSource:0}: Error finding container fd72eff2fe0e2f14518a38eab156efc86d5966fb861a0699d7a314796d773071: Status 404 returned error can't find the container with id fd72eff2fe0e2f14518a38eab156efc86d5966fb861a0699d7a314796d773071 Mar 20 07:13:04 crc kubenswrapper[4971]: I0320 07:13:04.302743 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dff7bb765-jsscs" event={"ID":"57f3be84-ad73-4f92-9f75-b5e864f20d65","Type":"ContainerStarted","Data":"f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70"} Mar 20 07:13:04 crc kubenswrapper[4971]: I0320 07:13:04.303247 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dff7bb765-jsscs" 
event={"ID":"57f3be84-ad73-4f92-9f75-b5e864f20d65","Type":"ContainerStarted","Data":"5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8"} Mar 20 07:13:04 crc kubenswrapper[4971]: I0320 07:13:04.303267 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dff7bb765-jsscs" event={"ID":"57f3be84-ad73-4f92-9f75-b5e864f20d65","Type":"ContainerStarted","Data":"fd72eff2fe0e2f14518a38eab156efc86d5966fb861a0699d7a314796d773071"} Mar 20 07:13:04 crc kubenswrapper[4971]: I0320 07:13:04.303556 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:04 crc kubenswrapper[4971]: I0320 07:13:04.303623 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:04 crc kubenswrapper[4971]: I0320 07:13:04.343894 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-dff7bb765-jsscs" podStartSLOduration=2.343859035 podStartE2EDuration="2.343859035s" podCreationTimestamp="2026-03-20 07:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:13:04.331480822 +0000 UTC m=+1406.311354960" watchObservedRunningTime="2026-03-20 07:13:04.343859035 +0000 UTC m=+1406.323733193" Mar 20 07:13:05 crc kubenswrapper[4971]: I0320 07:13:05.213572 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 07:13:08 crc kubenswrapper[4971]: I0320 07:13:08.348780 4971 generic.go:334] "Generic (PLEG): container finished" podID="03a48c07-f949-4376-874a-8d5758b60a9a" containerID="da457a1601a60d9cbdbfe4e2ac48670fa1b80ef2a54f12a8442eac7a97202c46" exitCode=137 Mar 20 07:13:08 crc kubenswrapper[4971]: I0320 07:13:08.349444 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"03a48c07-f949-4376-874a-8d5758b60a9a","Type":"ContainerDied","Data":"da457a1601a60d9cbdbfe4e2ac48670fa1b80ef2a54f12a8442eac7a97202c46"} Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.296150 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-nv8gl"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.298682 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.325551 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nv8gl"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.357474 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zpvmx"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.359676 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.396011 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zpvmx"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.456068 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bb2907c-f83e-4870-bde2-cff480ce9c78-operator-scripts\") pod \"nova-api-db-create-nv8gl\" (UID: \"1bb2907c-f83e-4870-bde2-cff480ce9c78\") " pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.456290 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn2zr\" (UniqueName: \"kubernetes.io/projected/f4c80ffe-c152-4f0f-99f4-38f67d491021-kube-api-access-cn2zr\") pod \"nova-cell0-db-create-zpvmx\" (UID: \"f4c80ffe-c152-4f0f-99f4-38f67d491021\") " pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:10 crc 
kubenswrapper[4971]: I0320 07:13:10.456438 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c80ffe-c152-4f0f-99f4-38f67d491021-operator-scripts\") pod \"nova-cell0-db-create-zpvmx\" (UID: \"f4c80ffe-c152-4f0f-99f4-38f67d491021\") " pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.456648 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq4dm\" (UniqueName: \"kubernetes.io/projected/1bb2907c-f83e-4870-bde2-cff480ce9c78-kube-api-access-qq4dm\") pod \"nova-api-db-create-nv8gl\" (UID: \"1bb2907c-f83e-4870-bde2-cff480ce9c78\") " pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.460781 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9csfr"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.464458 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.502985 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-30f2-account-create-update-v9jm4"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.504588 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.506991 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.517186 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9csfr"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.561257 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30f2-account-create-update-v9jm4"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.563170 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq4dm\" (UniqueName: \"kubernetes.io/projected/1bb2907c-f83e-4870-bde2-cff480ce9c78-kube-api-access-qq4dm\") pod \"nova-api-db-create-nv8gl\" (UID: \"1bb2907c-f83e-4870-bde2-cff480ce9c78\") " pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.563292 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x592z\" (UniqueName: \"kubernetes.io/projected/a1d7a078-cf37-45b3-9370-2df67437a53d-kube-api-access-x592z\") pod \"nova-cell1-db-create-9csfr\" (UID: \"a1d7a078-cf37-45b3-9370-2df67437a53d\") " pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.563358 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bb2907c-f83e-4870-bde2-cff480ce9c78-operator-scripts\") pod \"nova-api-db-create-nv8gl\" (UID: \"1bb2907c-f83e-4870-bde2-cff480ce9c78\") " pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.563385 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn2zr\" (UniqueName: 
\"kubernetes.io/projected/f4c80ffe-c152-4f0f-99f4-38f67d491021-kube-api-access-cn2zr\") pod \"nova-cell0-db-create-zpvmx\" (UID: \"f4c80ffe-c152-4f0f-99f4-38f67d491021\") " pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.563420 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c80ffe-c152-4f0f-99f4-38f67d491021-operator-scripts\") pod \"nova-cell0-db-create-zpvmx\" (UID: \"f4c80ffe-c152-4f0f-99f4-38f67d491021\") " pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.563495 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d7a078-cf37-45b3-9370-2df67437a53d-operator-scripts\") pod \"nova-cell1-db-create-9csfr\" (UID: \"a1d7a078-cf37-45b3-9370-2df67437a53d\") " pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.565263 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bb2907c-f83e-4870-bde2-cff480ce9c78-operator-scripts\") pod \"nova-api-db-create-nv8gl\" (UID: \"1bb2907c-f83e-4870-bde2-cff480ce9c78\") " pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.565720 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c80ffe-c152-4f0f-99f4-38f67d491021-operator-scripts\") pod \"nova-cell0-db-create-zpvmx\" (UID: \"f4c80ffe-c152-4f0f-99f4-38f67d491021\") " pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.585706 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn2zr\" (UniqueName: 
\"kubernetes.io/projected/f4c80ffe-c152-4f0f-99f4-38f67d491021-kube-api-access-cn2zr\") pod \"nova-cell0-db-create-zpvmx\" (UID: \"f4c80ffe-c152-4f0f-99f4-38f67d491021\") " pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.585746 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq4dm\" (UniqueName: \"kubernetes.io/projected/1bb2907c-f83e-4870-bde2-cff480ce9c78-kube-api-access-qq4dm\") pod \"nova-api-db-create-nv8gl\" (UID: \"1bb2907c-f83e-4870-bde2-cff480ce9c78\") " pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.627200 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.666679 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x592z\" (UniqueName: \"kubernetes.io/projected/a1d7a078-cf37-45b3-9370-2df67437a53d-kube-api-access-x592z\") pod \"nova-cell1-db-create-9csfr\" (UID: \"a1d7a078-cf37-45b3-9370-2df67437a53d\") " pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.666752 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gklxt\" (UniqueName: \"kubernetes.io/projected/7ff3bb59-445d-44fd-927b-50bff286c5bf-kube-api-access-gklxt\") pod \"nova-api-30f2-account-create-update-v9jm4\" (UID: \"7ff3bb59-445d-44fd-927b-50bff286c5bf\") " pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.666945 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d7a078-cf37-45b3-9370-2df67437a53d-operator-scripts\") pod \"nova-cell1-db-create-9csfr\" (UID: \"a1d7a078-cf37-45b3-9370-2df67437a53d\") " 
pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.666983 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff3bb59-445d-44fd-927b-50bff286c5bf-operator-scripts\") pod \"nova-api-30f2-account-create-update-v9jm4\" (UID: \"7ff3bb59-445d-44fd-927b-50bff286c5bf\") " pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.672453 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d7a078-cf37-45b3-9370-2df67437a53d-operator-scripts\") pod \"nova-cell1-db-create-9csfr\" (UID: \"a1d7a078-cf37-45b3-9370-2df67437a53d\") " pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.679216 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-bde9-account-create-update-5hn59"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.680873 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.683654 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.684404 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x592z\" (UniqueName: \"kubernetes.io/projected/a1d7a078-cf37-45b3-9370-2df67437a53d-kube-api-access-x592z\") pod \"nova-cell1-db-create-9csfr\" (UID: \"a1d7a078-cf37-45b3-9370-2df67437a53d\") " pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.690625 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bde9-account-create-update-5hn59"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.726195 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.769111 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsslk\" (UniqueName: \"kubernetes.io/projected/ffe3475f-214d-4782-b998-a579e40723dc-kube-api-access-zsslk\") pod \"nova-cell0-bde9-account-create-update-5hn59\" (UID: \"ffe3475f-214d-4782-b998-a579e40723dc\") " pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.769225 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff3bb59-445d-44fd-927b-50bff286c5bf-operator-scripts\") pod \"nova-api-30f2-account-create-update-v9jm4\" (UID: \"7ff3bb59-445d-44fd-927b-50bff286c5bf\") " pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.769363 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe3475f-214d-4782-b998-a579e40723dc-operator-scripts\") pod \"nova-cell0-bde9-account-create-update-5hn59\" (UID: \"ffe3475f-214d-4782-b998-a579e40723dc\") " pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.769412 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gklxt\" (UniqueName: \"kubernetes.io/projected/7ff3bb59-445d-44fd-927b-50bff286c5bf-kube-api-access-gklxt\") pod \"nova-api-30f2-account-create-update-v9jm4\" (UID: \"7ff3bb59-445d-44fd-927b-50bff286c5bf\") " pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.770296 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff3bb59-445d-44fd-927b-50bff286c5bf-operator-scripts\") pod \"nova-api-30f2-account-create-update-v9jm4\" (UID: \"7ff3bb59-445d-44fd-927b-50bff286c5bf\") " pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.791835 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gklxt\" (UniqueName: \"kubernetes.io/projected/7ff3bb59-445d-44fd-927b-50bff286c5bf-kube-api-access-gklxt\") pod \"nova-api-30f2-account-create-update-v9jm4\" (UID: \"7ff3bb59-445d-44fd-927b-50bff286c5bf\") " pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.871108 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe3475f-214d-4782-b998-a579e40723dc-operator-scripts\") pod \"nova-cell0-bde9-account-create-update-5hn59\" (UID: \"ffe3475f-214d-4782-b998-a579e40723dc\") " 
pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.871560 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsslk\" (UniqueName: \"kubernetes.io/projected/ffe3475f-214d-4782-b998-a579e40723dc-kube-api-access-zsslk\") pod \"nova-cell0-bde9-account-create-update-5hn59\" (UID: \"ffe3475f-214d-4782-b998-a579e40723dc\") " pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.874832 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe3475f-214d-4782-b998-a579e40723dc-operator-scripts\") pod \"nova-cell0-bde9-account-create-update-5hn59\" (UID: \"ffe3475f-214d-4782-b998-a579e40723dc\") " pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.885975 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f0fc-account-create-update-tl2hm"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.887402 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.891376 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.898992 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsslk\" (UniqueName: \"kubernetes.io/projected/ffe3475f-214d-4782-b998-a579e40723dc-kube-api-access-zsslk\") pod \"nova-cell0-bde9-account-create-update-5hn59\" (UID: \"ffe3475f-214d-4782-b998-a579e40723dc\") " pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.901126 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f0fc-account-create-update-tl2hm"] Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.903358 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.973057 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-combined-ca-bundle\") pod \"03a48c07-f949-4376-874a-8d5758b60a9a\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.973179 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-log-httpd\") pod \"03a48c07-f949-4376-874a-8d5758b60a9a\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.973213 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-sg-core-conf-yaml\") pod 
\"03a48c07-f949-4376-874a-8d5758b60a9a\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.973240 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-config-data\") pod \"03a48c07-f949-4376-874a-8d5758b60a9a\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.973299 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg752\" (UniqueName: \"kubernetes.io/projected/03a48c07-f949-4376-874a-8d5758b60a9a-kube-api-access-sg752\") pod \"03a48c07-f949-4376-874a-8d5758b60a9a\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.973522 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-run-httpd\") pod \"03a48c07-f949-4376-874a-8d5758b60a9a\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.973569 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-scripts\") pod \"03a48c07-f949-4376-874a-8d5758b60a9a\" (UID: \"03a48c07-f949-4376-874a-8d5758b60a9a\") " Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.973984 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94267126-5e6b-4b8f-9b54-425d893afb6b-operator-scripts\") pod \"nova-cell1-f0fc-account-create-update-tl2hm\" (UID: \"94267126-5e6b-4b8f-9b54-425d893afb6b\") " pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.974083 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vtlf\" (UniqueName: \"kubernetes.io/projected/94267126-5e6b-4b8f-9b54-425d893afb6b-kube-api-access-2vtlf\") pod \"nova-cell1-f0fc-account-create-update-tl2hm\" (UID: \"94267126-5e6b-4b8f-9b54-425d893afb6b\") " pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.975567 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "03a48c07-f949-4376-874a-8d5758b60a9a" (UID: "03a48c07-f949-4376-874a-8d5758b60a9a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.975906 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "03a48c07-f949-4376-874a-8d5758b60a9a" (UID: "03a48c07-f949-4376-874a-8d5758b60a9a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.978502 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a48c07-f949-4376-874a-8d5758b60a9a-kube-api-access-sg752" (OuterVolumeSpecName: "kube-api-access-sg752") pod "03a48c07-f949-4376-874a-8d5758b60a9a" (UID: "03a48c07-f949-4376-874a-8d5758b60a9a"). InnerVolumeSpecName "kube-api-access-sg752". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.979668 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:10 crc kubenswrapper[4971]: I0320 07:13:10.980139 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-scripts" (OuterVolumeSpecName: "scripts") pod "03a48c07-f949-4376-874a-8d5758b60a9a" (UID: "03a48c07-f949-4376-874a-8d5758b60a9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.013617 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.029639 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.042583 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "03a48c07-f949-4376-874a-8d5758b60a9a" (UID: "03a48c07-f949-4376-874a-8d5758b60a9a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.059436 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03a48c07-f949-4376-874a-8d5758b60a9a" (UID: "03a48c07-f949-4376-874a-8d5758b60a9a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.077168 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94267126-5e6b-4b8f-9b54-425d893afb6b-operator-scripts\") pod \"nova-cell1-f0fc-account-create-update-tl2hm\" (UID: \"94267126-5e6b-4b8f-9b54-425d893afb6b\") " pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.077297 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vtlf\" (UniqueName: \"kubernetes.io/projected/94267126-5e6b-4b8f-9b54-425d893afb6b-kube-api-access-2vtlf\") pod \"nova-cell1-f0fc-account-create-update-tl2hm\" (UID: \"94267126-5e6b-4b8f-9b54-425d893afb6b\") " pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.077419 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.077435 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.077448 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.077462 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a48c07-f949-4376-874a-8d5758b60a9a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 
07:13:11.077475 4971 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.077489 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg752\" (UniqueName: \"kubernetes.io/projected/03a48c07-f949-4376-874a-8d5758b60a9a-kube-api-access-sg752\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.078440 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94267126-5e6b-4b8f-9b54-425d893afb6b-operator-scripts\") pod \"nova-cell1-f0fc-account-create-update-tl2hm\" (UID: \"94267126-5e6b-4b8f-9b54-425d893afb6b\") " pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.084582 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-config-data" (OuterVolumeSpecName: "config-data") pod "03a48c07-f949-4376-874a-8d5758b60a9a" (UID: "03a48c07-f949-4376-874a-8d5758b60a9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.094226 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vtlf\" (UniqueName: \"kubernetes.io/projected/94267126-5e6b-4b8f-9b54-425d893afb6b-kube-api-access-2vtlf\") pod \"nova-cell1-f0fc-account-create-update-tl2hm\" (UID: \"94267126-5e6b-4b8f-9b54-425d893afb6b\") " pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.180379 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a48c07-f949-4376-874a-8d5758b60a9a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.205015 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:11 crc kubenswrapper[4971]: W0320 07:13:11.302963 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bb2907c_f83e_4870_bde2_cff480ce9c78.slice/crio-c0b48e9f32b6f7a704b3736c7248e4bfc597ee0a5fbe087427a88e8f2a1dcf15 WatchSource:0}: Error finding container c0b48e9f32b6f7a704b3736c7248e4bfc597ee0a5fbe087427a88e8f2a1dcf15: Status 404 returned error can't find the container with id c0b48e9f32b6f7a704b3736c7248e4bfc597ee0a5fbe087427a88e8f2a1dcf15 Mar 20 07:13:11 crc kubenswrapper[4971]: W0320 07:13:11.304575 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4c80ffe_c152_4f0f_99f4_38f67d491021.slice/crio-d6813a1c3f579df8a0163718ac1b600e0803cc40218cdb1354b308ed344ae8c1 WatchSource:0}: Error finding container d6813a1c3f579df8a0163718ac1b600e0803cc40218cdb1354b308ed344ae8c1: Status 404 returned error can't find the container with id 
d6813a1c3f579df8a0163718ac1b600e0803cc40218cdb1354b308ed344ae8c1 Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.334242 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nv8gl"] Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.361170 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zpvmx"] Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.421569 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zpvmx" event={"ID":"f4c80ffe-c152-4f0f-99f4-38f67d491021","Type":"ContainerStarted","Data":"d6813a1c3f579df8a0163718ac1b600e0803cc40218cdb1354b308ed344ae8c1"} Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.423076 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nv8gl" event={"ID":"1bb2907c-f83e-4870-bde2-cff480ce9c78","Type":"ContainerStarted","Data":"c0b48e9f32b6f7a704b3736c7248e4bfc597ee0a5fbe087427a88e8f2a1dcf15"} Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.425553 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b7c8d202-039a-426e-b5e5-916bb257e441","Type":"ContainerStarted","Data":"c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df"} Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.430507 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"03a48c07-f949-4376-874a-8d5758b60a9a","Type":"ContainerDied","Data":"1e668549de9d6a1794c53064604b079dc82499d00478a42372c389e9b4274445"} Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.430571 4971 scope.go:117] "RemoveContainer" containerID="da457a1601a60d9cbdbfe4e2ac48670fa1b80ef2a54f12a8442eac7a97202c46" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.430658 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.454052 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.137522841 podStartE2EDuration="12.454033482s" podCreationTimestamp="2026-03-20 07:12:59 +0000 UTC" firstStartedPulling="2026-03-20 07:13:00.138057383 +0000 UTC m=+1402.117931551" lastFinishedPulling="2026-03-20 07:13:10.454568054 +0000 UTC m=+1412.434442192" observedRunningTime="2026-03-20 07:13:11.445347089 +0000 UTC m=+1413.425221227" watchObservedRunningTime="2026-03-20 07:13:11.454033482 +0000 UTC m=+1413.433907620" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.492564 4971 scope.go:117] "RemoveContainer" containerID="2d0d52a9adadceb321d13de081d4c987d1f4efb51e325355c60dddcc42cde7c0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.558396 4971 scope.go:117] "RemoveContainer" containerID="aece527112bdee1e807769fbd04b183b04d566ead4cf6b30a2d975bcd179d608" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.564145 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.616842 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.676075 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9csfr"] Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.698992 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:11 crc kubenswrapper[4971]: E0320 07:13:11.699474 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="ceilometer-notification-agent" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.699491 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="ceilometer-notification-agent" Mar 20 07:13:11 crc kubenswrapper[4971]: E0320 07:13:11.699507 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="sg-core" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.699513 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="sg-core" Mar 20 07:13:11 crc kubenswrapper[4971]: E0320 07:13:11.699525 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="proxy-httpd" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.699531 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="proxy-httpd" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.699797 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="ceilometer-notification-agent" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.699816 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="proxy-httpd" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.699842 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" containerName="sg-core" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.707126 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.709486 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.709784 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.739746 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.773422 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bde9-account-create-update-5hn59"] Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.782685 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30f2-account-create-update-v9jm4"] Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.829173 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-run-httpd\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.829247 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-log-httpd\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.829337 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " 
pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.829375 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-config-data\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.829439 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-scripts\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.829791 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87cbs\" (UniqueName: \"kubernetes.io/projected/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-kube-api-access-87cbs\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.829851 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.875559 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f0fc-account-create-update-tl2hm"] Mar 20 07:13:11 crc kubenswrapper[4971]: W0320 07:13:11.878599 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94267126_5e6b_4b8f_9b54_425d893afb6b.slice/crio-c437feaf63bf4f2ae8e8c029dddf3571692f80f6811d559ffbb932db5d66c07b WatchSource:0}: Error finding container c437feaf63bf4f2ae8e8c029dddf3571692f80f6811d559ffbb932db5d66c07b: Status 404 returned error can't find the container with id c437feaf63bf4f2ae8e8c029dddf3571692f80f6811d559ffbb932db5d66c07b Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.931380 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.931775 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-config-data\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.931811 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-scripts\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.931862 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87cbs\" (UniqueName: \"kubernetes.io/projected/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-kube-api-access-87cbs\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.931882 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.931952 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-run-httpd\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.931970 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-log-httpd\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.932405 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-log-httpd\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.936021 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-run-httpd\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.942866 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-config-data\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.945293 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.947007 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.950498 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-scripts\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:11 crc kubenswrapper[4971]: I0320 07:13:11.952312 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87cbs\" (UniqueName: \"kubernetes.io/projected/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-kube-api-access-87cbs\") pod \"ceilometer-0\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") " pod="openstack/ceilometer-0" Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.130389 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.444939 4971 generic.go:334] "Generic (PLEG): container finished" podID="1bb2907c-f83e-4870-bde2-cff480ce9c78" containerID="16b1faff31785032898871d7f78f8190dd2a353c88127055a1284e643fc00ee1" exitCode=0 Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.445021 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nv8gl" event={"ID":"1bb2907c-f83e-4870-bde2-cff480ce9c78","Type":"ContainerDied","Data":"16b1faff31785032898871d7f78f8190dd2a353c88127055a1284e643fc00ee1"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.446887 4971 generic.go:334] "Generic (PLEG): container finished" podID="ffe3475f-214d-4782-b998-a579e40723dc" containerID="c8dce6132c0db7b826887ea68485e6926d6993786a3e7ddf0bd242030d28557d" exitCode=0 Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.446927 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bde9-account-create-update-5hn59" event={"ID":"ffe3475f-214d-4782-b998-a579e40723dc","Type":"ContainerDied","Data":"c8dce6132c0db7b826887ea68485e6926d6993786a3e7ddf0bd242030d28557d"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.446944 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bde9-account-create-update-5hn59" event={"ID":"ffe3475f-214d-4782-b998-a579e40723dc","Type":"ContainerStarted","Data":"a5673ca673fee662c142661e550ffb13e76bc64e15a4c1cdfe29610360a50b8f"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.453822 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" event={"ID":"94267126-5e6b-4b8f-9b54-425d893afb6b","Type":"ContainerStarted","Data":"68284f1f3498c3544a13dd866e1ad49696c1cf23cedffe7cf47b35b59e699421"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.453889 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" event={"ID":"94267126-5e6b-4b8f-9b54-425d893afb6b","Type":"ContainerStarted","Data":"c437feaf63bf4f2ae8e8c029dddf3571692f80f6811d559ffbb932db5d66c07b"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.456420 4971 generic.go:334] "Generic (PLEG): container finished" podID="a1d7a078-cf37-45b3-9370-2df67437a53d" containerID="a476691f9b3abf540342b199004e9fe8a0cb58643f58fc4b0ace659b56cc66dc" exitCode=0 Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.456480 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9csfr" event={"ID":"a1d7a078-cf37-45b3-9370-2df67437a53d","Type":"ContainerDied","Data":"a476691f9b3abf540342b199004e9fe8a0cb58643f58fc4b0ace659b56cc66dc"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.456503 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9csfr" event={"ID":"a1d7a078-cf37-45b3-9370-2df67437a53d","Type":"ContainerStarted","Data":"ae1813288528eaf9cf6f717fe7242e6736b87c94df150343c002cc42558d0dc8"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.466372 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4c80ffe-c152-4f0f-99f4-38f67d491021" containerID="f62e0cf03afdd39a3ebc3d89fcbc2de75a784eae11e65894a8324b66b714dca4" exitCode=0 Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.466514 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zpvmx" event={"ID":"f4c80ffe-c152-4f0f-99f4-38f67d491021","Type":"ContainerDied","Data":"f62e0cf03afdd39a3ebc3d89fcbc2de75a784eae11e65894a8324b66b714dca4"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.471803 4971 generic.go:334] "Generic (PLEG): container finished" podID="7ff3bb59-445d-44fd-927b-50bff286c5bf" containerID="2d165ae39e404c6b7bad9797db6ef0ab755622644236c7d233aa70de83d5923b" exitCode=0 Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.471899 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30f2-account-create-update-v9jm4" event={"ID":"7ff3bb59-445d-44fd-927b-50bff286c5bf","Type":"ContainerDied","Data":"2d165ae39e404c6b7bad9797db6ef0ab755622644236c7d233aa70de83d5923b"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.471949 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30f2-account-create-update-v9jm4" event={"ID":"7ff3bb59-445d-44fd-927b-50bff286c5bf","Type":"ContainerStarted","Data":"c78847b804e6230d421f1445240c7d43883b57e192067d71c44e7b6b9f302def"} Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.545651 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.605286 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:12 crc kubenswrapper[4971]: W0320 07:13:12.631058 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddad46841_e516_4d94_ac4f_1d1bcb47d3b6.slice/crio-8a00bde29524607439c7960dc8e4f18764a5c93434bcee2b54294f5a127f031a WatchSource:0}: Error finding container 8a00bde29524607439c7960dc8e4f18764a5c93434bcee2b54294f5a127f031a: Status 404 returned error can't find the container with id 8a00bde29524607439c7960dc8e4f18764a5c93434bcee2b54294f5a127f031a Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.745121 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a48c07-f949-4376-874a-8d5758b60a9a" path="/var/lib/kubelet/pods/03a48c07-f949-4376-874a-8d5758b60a9a/volumes" Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.892971 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:12 crc kubenswrapper[4971]: I0320 07:13:12.898909 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.428512 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.429475 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerName="glance-log" containerID="cri-o://ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b" gracePeriod=30 Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.430075 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerName="glance-httpd" containerID="cri-o://5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033" gracePeriod=30 Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.500468 4971 generic.go:334] "Generic (PLEG): container finished" podID="94267126-5e6b-4b8f-9b54-425d893afb6b" containerID="68284f1f3498c3544a13dd866e1ad49696c1cf23cedffe7cf47b35b59e699421" exitCode=0 Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.500683 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" event={"ID":"94267126-5e6b-4b8f-9b54-425d893afb6b","Type":"ContainerDied","Data":"68284f1f3498c3544a13dd866e1ad49696c1cf23cedffe7cf47b35b59e699421"} Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.511981 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerStarted","Data":"8a00bde29524607439c7960dc8e4f18764a5c93434bcee2b54294f5a127f031a"} Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.898128 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.980869 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94267126-5e6b-4b8f-9b54-425d893afb6b-operator-scripts\") pod \"94267126-5e6b-4b8f-9b54-425d893afb6b\" (UID: \"94267126-5e6b-4b8f-9b54-425d893afb6b\") " Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.980959 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vtlf\" (UniqueName: \"kubernetes.io/projected/94267126-5e6b-4b8f-9b54-425d893afb6b-kube-api-access-2vtlf\") pod \"94267126-5e6b-4b8f-9b54-425d893afb6b\" (UID: \"94267126-5e6b-4b8f-9b54-425d893afb6b\") " Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.982977 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94267126-5e6b-4b8f-9b54-425d893afb6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94267126-5e6b-4b8f-9b54-425d893afb6b" (UID: "94267126-5e6b-4b8f-9b54-425d893afb6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:13:13 crc kubenswrapper[4971]: I0320 07:13:13.988750 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94267126-5e6b-4b8f-9b54-425d893afb6b-kube-api-access-2vtlf" (OuterVolumeSpecName: "kube-api-access-2vtlf") pod "94267126-5e6b-4b8f-9b54-425d893afb6b" (UID: "94267126-5e6b-4b8f-9b54-425d893afb6b"). InnerVolumeSpecName "kube-api-access-2vtlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.064323 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.068689 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.083728 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94267126-5e6b-4b8f-9b54-425d893afb6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.083763 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vtlf\" (UniqueName: \"kubernetes.io/projected/94267126-5e6b-4b8f-9b54-425d893afb6b-kube-api-access-2vtlf\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.095041 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.097418 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.102314 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184501 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bb2907c-f83e-4870-bde2-cff480ce9c78-operator-scripts\") pod \"1bb2907c-f83e-4870-bde2-cff480ce9c78\" (UID: \"1bb2907c-f83e-4870-bde2-cff480ce9c78\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184649 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c80ffe-c152-4f0f-99f4-38f67d491021-operator-scripts\") pod \"f4c80ffe-c152-4f0f-99f4-38f67d491021\" (UID: \"f4c80ffe-c152-4f0f-99f4-38f67d491021\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184672 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq4dm\" (UniqueName: \"kubernetes.io/projected/1bb2907c-f83e-4870-bde2-cff480ce9c78-kube-api-access-qq4dm\") pod \"1bb2907c-f83e-4870-bde2-cff480ce9c78\" (UID: \"1bb2907c-f83e-4870-bde2-cff480ce9c78\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184737 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x592z\" (UniqueName: \"kubernetes.io/projected/a1d7a078-cf37-45b3-9370-2df67437a53d-kube-api-access-x592z\") pod \"a1d7a078-cf37-45b3-9370-2df67437a53d\" (UID: \"a1d7a078-cf37-45b3-9370-2df67437a53d\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184808 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d7a078-cf37-45b3-9370-2df67437a53d-operator-scripts\") pod \"a1d7a078-cf37-45b3-9370-2df67437a53d\" (UID: \"a1d7a078-cf37-45b3-9370-2df67437a53d\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184843 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gklxt\" (UniqueName: \"kubernetes.io/projected/7ff3bb59-445d-44fd-927b-50bff286c5bf-kube-api-access-gklxt\") pod \"7ff3bb59-445d-44fd-927b-50bff286c5bf\" (UID: \"7ff3bb59-445d-44fd-927b-50bff286c5bf\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184873 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn2zr\" (UniqueName: \"kubernetes.io/projected/f4c80ffe-c152-4f0f-99f4-38f67d491021-kube-api-access-cn2zr\") pod \"f4c80ffe-c152-4f0f-99f4-38f67d491021\" (UID: \"f4c80ffe-c152-4f0f-99f4-38f67d491021\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184900 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe3475f-214d-4782-b998-a579e40723dc-operator-scripts\") pod \"ffe3475f-214d-4782-b998-a579e40723dc\" (UID: \"ffe3475f-214d-4782-b998-a579e40723dc\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184926 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff3bb59-445d-44fd-927b-50bff286c5bf-operator-scripts\") pod \"7ff3bb59-445d-44fd-927b-50bff286c5bf\" (UID: \"7ff3bb59-445d-44fd-927b-50bff286c5bf\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.184957 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsslk\" (UniqueName: \"kubernetes.io/projected/ffe3475f-214d-4782-b998-a579e40723dc-kube-api-access-zsslk\") pod \"ffe3475f-214d-4782-b998-a579e40723dc\" (UID: \"ffe3475f-214d-4782-b998-a579e40723dc\") " Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.186087 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d7a078-cf37-45b3-9370-2df67437a53d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"a1d7a078-cf37-45b3-9370-2df67437a53d" (UID: "a1d7a078-cf37-45b3-9370-2df67437a53d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.186444 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bb2907c-f83e-4870-bde2-cff480ce9c78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1bb2907c-f83e-4870-bde2-cff480ce9c78" (UID: "1bb2907c-f83e-4870-bde2-cff480ce9c78"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.186823 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c80ffe-c152-4f0f-99f4-38f67d491021-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4c80ffe-c152-4f0f-99f4-38f67d491021" (UID: "f4c80ffe-c152-4f0f-99f4-38f67d491021"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.187707 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe3475f-214d-4782-b998-a579e40723dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ffe3475f-214d-4782-b998-a579e40723dc" (UID: "ffe3475f-214d-4782-b998-a579e40723dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.187999 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff3bb59-445d-44fd-927b-50bff286c5bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ff3bb59-445d-44fd-927b-50bff286c5bf" (UID: "7ff3bb59-445d-44fd-927b-50bff286c5bf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.194997 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c80ffe-c152-4f0f-99f4-38f67d491021-kube-api-access-cn2zr" (OuterVolumeSpecName: "kube-api-access-cn2zr") pod "f4c80ffe-c152-4f0f-99f4-38f67d491021" (UID: "f4c80ffe-c152-4f0f-99f4-38f67d491021"). InnerVolumeSpecName "kube-api-access-cn2zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.199779 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe3475f-214d-4782-b998-a579e40723dc-kube-api-access-zsslk" (OuterVolumeSpecName: "kube-api-access-zsslk") pod "ffe3475f-214d-4782-b998-a579e40723dc" (UID: "ffe3475f-214d-4782-b998-a579e40723dc"). InnerVolumeSpecName "kube-api-access-zsslk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.199825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff3bb59-445d-44fd-927b-50bff286c5bf-kube-api-access-gklxt" (OuterVolumeSpecName: "kube-api-access-gklxt") pod "7ff3bb59-445d-44fd-927b-50bff286c5bf" (UID: "7ff3bb59-445d-44fd-927b-50bff286c5bf"). InnerVolumeSpecName "kube-api-access-gklxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.199912 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d7a078-cf37-45b3-9370-2df67437a53d-kube-api-access-x592z" (OuterVolumeSpecName: "kube-api-access-x592z") pod "a1d7a078-cf37-45b3-9370-2df67437a53d" (UID: "a1d7a078-cf37-45b3-9370-2df67437a53d"). InnerVolumeSpecName "kube-api-access-x592z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.199979 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb2907c-f83e-4870-bde2-cff480ce9c78-kube-api-access-qq4dm" (OuterVolumeSpecName: "kube-api-access-qq4dm") pod "1bb2907c-f83e-4870-bde2-cff480ce9c78" (UID: "1bb2907c-f83e-4870-bde2-cff480ce9c78"). InnerVolumeSpecName "kube-api-access-qq4dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.286927 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gklxt\" (UniqueName: \"kubernetes.io/projected/7ff3bb59-445d-44fd-927b-50bff286c5bf-kube-api-access-gklxt\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.286958 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn2zr\" (UniqueName: \"kubernetes.io/projected/f4c80ffe-c152-4f0f-99f4-38f67d491021-kube-api-access-cn2zr\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.286970 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe3475f-214d-4782-b998-a579e40723dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.286980 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff3bb59-445d-44fd-927b-50bff286c5bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.286988 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsslk\" (UniqueName: \"kubernetes.io/projected/ffe3475f-214d-4782-b998-a579e40723dc-kube-api-access-zsslk\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.286998 4971 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bb2907c-f83e-4870-bde2-cff480ce9c78-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.287006 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c80ffe-c152-4f0f-99f4-38f67d491021-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.287014 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq4dm\" (UniqueName: \"kubernetes.io/projected/1bb2907c-f83e-4870-bde2-cff480ce9c78-kube-api-access-qq4dm\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.287023 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x592z\" (UniqueName: \"kubernetes.io/projected/a1d7a078-cf37-45b3-9370-2df67437a53d-kube-api-access-x592z\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.287031 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1d7a078-cf37-45b3-9370-2df67437a53d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.531792 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zpvmx" event={"ID":"f4c80ffe-c152-4f0f-99f4-38f67d491021","Type":"ContainerDied","Data":"d6813a1c3f579df8a0163718ac1b600e0803cc40218cdb1354b308ed344ae8c1"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.531844 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zpvmx" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.531865 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6813a1c3f579df8a0163718ac1b600e0803cc40218cdb1354b308ed344ae8c1" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.533987 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30f2-account-create-update-v9jm4" event={"ID":"7ff3bb59-445d-44fd-927b-50bff286c5bf","Type":"ContainerDied","Data":"c78847b804e6230d421f1445240c7d43883b57e192067d71c44e7b6b9f302def"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.534030 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c78847b804e6230d421f1445240c7d43883b57e192067d71c44e7b6b9f302def" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.534068 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30f2-account-create-update-v9jm4" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.536447 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nv8gl" event={"ID":"1bb2907c-f83e-4870-bde2-cff480ce9c78","Type":"ContainerDied","Data":"c0b48e9f32b6f7a704b3736c7248e4bfc597ee0a5fbe087427a88e8f2a1dcf15"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.536477 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0b48e9f32b6f7a704b3736c7248e4bfc597ee0a5fbe087427a88e8f2a1dcf15" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.536459 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-nv8gl" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.538435 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bde9-account-create-update-5hn59" event={"ID":"ffe3475f-214d-4782-b998-a579e40723dc","Type":"ContainerDied","Data":"a5673ca673fee662c142661e550ffb13e76bc64e15a4c1cdfe29610360a50b8f"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.538465 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5673ca673fee662c142661e550ffb13e76bc64e15a4c1cdfe29610360a50b8f" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.538519 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bde9-account-create-update-5hn59" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.550255 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerStarted","Data":"85b3151eca311b6af0b4f68f197433eab29d02fae0b2d474851e3f032d5d0c6e"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.550441 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerStarted","Data":"76786e6e799f6c898f29c4a5e9ebdd6d44d914abe08e38178d63401a99c6c7d4"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.552281 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.552368 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f0fc-account-create-update-tl2hm" event={"ID":"94267126-5e6b-4b8f-9b54-425d893afb6b","Type":"ContainerDied","Data":"c437feaf63bf4f2ae8e8c029dddf3571692f80f6811d559ffbb932db5d66c07b"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.552397 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c437feaf63bf4f2ae8e8c029dddf3571692f80f6811d559ffbb932db5d66c07b" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.565247 4971 generic.go:334] "Generic (PLEG): container finished" podID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerID="ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b" exitCode=143 Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.565325 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"493ce1ed-38cb-48e6-b8f4-4aaed4934de8","Type":"ContainerDied","Data":"ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.567513 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9csfr" event={"ID":"a1d7a078-cf37-45b3-9370-2df67437a53d","Type":"ContainerDied","Data":"ae1813288528eaf9cf6f717fe7242e6736b87c94df150343c002cc42558d0dc8"} Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.567557 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae1813288528eaf9cf6f717fe7242e6736b87c94df150343c002cc42558d0dc8" Mar 20 07:13:14 crc kubenswrapper[4971]: I0320 07:13:14.567692 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9csfr" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.051557 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6q9k"] Mar 20 07:13:16 crc kubenswrapper[4971]: E0320 07:13:16.053371 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c80ffe-c152-4f0f-99f4-38f67d491021" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.053443 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c80ffe-c152-4f0f-99f4-38f67d491021" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: E0320 07:13:16.053511 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb2907c-f83e-4870-bde2-cff480ce9c78" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.053587 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb2907c-f83e-4870-bde2-cff480ce9c78" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: E0320 07:13:16.053677 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff3bb59-445d-44fd-927b-50bff286c5bf" containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.053733 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff3bb59-445d-44fd-927b-50bff286c5bf" containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: E0320 07:13:16.053788 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94267126-5e6b-4b8f-9b54-425d893afb6b" containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.053844 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="94267126-5e6b-4b8f-9b54-425d893afb6b" containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: E0320 07:13:16.053905 4971 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ffe3475f-214d-4782-b998-a579e40723dc" containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.053954 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe3475f-214d-4782-b998-a579e40723dc" containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: E0320 07:13:16.054011 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d7a078-cf37-45b3-9370-2df67437a53d" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.054062 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d7a078-cf37-45b3-9370-2df67437a53d" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.054268 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff3bb59-445d-44fd-927b-50bff286c5bf" containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.054330 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe3475f-214d-4782-b998-a579e40723dc" containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.054390 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb2907c-f83e-4870-bde2-cff480ce9c78" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.054444 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c80ffe-c152-4f0f-99f4-38f67d491021" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.054499 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d7a078-cf37-45b3-9370-2df67437a53d" containerName="mariadb-database-create" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.054552 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="94267126-5e6b-4b8f-9b54-425d893afb6b" 
containerName="mariadb-account-create-update" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.055280 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.058424 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.058738 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.058908 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-znqz8" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.066156 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6q9k"] Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.125148 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-config-data\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.125205 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5km6x\" (UniqueName: \"kubernetes.io/projected/6af1289c-26d4-4865-946f-891c126beb49-kube-api-access-5km6x\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.125315 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-scripts\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.125524 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.227642 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5km6x\" (UniqueName: \"kubernetes.io/projected/6af1289c-26d4-4865-946f-891c126beb49-kube-api-access-5km6x\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.227686 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-config-data\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.227740 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-scripts\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.227818 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.234217 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-scripts\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.234309 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.235431 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-config-data\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.247126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5km6x\" (UniqueName: \"kubernetes.io/projected/6af1289c-26d4-4865-946f-891c126beb49-kube-api-access-5km6x\") pod \"nova-cell0-conductor-db-sync-k6q9k\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") " pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.372832 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k6q9k" Mar 20 07:13:16 crc kubenswrapper[4971]: W0320 07:13:16.894947 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6af1289c_26d4_4865_946f_891c126beb49.slice/crio-4834e3334e1f3e3e04ea136febeadd41417d71f3db63193959b6aeff16ccb212 WatchSource:0}: Error finding container 4834e3334e1f3e3e04ea136febeadd41417d71f3db63193959b6aeff16ccb212: Status 404 returned error can't find the container with id 4834e3334e1f3e3e04ea136febeadd41417d71f3db63193959b6aeff16ccb212 Mar 20 07:13:16 crc kubenswrapper[4971]: I0320 07:13:16.895174 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6q9k"] Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.181342 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.274308 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66b6544788-qm44x"] Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.274552 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66b6544788-qm44x" podUID="601f5587-0478-4903-9909-ebe0dee36539" containerName="neutron-api" containerID="cri-o://76223294f76d4e966d36915f2cf219b65fa94718b6cbae2ce9fb396e2800dc1a" gracePeriod=30 Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.274710 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66b6544788-qm44x" podUID="601f5587-0478-4903-9909-ebe0dee36539" containerName="neutron-httpd" containerID="cri-o://dff293a7209e9d5eb574d247516eb149910bd77be84c64f2a666082d2c149d12" gracePeriod=30 Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.525935 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.561007 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-combined-ca-bundle\") pod \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.561051 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-config-data\") pod \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.561068 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.561096 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-public-tls-certs\") pod \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.561168 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-logs\") pod \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.561193 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-scripts\") pod \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.561318 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-httpd-run\") pod \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.561378 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpq26\" (UniqueName: \"kubernetes.io/projected/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-kube-api-access-zpq26\") pod \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\" (UID: \"493ce1ed-38cb-48e6-b8f4-4aaed4934de8\") " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.565044 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-logs" (OuterVolumeSpecName: "logs") pod "493ce1ed-38cb-48e6-b8f4-4aaed4934de8" (UID: "493ce1ed-38cb-48e6-b8f4-4aaed4934de8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.565245 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "493ce1ed-38cb-48e6-b8f4-4aaed4934de8" (UID: "493ce1ed-38cb-48e6-b8f4-4aaed4934de8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.585681 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-kube-api-access-zpq26" (OuterVolumeSpecName: "kube-api-access-zpq26") pod "493ce1ed-38cb-48e6-b8f4-4aaed4934de8" (UID: "493ce1ed-38cb-48e6-b8f4-4aaed4934de8"). InnerVolumeSpecName "kube-api-access-zpq26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.585681 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-scripts" (OuterVolumeSpecName: "scripts") pod "493ce1ed-38cb-48e6-b8f4-4aaed4934de8" (UID: "493ce1ed-38cb-48e6-b8f4-4aaed4934de8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.590731 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "493ce1ed-38cb-48e6-b8f4-4aaed4934de8" (UID: "493ce1ed-38cb-48e6-b8f4-4aaed4934de8"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.626295 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerStarted","Data":"f6109e74497394abf265b9e2ad0fc5cd45091d73c1d19d87c8fdbe6dc1c50da6"} Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.636699 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "493ce1ed-38cb-48e6-b8f4-4aaed4934de8" (UID: "493ce1ed-38cb-48e6-b8f4-4aaed4934de8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.646260 4971 generic.go:334] "Generic (PLEG): container finished" podID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerID="5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033" exitCode=0 Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.646369 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"493ce1ed-38cb-48e6-b8f4-4aaed4934de8","Type":"ContainerDied","Data":"5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033"} Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.646397 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"493ce1ed-38cb-48e6-b8f4-4aaed4934de8","Type":"ContainerDied","Data":"ba80d7d23d36fb024375e89ebd594cea0ab9040052c3984f0193e33c97b7acb9"} Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.646414 4971 scope.go:117] "RemoveContainer" containerID="5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.646530 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.650726 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-config-data" (OuterVolumeSpecName: "config-data") pod "493ce1ed-38cb-48e6-b8f4-4aaed4934de8" (UID: "493ce1ed-38cb-48e6-b8f4-4aaed4934de8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.651713 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k6q9k" event={"ID":"6af1289c-26d4-4865-946f-891c126beb49","Type":"ContainerStarted","Data":"4834e3334e1f3e3e04ea136febeadd41417d71f3db63193959b6aeff16ccb212"} Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.653837 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "493ce1ed-38cb-48e6-b8f4-4aaed4934de8" (UID: "493ce1ed-38cb-48e6-b8f4-4aaed4934de8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.659855 4971 generic.go:334] "Generic (PLEG): container finished" podID="601f5587-0478-4903-9909-ebe0dee36539" containerID="dff293a7209e9d5eb574d247516eb149910bd77be84c64f2a666082d2c149d12" exitCode=0 Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.659904 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b6544788-qm44x" event={"ID":"601f5587-0478-4903-9909-ebe0dee36539","Type":"ContainerDied","Data":"dff293a7209e9d5eb574d247516eb149910bd77be84c64f2a666082d2c149d12"} Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.663547 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.663578 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.663625 4971 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.663650 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.663666 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.663674 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.663681 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.663690 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpq26\" (UniqueName: \"kubernetes.io/projected/493ce1ed-38cb-48e6-b8f4-4aaed4934de8-kube-api-access-zpq26\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.673282 4971 scope.go:117] "RemoveContainer" containerID="ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.687331 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.712731 4971 scope.go:117] 
"RemoveContainer" containerID="5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033" Mar 20 07:13:17 crc kubenswrapper[4971]: E0320 07:13:17.713200 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033\": container with ID starting with 5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033 not found: ID does not exist" containerID="5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.713233 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033"} err="failed to get container status \"5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033\": rpc error: code = NotFound desc = could not find container \"5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033\": container with ID starting with 5c053750db84f1f2b63065eafe30dd2a4b52b4191a5d5c84c1cb782b06050033 not found: ID does not exist" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.713256 4971 scope.go:117] "RemoveContainer" containerID="ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b" Mar 20 07:13:17 crc kubenswrapper[4971]: E0320 07:13:17.713549 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b\": container with ID starting with ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b not found: ID does not exist" containerID="ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.713575 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b"} err="failed to get container status \"ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b\": rpc error: code = NotFound desc = could not find container \"ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b\": container with ID starting with ffd9a902006f3587e7094f8f94475e6cfc20d4b7c3b1bbb1950d42e6b2041e4b not found: ID does not exist" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.766151 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.985572 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:13:17 crc kubenswrapper[4971]: I0320 07:13:17.995140 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.007999 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:13:18 crc kubenswrapper[4971]: E0320 07:13:18.008488 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerName="glance-log" Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.008516 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerName="glance-log" Mar 20 07:13:18 crc kubenswrapper[4971]: E0320 07:13:18.008550 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerName="glance-httpd" Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.008558 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerName="glance-httpd" Mar 20 07:13:18 crc 
kubenswrapper[4971]: I0320 07:13:18.008799 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerName="glance-log"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.008819 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" containerName="glance-httpd"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.010038 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.014833 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.015119 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.036693 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.072649 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-logs\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.072737 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.072785 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.072860 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khpp7\" (UniqueName: \"kubernetes.io/projected/c7c37921-0f32-4a99-bf40-e960c400a9ac-kube-api-access-khpp7\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.072881 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.072923 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.072967 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.073015 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.175194 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.175244 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.175303 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.175335 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.175378 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-logs\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.175427 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.175456 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.175504 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khpp7\" (UniqueName: \"kubernetes.io/projected/c7c37921-0f32-4a99-bf40-e960c400a9ac-kube-api-access-khpp7\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.179653 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.180006 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-logs\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.180271 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.180379 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.180384 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.180470 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.187501 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.194939 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khpp7\" (UniqueName: \"kubernetes.io/projected/c7c37921-0f32-4a99-bf40-e960c400a9ac-kube-api-access-khpp7\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.210991 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.341896 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.744419 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493ce1ed-38cb-48e6-b8f4-4aaed4934de8" path="/var/lib/kubelet/pods/493ce1ed-38cb-48e6-b8f4-4aaed4934de8/volumes"
Mar 20 07:13:18 crc kubenswrapper[4971]: I0320 07:13:18.927270 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 07:13:18 crc kubenswrapper[4971]: W0320 07:13:18.932759 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7c37921_0f32_4a99_bf40_e960c400a9ac.slice/crio-a96535dfac343fba419a6ffd44c54afad8b8d28906a76b17e4cbedcbc2a2703c WatchSource:0}: Error finding container a96535dfac343fba419a6ffd44c54afad8b8d28906a76b17e4cbedcbc2a2703c: Status 404 returned error can't find the container with id a96535dfac343fba419a6ffd44c54afad8b8d28906a76b17e4cbedcbc2a2703c
Mar 20 07:13:19 crc kubenswrapper[4971]: I0320 07:13:19.705195 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c37921-0f32-4a99-bf40-e960c400a9ac","Type":"ContainerStarted","Data":"87f82d49936752bb6c84997b496c4f49bd0051001c3fe0ed7aff5c22606bc913"}
Mar 20 07:13:19 crc kubenswrapper[4971]: I0320 07:13:19.705252 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c37921-0f32-4a99-bf40-e960c400a9ac","Type":"ContainerStarted","Data":"a96535dfac343fba419a6ffd44c54afad8b8d28906a76b17e4cbedcbc2a2703c"}
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.042016 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.042266 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerName="glance-log" containerID="cri-o://f20ece9bc6026beb1e2b728e8620ad88489184b8b8369888b69279374937d9eb" gracePeriod=30
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.042567 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerName="glance-httpd" containerID="cri-o://a62d88168869a6e92af414f4a661b2d5e26b67f260c2d5ae443b2b09fc993a58" gracePeriod=30
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.164028 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.164532 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.164578 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl"
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.165568 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae2e826d251bbaee40981cb044e2215f19a7929bd69583c144888da9a8754c15"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.165640 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://ae2e826d251bbaee40981cb044e2215f19a7929bd69583c144888da9a8754c15" gracePeriod=600
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.716646 4971 generic.go:334] "Generic (PLEG): container finished" podID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerID="f20ece9bc6026beb1e2b728e8620ad88489184b8b8369888b69279374937d9eb" exitCode=143
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.716729 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed5de2f-85e8-45ca-8c0b-0c646167168f","Type":"ContainerDied","Data":"f20ece9bc6026beb1e2b728e8620ad88489184b8b8369888b69279374937d9eb"}
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.726264 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerStarted","Data":"1c6087e8429f6a6d865aebc0ef93818fa6f1e0f64484014bb6975728d85adf49"}
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.726362 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="ceilometer-central-agent" containerID="cri-o://76786e6e799f6c898f29c4a5e9ebdd6d44d914abe08e38178d63401a99c6c7d4" gracePeriod=30
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.726387 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="sg-core" containerID="cri-o://f6109e74497394abf265b9e2ad0fc5cd45091d73c1d19d87c8fdbe6dc1c50da6" gracePeriod=30
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.726389 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="ceilometer-notification-agent" containerID="cri-o://85b3151eca311b6af0b4f68f197433eab29d02fae0b2d474851e3f032d5d0c6e" gracePeriod=30
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.726373 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="proxy-httpd" containerID="cri-o://1c6087e8429f6a6d865aebc0ef93818fa6f1e0f64484014bb6975728d85adf49" gracePeriod=30
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.726538 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.748783 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c37921-0f32-4a99-bf40-e960c400a9ac","Type":"ContainerStarted","Data":"8e933255e2bcc27abf971627e50a388be00c68eb771f36803f6707174021661b"}
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.754861 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="ae2e826d251bbaee40981cb044e2215f19a7929bd69583c144888da9a8754c15" exitCode=0
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.754914 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"ae2e826d251bbaee40981cb044e2215f19a7929bd69583c144888da9a8754c15"}
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.754941 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d"}
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.754959 4971 scope.go:117] "RemoveContainer" containerID="13acc3b11ca206dfa51085fca8aab838b094201afdcac5315a5bb8076cf03b1f"
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.764996 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.824670625 podStartE2EDuration="9.764969805s" podCreationTimestamp="2026-03-20 07:13:11 +0000 UTC" firstStartedPulling="2026-03-20 07:13:12.633990226 +0000 UTC m=+1414.613864364" lastFinishedPulling="2026-03-20 07:13:19.574289406 +0000 UTC m=+1421.554163544" observedRunningTime="2026-03-20 07:13:20.754861406 +0000 UTC m=+1422.734735544" watchObservedRunningTime="2026-03-20 07:13:20.764969805 +0000 UTC m=+1422.744843943"
Mar 20 07:13:20 crc kubenswrapper[4971]: I0320 07:13:20.787767 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.787749025 podStartE2EDuration="3.787749025s" podCreationTimestamp="2026-03-20 07:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:13:20.785828847 +0000 UTC m=+1422.765703005" watchObservedRunningTime="2026-03-20 07:13:20.787749025 +0000 UTC m=+1422.767623173"
Mar 20 07:13:21 crc kubenswrapper[4971]: I0320 07:13:21.770886 4971 generic.go:334] "Generic (PLEG): container finished" podID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerID="1c6087e8429f6a6d865aebc0ef93818fa6f1e0f64484014bb6975728d85adf49" exitCode=0
Mar 20 07:13:21 crc kubenswrapper[4971]: I0320 07:13:21.771200 4971 generic.go:334] "Generic (PLEG): container finished" podID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerID="f6109e74497394abf265b9e2ad0fc5cd45091d73c1d19d87c8fdbe6dc1c50da6" exitCode=2
Mar 20 07:13:21 crc kubenswrapper[4971]: I0320 07:13:21.771211 4971 generic.go:334] "Generic (PLEG): container finished" podID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerID="85b3151eca311b6af0b4f68f197433eab29d02fae0b2d474851e3f032d5d0c6e" exitCode=0
Mar 20 07:13:21 crc kubenswrapper[4971]: I0320 07:13:21.770967 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerDied","Data":"1c6087e8429f6a6d865aebc0ef93818fa6f1e0f64484014bb6975728d85adf49"}
Mar 20 07:13:21 crc kubenswrapper[4971]: I0320 07:13:21.771302 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerDied","Data":"f6109e74497394abf265b9e2ad0fc5cd45091d73c1d19d87c8fdbe6dc1c50da6"}
Mar 20 07:13:21 crc kubenswrapper[4971]: I0320 07:13:21.771317 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerDied","Data":"85b3151eca311b6af0b4f68f197433eab29d02fae0b2d474851e3f032d5d0c6e"}
Mar 20 07:13:23 crc kubenswrapper[4971]: I0320 07:13:23.814088 4971 generic.go:334] "Generic (PLEG): container finished" podID="601f5587-0478-4903-9909-ebe0dee36539" containerID="76223294f76d4e966d36915f2cf219b65fa94718b6cbae2ce9fb396e2800dc1a" exitCode=0
Mar 20 07:13:23 crc kubenswrapper[4971]: I0320 07:13:23.814179 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b6544788-qm44x" event={"ID":"601f5587-0478-4903-9909-ebe0dee36539","Type":"ContainerDied","Data":"76223294f76d4e966d36915f2cf219b65fa94718b6cbae2ce9fb396e2800dc1a"}
Mar 20 07:13:23 crc kubenswrapper[4971]: I0320 07:13:23.818355 4971 generic.go:334] "Generic (PLEG): container finished" podID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerID="a62d88168869a6e92af414f4a661b2d5e26b67f260c2d5ae443b2b09fc993a58" exitCode=0
Mar 20 07:13:23 crc kubenswrapper[4971]: I0320 07:13:23.818391 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed5de2f-85e8-45ca-8c0b-0c646167168f","Type":"ContainerDied","Data":"a62d88168869a6e92af414f4a661b2d5e26b67f260c2d5ae443b2b09fc993a58"}
Mar 20 07:13:24 crc kubenswrapper[4971]: I0320 07:13:24.839111 4971 generic.go:334] "Generic (PLEG): container finished" podID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerID="76786e6e799f6c898f29c4a5e9ebdd6d44d914abe08e38178d63401a99c6c7d4" exitCode=0
Mar 20 07:13:24 crc kubenswrapper[4971]: I0320 07:13:24.839212 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerDied","Data":"76786e6e799f6c898f29c4a5e9ebdd6d44d914abe08e38178d63401a99c6c7d4"}
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.447053 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.500193 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.611566 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66b6544788-qm44x"
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.642882 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-scripts\") pod \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.642941 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-scripts\") pod \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.642974 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-run-httpd\") pod \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643034 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-combined-ca-bundle\") pod \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643064 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-logs\") pod \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643140 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-log-httpd\") pod \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643167 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87cbs\" (UniqueName: \"kubernetes.io/projected/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-kube-api-access-87cbs\") pod \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643190 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-combined-ca-bundle\") pod \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643212 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-httpd-run\") pod \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643228 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643729 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dad46841-e516-4d94-ac4f-1d1bcb47d3b6" (UID: "dad46841-e516-4d94-ac4f-1d1bcb47d3b6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643996 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-internal-tls-certs\") pod \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.644045 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-config-data\") pod \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.644081 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6nvq\" (UniqueName: \"kubernetes.io/projected/8ed5de2f-85e8-45ca-8c0b-0c646167168f-kube-api-access-c6nvq\") pod \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.644115 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-sg-core-conf-yaml\") pod \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\" (UID: \"dad46841-e516-4d94-ac4f-1d1bcb47d3b6\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.644147 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-config-data\") pod \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\" (UID: \"8ed5de2f-85e8-45ca-8c0b-0c646167168f\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.650742 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.643999 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-logs" (OuterVolumeSpecName: "logs") pod "8ed5de2f-85e8-45ca-8c0b-0c646167168f" (UID: "8ed5de2f-85e8-45ca-8c0b-0c646167168f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.649087 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-scripts" (OuterVolumeSpecName: "scripts") pod "dad46841-e516-4d94-ac4f-1d1bcb47d3b6" (UID: "dad46841-e516-4d94-ac4f-1d1bcb47d3b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.650797 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dad46841-e516-4d94-ac4f-1d1bcb47d3b6" (UID: "dad46841-e516-4d94-ac4f-1d1bcb47d3b6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.650088 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ed5de2f-85e8-45ca-8c0b-0c646167168f" (UID: "8ed5de2f-85e8-45ca-8c0b-0c646167168f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.650314 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "8ed5de2f-85e8-45ca-8c0b-0c646167168f" (UID: "8ed5de2f-85e8-45ca-8c0b-0c646167168f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.651694 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-scripts" (OuterVolumeSpecName: "scripts") pod "8ed5de2f-85e8-45ca-8c0b-0c646167168f" (UID: "8ed5de2f-85e8-45ca-8c0b-0c646167168f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.653131 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-kube-api-access-87cbs" (OuterVolumeSpecName: "kube-api-access-87cbs") pod "dad46841-e516-4d94-ac4f-1d1bcb47d3b6" (UID: "dad46841-e516-4d94-ac4f-1d1bcb47d3b6"). InnerVolumeSpecName "kube-api-access-87cbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.657798 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed5de2f-85e8-45ca-8c0b-0c646167168f-kube-api-access-c6nvq" (OuterVolumeSpecName: "kube-api-access-c6nvq") pod "8ed5de2f-85e8-45ca-8c0b-0c646167168f" (UID: "8ed5de2f-85e8-45ca-8c0b-0c646167168f"). InnerVolumeSpecName "kube-api-access-c6nvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.679905 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ed5de2f-85e8-45ca-8c0b-0c646167168f" (UID: "8ed5de2f-85e8-45ca-8c0b-0c646167168f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.684758 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dad46841-e516-4d94-ac4f-1d1bcb47d3b6" (UID: "dad46841-e516-4d94-ac4f-1d1bcb47d3b6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.711809 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ed5de2f-85e8-45ca-8c0b-0c646167168f" (UID: "8ed5de2f-85e8-45ca-8c0b-0c646167168f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.724218 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-config-data" (OuterVolumeSpecName: "config-data") pod "8ed5de2f-85e8-45ca-8c0b-0c646167168f" (UID: "8ed5de2f-85e8-45ca-8c0b-0c646167168f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.742315 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-config-data" (OuterVolumeSpecName: "config-data") pod "dad46841-e516-4d94-ac4f-1d1bcb47d3b6" (UID: "dad46841-e516-4d94-ac4f-1d1bcb47d3b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.751957 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-httpd-config\") pod \"601f5587-0478-4903-9909-ebe0dee36539\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.751999 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-ovndb-tls-certs\") pod \"601f5587-0478-4903-9909-ebe0dee36539\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.752035 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-combined-ca-bundle\") pod \"601f5587-0478-4903-9909-ebe0dee36539\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.752258 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-config\") pod \"601f5587-0478-4903-9909-ebe0dee36539\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.752289 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdxdr\" (UniqueName: \"kubernetes.io/projected/601f5587-0478-4903-9909-ebe0dee36539-kube-api-access-wdxdr\") pod \"601f5587-0478-4903-9909-ebe0dee36539\" (UID: \"601f5587-0478-4903-9909-ebe0dee36539\") "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.753185 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.753203 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.753282 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754194 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754712 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754725 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87cbs\" (UniqueName: \"kubernetes.io/projected/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-kube-api-access-87cbs\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754733 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed5de2f-85e8-45ca-8c0b-0c646167168f-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754759 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754769 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754779 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754788 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6nvq\" (UniqueName: \"kubernetes.io/projected/8ed5de2f-85e8-45ca-8c0b-0c646167168f-kube-api-access-c6nvq\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754797 4971 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.754805 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\"
(UniqueName: \"kubernetes.io/secret/8ed5de2f-85e8-45ca-8c0b-0c646167168f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.755399 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "601f5587-0478-4903-9909-ebe0dee36539" (UID: "601f5587-0478-4903-9909-ebe0dee36539"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.759138 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/601f5587-0478-4903-9909-ebe0dee36539-kube-api-access-wdxdr" (OuterVolumeSpecName: "kube-api-access-wdxdr") pod "601f5587-0478-4903-9909-ebe0dee36539" (UID: "601f5587-0478-4903-9909-ebe0dee36539"). InnerVolumeSpecName "kube-api-access-wdxdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.760880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dad46841-e516-4d94-ac4f-1d1bcb47d3b6" (UID: "dad46841-e516-4d94-ac4f-1d1bcb47d3b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.782106 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.802792 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "601f5587-0478-4903-9909-ebe0dee36539" (UID: "601f5587-0478-4903-9909-ebe0dee36539"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.808585 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-config" (OuterVolumeSpecName: "config") pod "601f5587-0478-4903-9909-ebe0dee36539" (UID: "601f5587-0478-4903-9909-ebe0dee36539"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.827875 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "601f5587-0478-4903-9909-ebe0dee36539" (UID: "601f5587-0478-4903-9909-ebe0dee36539"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.856149 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.856184 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdxdr\" (UniqueName: \"kubernetes.io/projected/601f5587-0478-4903-9909-ebe0dee36539-kube-api-access-wdxdr\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.856194 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.856204 4971 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.856212 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601f5587-0478-4903-9909-ebe0dee36539-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.856221 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dad46841-e516-4d94-ac4f-1d1bcb47d3b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.856230 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.860524 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed5de2f-85e8-45ca-8c0b-0c646167168f","Type":"ContainerDied","Data":"7b305347201c520358a095844517b1d49d32248695d3ee60c5c56af1b87dd612"} Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.860579 4971 scope.go:117] "RemoveContainer" containerID="a62d88168869a6e92af414f4a661b2d5e26b67f260c2d5ae443b2b09fc993a58" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.860580 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.864815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dad46841-e516-4d94-ac4f-1d1bcb47d3b6","Type":"ContainerDied","Data":"8a00bde29524607439c7960dc8e4f18764a5c93434bcee2b54294f5a127f031a"} Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.864839 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.868057 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k6q9k" event={"ID":"6af1289c-26d4-4865-946f-891c126beb49","Type":"ContainerStarted","Data":"b8a7e074bded868b90861fe1e5fdddd63c2b37acc4111e94697089382de56703"} Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.875592 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b6544788-qm44x" event={"ID":"601f5587-0478-4903-9909-ebe0dee36539","Type":"ContainerDied","Data":"3aa0526d8fa86bd67d0ddc3a6c7bbc4eae2965f6ef5dabba4e1875ccedcd80b6"} Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.875806 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66b6544788-qm44x" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.889791 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.901705 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.904939 4971 scope.go:117] "RemoveContainer" containerID="f20ece9bc6026beb1e2b728e8620ad88489184b8b8369888b69279374937d9eb" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.907408 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-k6q9k" podStartSLOduration=1.6747530309999998 podStartE2EDuration="10.90738557s" podCreationTimestamp="2026-03-20 07:13:16 +0000 UTC" firstStartedPulling="2026-03-20 07:13:16.898778829 +0000 UTC m=+1418.878652977" lastFinishedPulling="2026-03-20 07:13:26.131411378 +0000 UTC m=+1428.111285516" observedRunningTime="2026-03-20 07:13:26.890556483 +0000 UTC m=+1428.870430621" watchObservedRunningTime="2026-03-20 07:13:26.90738557 +0000 UTC m=+1428.887259708" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.932500 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:13:26 crc kubenswrapper[4971]: E0320 07:13:26.933023 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerName="glance-httpd" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933042 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerName="glance-httpd" Mar 20 07:13:26 crc kubenswrapper[4971]: E0320 07:13:26.933063 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="proxy-httpd" Mar 20 07:13:26 crc 
kubenswrapper[4971]: I0320 07:13:26.933073 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="proxy-httpd" Mar 20 07:13:26 crc kubenswrapper[4971]: E0320 07:13:26.933089 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601f5587-0478-4903-9909-ebe0dee36539" containerName="neutron-httpd" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933097 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="601f5587-0478-4903-9909-ebe0dee36539" containerName="neutron-httpd" Mar 20 07:13:26 crc kubenswrapper[4971]: E0320 07:13:26.933111 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="ceilometer-notification-agent" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933119 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="ceilometer-notification-agent" Mar 20 07:13:26 crc kubenswrapper[4971]: E0320 07:13:26.933135 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="ceilometer-central-agent" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933142 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="ceilometer-central-agent" Mar 20 07:13:26 crc kubenswrapper[4971]: E0320 07:13:26.933160 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="sg-core" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933167 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="sg-core" Mar 20 07:13:26 crc kubenswrapper[4971]: E0320 07:13:26.933236 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerName="glance-log" Mar 20 07:13:26 
crc kubenswrapper[4971]: I0320 07:13:26.933245 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerName="glance-log" Mar 20 07:13:26 crc kubenswrapper[4971]: E0320 07:13:26.933258 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601f5587-0478-4903-9909-ebe0dee36539" containerName="neutron-api" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933267 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="601f5587-0478-4903-9909-ebe0dee36539" containerName="neutron-api" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933467 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="601f5587-0478-4903-9909-ebe0dee36539" containerName="neutron-api" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933484 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="sg-core" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933503 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="601f5587-0478-4903-9909-ebe0dee36539" containerName="neutron-httpd" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933516 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="ceilometer-central-agent" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933528 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerName="glance-httpd" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933542 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" containerName="glance-log" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.933552 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="proxy-httpd" Mar 20 07:13:26 crc 
kubenswrapper[4971]: I0320 07:13:26.933564 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" containerName="ceilometer-notification-agent" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.935337 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.940743 4971 scope.go:117] "RemoveContainer" containerID="1c6087e8429f6a6d865aebc0ef93818fa6f1e0f64484014bb6975728d85adf49" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.947972 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.948156 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.948246 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:13:26 crc kubenswrapper[4971]: I0320 07:13:26.954332 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66b6544788-qm44x"] Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.016536 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66b6544788-qm44x"] Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.034072 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.037343 4971 scope.go:117] "RemoveContainer" containerID="f6109e74497394abf265b9e2ad0fc5cd45091d73c1d19d87c8fdbe6dc1c50da6" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.069672 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-config-data\") 
pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.069738 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.069832 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.069893 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwzw5\" (UniqueName: \"kubernetes.io/projected/aa332536-984c-4942-b1fb-1ffde7eb465d-kube-api-access-xwzw5\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.070126 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.070198 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.070286 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.070338 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.072787 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.081808 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.085699 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.088436 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.088735 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.100320 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.130817 4971 scope.go:117] "RemoveContainer" containerID="85b3151eca311b6af0b4f68f197433eab29d02fae0b2d474851e3f032d5d0c6e" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.154181 4971 scope.go:117] "RemoveContainer" containerID="76786e6e799f6c898f29c4a5e9ebdd6d44d914abe08e38178d63401a99c6c7d4" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.174824 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.174872 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.174906 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwzw5\" (UniqueName: \"kubernetes.io/projected/aa332536-984c-4942-b1fb-1ffde7eb465d-kube-api-access-xwzw5\") pod \"glance-default-internal-api-0\" (UID: 
\"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.174980 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.175017 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.175057 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.175085 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.175114 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc 
kubenswrapper[4971]: I0320 07:13:27.175843 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.176224 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.177058 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.180744 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.181583 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.183019 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.183550 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.191617 4971 scope.go:117] "RemoveContainer" containerID="dff293a7209e9d5eb574d247516eb149910bd77be84c64f2a666082d2c149d12" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.197176 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwzw5\" (UniqueName: \"kubernetes.io/projected/aa332536-984c-4942-b1fb-1ffde7eb465d-kube-api-access-xwzw5\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.210636 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.278651 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-run-httpd\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 
07:13:27.278705 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvmmn\" (UniqueName: \"kubernetes.io/projected/b831f7e9-967c-4a45-84f6-0d0fc9240684-kube-api-access-cvmmn\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.278723 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-scripts\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.278797 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-log-httpd\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.278828 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.278849 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.278877 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-config-data\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.279581 4971 scope.go:117] "RemoveContainer" containerID="76223294f76d4e966d36915f2cf219b65fa94718b6cbae2ce9fb396e2800dc1a" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.379993 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.380077 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-config-data\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.380151 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-run-httpd\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.380172 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvmmn\" (UniqueName: \"kubernetes.io/projected/b831f7e9-967c-4a45-84f6-0d0fc9240684-kube-api-access-cvmmn\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.380193 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-scripts\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.380250 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-log-httpd\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.380281 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.381404 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-log-httpd\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.381682 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-run-httpd\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.385287 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.396755 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.412314 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-scripts\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.428456 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.443836 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvmmn\" (UniqueName: \"kubernetes.io/projected/b831f7e9-967c-4a45-84f6-0d0fc9240684-kube-api-access-cvmmn\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.444556 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-config-data\") pod \"ceilometer-0\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " pod="openstack/ceilometer-0" Mar 20 07:13:27 crc kubenswrapper[4971]: I0320 07:13:27.732883 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.020760 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:13:28 crc kubenswrapper[4971]: W0320 07:13:28.032496 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa332536_984c_4942_b1fb_1ffde7eb465d.slice/crio-3ee96c9b9ec9603d9ab5cbaf08f8803b78fe8d28e83cc6b12fbf0ead0d5e9beb WatchSource:0}: Error finding container 3ee96c9b9ec9603d9ab5cbaf08f8803b78fe8d28e83cc6b12fbf0ead0d5e9beb: Status 404 returned error can't find the container with id 3ee96c9b9ec9603d9ab5cbaf08f8803b78fe8d28e83cc6b12fbf0ead0d5e9beb Mar 20 07:13:28 crc kubenswrapper[4971]: W0320 07:13:28.227548 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb831f7e9_967c_4a45_84f6_0d0fc9240684.slice/crio-e14e82450fed40954264f71e93f0e987b4b415afff2f1eb144b68e7145dc5ffa WatchSource:0}: Error finding container e14e82450fed40954264f71e93f0e987b4b415afff2f1eb144b68e7145dc5ffa: Status 404 returned error can't find the container with id e14e82450fed40954264f71e93f0e987b4b415afff2f1eb144b68e7145dc5ffa Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.232272 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.342972 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.343018 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.384889 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.393956 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.748501 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601f5587-0478-4903-9909-ebe0dee36539" path="/var/lib/kubelet/pods/601f5587-0478-4903-9909-ebe0dee36539/volumes" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.750779 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed5de2f-85e8-45ca-8c0b-0c646167168f" path="/var/lib/kubelet/pods/8ed5de2f-85e8-45ca-8c0b-0c646167168f/volumes" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.751545 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dad46841-e516-4d94-ac4f-1d1bcb47d3b6" path="/var/lib/kubelet/pods/dad46841-e516-4d94-ac4f-1d1bcb47d3b6/volumes" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.908133 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerStarted","Data":"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"} Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.908178 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerStarted","Data":"e14e82450fed40954264f71e93f0e987b4b415afff2f1eb144b68e7145dc5ffa"} Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.912379 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa332536-984c-4942-b1fb-1ffde7eb465d","Type":"ContainerStarted","Data":"c5035eee8b5df80277aacaae4c435bb925d2c563411e00ccd53787c9f32033d9"} Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.913097 4971 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa332536-984c-4942-b1fb-1ffde7eb465d","Type":"ContainerStarted","Data":"3ee96c9b9ec9603d9ab5cbaf08f8803b78fe8d28e83cc6b12fbf0ead0d5e9beb"} Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.913146 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 07:13:28 crc kubenswrapper[4971]: I0320 07:13:28.913171 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 07:13:29 crc kubenswrapper[4971]: I0320 07:13:29.965018 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa332536-984c-4942-b1fb-1ffde7eb465d","Type":"ContainerStarted","Data":"68ae6ce1a0d2429fc4431d1c48cbd5670a70a1125fc1820cb1e6abfe59df9ea9"} Mar 20 07:13:29 crc kubenswrapper[4971]: I0320 07:13:29.969818 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerStarted","Data":"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"} Mar 20 07:13:30 crc kubenswrapper[4971]: I0320 07:13:30.034215 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.034191658 podStartE2EDuration="4.034191658s" podCreationTimestamp="2026-03-20 07:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:13:30.009890918 +0000 UTC m=+1431.989765056" watchObservedRunningTime="2026-03-20 07:13:30.034191658 +0000 UTC m=+1432.014065806" Mar 20 07:13:30 crc kubenswrapper[4971]: I0320 07:13:30.847784 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 07:13:30 crc kubenswrapper[4971]: I0320 
07:13:30.852370 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 07:13:30 crc kubenswrapper[4971]: I0320 07:13:30.978408 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerStarted","Data":"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"} Mar 20 07:13:32 crc kubenswrapper[4971]: I0320 07:13:32.998875 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerStarted","Data":"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"} Mar 20 07:13:33 crc kubenswrapper[4971]: I0320 07:13:32.999369 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 07:13:33 crc kubenswrapper[4971]: I0320 07:13:33.017367 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.17276425 podStartE2EDuration="7.017345699s" podCreationTimestamp="2026-03-20 07:13:26 +0000 UTC" firstStartedPulling="2026-03-20 07:13:28.230896403 +0000 UTC m=+1430.210770541" lastFinishedPulling="2026-03-20 07:13:32.075477852 +0000 UTC m=+1434.055351990" observedRunningTime="2026-03-20 07:13:33.017314438 +0000 UTC m=+1434.997188576" watchObservedRunningTime="2026-03-20 07:13:33.017345699 +0000 UTC m=+1434.997219837" Mar 20 07:13:34 crc kubenswrapper[4971]: I0320 07:13:34.747986 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.018197 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="ceilometer-central-agent" containerID="cri-o://db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74" 
gracePeriod=30 Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.018246 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="sg-core" containerID="cri-o://79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6" gracePeriod=30 Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.018297 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="proxy-httpd" containerID="cri-o://78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6" gracePeriod=30 Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.018308 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="ceilometer-notification-agent" containerID="cri-o://2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419" gracePeriod=30 Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.805063 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.854846 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-run-httpd\") pod \"b831f7e9-967c-4a45-84f6-0d0fc9240684\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.854897 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-log-httpd\") pod \"b831f7e9-967c-4a45-84f6-0d0fc9240684\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.854945 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-scripts\") pod \"b831f7e9-967c-4a45-84f6-0d0fc9240684\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.854990 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvmmn\" (UniqueName: \"kubernetes.io/projected/b831f7e9-967c-4a45-84f6-0d0fc9240684-kube-api-access-cvmmn\") pod \"b831f7e9-967c-4a45-84f6-0d0fc9240684\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.855120 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-combined-ca-bundle\") pod \"b831f7e9-967c-4a45-84f6-0d0fc9240684\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.855149 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-config-data\") pod \"b831f7e9-967c-4a45-84f6-0d0fc9240684\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.855206 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-sg-core-conf-yaml\") pod \"b831f7e9-967c-4a45-84f6-0d0fc9240684\" (UID: \"b831f7e9-967c-4a45-84f6-0d0fc9240684\") " Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.855509 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b831f7e9-967c-4a45-84f6-0d0fc9240684" (UID: "b831f7e9-967c-4a45-84f6-0d0fc9240684"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.856054 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.856361 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b831f7e9-967c-4a45-84f6-0d0fc9240684" (UID: "b831f7e9-967c-4a45-84f6-0d0fc9240684"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.863456 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-scripts" (OuterVolumeSpecName: "scripts") pod "b831f7e9-967c-4a45-84f6-0d0fc9240684" (UID: "b831f7e9-967c-4a45-84f6-0d0fc9240684"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.864415 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b831f7e9-967c-4a45-84f6-0d0fc9240684-kube-api-access-cvmmn" (OuterVolumeSpecName: "kube-api-access-cvmmn") pod "b831f7e9-967c-4a45-84f6-0d0fc9240684" (UID: "b831f7e9-967c-4a45-84f6-0d0fc9240684"). InnerVolumeSpecName "kube-api-access-cvmmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.895861 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b831f7e9-967c-4a45-84f6-0d0fc9240684" (UID: "b831f7e9-967c-4a45-84f6-0d0fc9240684"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.929062 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b831f7e9-967c-4a45-84f6-0d0fc9240684" (UID: "b831f7e9-967c-4a45-84f6-0d0fc9240684"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.958210 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b831f7e9-967c-4a45-84f6-0d0fc9240684-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.958257 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.958267 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvmmn\" (UniqueName: \"kubernetes.io/projected/b831f7e9-967c-4a45-84f6-0d0fc9240684-kube-api-access-cvmmn\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.958279 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.958287 4971 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:35 crc kubenswrapper[4971]: I0320 07:13:35.981329 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-config-data" (OuterVolumeSpecName: "config-data") pod "b831f7e9-967c-4a45-84f6-0d0fc9240684" (UID: "b831f7e9-967c-4a45-84f6-0d0fc9240684"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040483 4971 generic.go:334] "Generic (PLEG): container finished" podID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerID="78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6" exitCode=0 Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040521 4971 generic.go:334] "Generic (PLEG): container finished" podID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerID="79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6" exitCode=2 Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040555 4971 generic.go:334] "Generic (PLEG): container finished" podID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerID="2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419" exitCode=0 Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040567 4971 generic.go:334] "Generic (PLEG): container finished" podID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerID="db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74" exitCode=0 Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040593 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerDied","Data":"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"} Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040663 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerDied","Data":"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"} Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040679 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerDied","Data":"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"} Mar 20 07:13:36 crc 
kubenswrapper[4971]: I0320 07:13:36.040745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerDied","Data":"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"} Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040751 4971 scope.go:117] "RemoveContainer" containerID="78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b831f7e9-967c-4a45-84f6-0d0fc9240684","Type":"ContainerDied","Data":"e14e82450fed40954264f71e93f0e987b4b415afff2f1eb144b68e7145dc5ffa"} Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.040755 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.059701 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b831f7e9-967c-4a45-84f6-0d0fc9240684-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.082934 4971 scope.go:117] "RemoveContainer" containerID="79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.094962 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.108852 4971 scope.go:117] "RemoveContainer" containerID="2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.124702 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.151357 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:36 
crc kubenswrapper[4971]: E0320 07:13:36.152682 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="ceilometer-central-agent" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.152699 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="ceilometer-central-agent" Mar 20 07:13:36 crc kubenswrapper[4971]: E0320 07:13:36.152725 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="sg-core" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.152732 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="sg-core" Mar 20 07:13:36 crc kubenswrapper[4971]: E0320 07:13:36.152743 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="proxy-httpd" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.152749 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="proxy-httpd" Mar 20 07:13:36 crc kubenswrapper[4971]: E0320 07:13:36.152760 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="ceilometer-notification-agent" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.152767 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="ceilometer-notification-agent" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.152937 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="proxy-httpd" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.152955 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="sg-core" Mar 20 
07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.152963 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="ceilometer-notification-agent" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.152975 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" containerName="ceilometer-central-agent" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.155184 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.158791 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.158935 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.176414 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.202960 4971 scope.go:117] "RemoveContainer" containerID="db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.226117 4971 scope.go:117] "RemoveContainer" containerID="78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6" Mar 20 07:13:36 crc kubenswrapper[4971]: E0320 07:13:36.227049 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": container with ID starting with 78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6 not found: ID does not exist" containerID="78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.227097 4971 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"} err="failed to get container status \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": rpc error: code = NotFound desc = could not find container \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": container with ID starting with 78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6 not found: ID does not exist" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.227124 4971 scope.go:117] "RemoveContainer" containerID="79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6" Mar 20 07:13:36 crc kubenswrapper[4971]: E0320 07:13:36.227442 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": container with ID starting with 79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6 not found: ID does not exist" containerID="79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.227472 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"} err="failed to get container status \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": rpc error: code = NotFound desc = could not find container \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": container with ID starting with 79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6 not found: ID does not exist" Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.227491 4971 scope.go:117] "RemoveContainer" containerID="2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419" Mar 20 07:13:36 crc kubenswrapper[4971]: E0320 07:13:36.228077 4971 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": container with ID starting with 2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419 not found: ID does not exist" containerID="2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.228095 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"} err="failed to get container status \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": rpc error: code = NotFound desc = could not find container \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": container with ID starting with 2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.228107 4971 scope.go:117] "RemoveContainer" containerID="db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"
Mar 20 07:13:36 crc kubenswrapper[4971]: E0320 07:13:36.228491 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": container with ID starting with db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74 not found: ID does not exist" containerID="db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.228530 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"} err="failed to get container status \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": rpc error: code = NotFound desc = could not find container \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": container with ID starting with db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.228558 4971 scope.go:117] "RemoveContainer" containerID="78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.233821 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"} err="failed to get container status \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": rpc error: code = NotFound desc = could not find container \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": container with ID starting with 78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.233861 4971 scope.go:117] "RemoveContainer" containerID="79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.234530 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"} err="failed to get container status \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": rpc error: code = NotFound desc = could not find container \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": container with ID starting with 79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.234551 4971 scope.go:117] "RemoveContainer" containerID="2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.234957 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"} err="failed to get container status \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": rpc error: code = NotFound desc = could not find container \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": container with ID starting with 2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.234985 4971 scope.go:117] "RemoveContainer" containerID="db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.235342 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"} err="failed to get container status \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": rpc error: code = NotFound desc = could not find container \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": container with ID starting with db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.235362 4971 scope.go:117] "RemoveContainer" containerID="78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.235560 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"} err="failed to get container status \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": rpc error: code = NotFound desc = could not find container \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": container with ID starting with 78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.235577 4971 scope.go:117] "RemoveContainer" containerID="79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.235744 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"} err="failed to get container status \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": rpc error: code = NotFound desc = could not find container \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": container with ID starting with 79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.235759 4971 scope.go:117] "RemoveContainer" containerID="2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.235907 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"} err="failed to get container status \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": rpc error: code = NotFound desc = could not find container \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": container with ID starting with 2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.235920 4971 scope.go:117] "RemoveContainer" containerID="db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236075 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"} err="failed to get container status \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": rpc error: code = NotFound desc = could not find container \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": container with ID starting with db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236093 4971 scope.go:117] "RemoveContainer" containerID="78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236277 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6"} err="failed to get container status \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": rpc error: code = NotFound desc = could not find container \"78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6\": container with ID starting with 78886b44af48ee6cc0eb01b8b76d3cc8e1e6debf2f64e074c6945dfac2e288b6 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236298 4971 scope.go:117] "RemoveContainer" containerID="79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236460 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6"} err="failed to get container status \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": rpc error: code = NotFound desc = could not find container \"79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6\": container with ID starting with 79b6d7491d7e65d0c6421b9dd971f681bd14ae0a3762473de26b3c62a44bffa6 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236476 4971 scope.go:117] "RemoveContainer" containerID="2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236725 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419"} err="failed to get container status \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": rpc error: code = NotFound desc = could not find container \"2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419\": container with ID starting with 2878f5248e34f6516ed02cc505ca94a26ac0eddbc775465897f78b88ed5e3419 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236770 4971 scope.go:117] "RemoveContainer" containerID="db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.236964 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74"} err="failed to get container status \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": rpc error: code = NotFound desc = could not find container \"db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74\": container with ID starting with db7041cb24e6d433fa00c9e30f6afb62988b1218e56df0e5da3a180bd1199c74 not found: ID does not exist"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.266403 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-run-httpd\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.266453 4971
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ht8p\" (UniqueName: \"kubernetes.io/projected/66e085af-b20f-4a74-a2a2-f21a3057cb8a-kube-api-access-7ht8p\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.266511 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-log-httpd\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.266537 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.266662 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-scripts\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.266892 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.266950 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-config-data\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.369263 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.369323 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-config-data\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.369448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-run-httpd\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.369484 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ht8p\" (UniqueName: \"kubernetes.io/projected/66e085af-b20f-4a74-a2a2-f21a3057cb8a-kube-api-access-7ht8p\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.369537 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-log-httpd\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.369560 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.369584 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-scripts\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.370287 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-log-httpd\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.370521 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-run-httpd\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.374774 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-scripts\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.374802 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.375060 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-config-data\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.375235 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.397254 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ht8p\" (UniqueName: \"kubernetes.io/projected/66e085af-b20f-4a74-a2a2-f21a3057cb8a-kube-api-access-7ht8p\") pod \"ceilometer-0\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.505330 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.763814 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b831f7e9-967c-4a45-84f6-0d0fc9240684" path="/var/lib/kubelet/pods/b831f7e9-967c-4a45-84f6-0d0fc9240684/volumes"
Mar 20 07:13:36 crc kubenswrapper[4971]: I0320 07:13:36.826379 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 07:13:37 crc kubenswrapper[4971]: I0320 07:13:37.056254 4971 generic.go:334] "Generic (PLEG): container finished" podID="6af1289c-26d4-4865-946f-891c126beb49" containerID="b8a7e074bded868b90861fe1e5fdddd63c2b37acc4111e94697089382de56703" exitCode=0
Mar 20 07:13:37 crc kubenswrapper[4971]: I0320 07:13:37.056345 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k6q9k" event={"ID":"6af1289c-26d4-4865-946f-891c126beb49","Type":"ContainerDied","Data":"b8a7e074bded868b90861fe1e5fdddd63c2b37acc4111e94697089382de56703"}
Mar 20 07:13:37 crc kubenswrapper[4971]: I0320 07:13:37.058396 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerStarted","Data":"7b5a63c9510ad8942dff7b5d1cf391518fd61ad00b5c4139909190b84eba1607"}
Mar 20 07:13:37 crc kubenswrapper[4971]: I0320 07:13:37.430310 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:37 crc kubenswrapper[4971]: I0320 07:13:37.430360 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:37 crc kubenswrapper[4971]: I0320 07:13:37.462534 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:37 crc kubenswrapper[4971]: I0320 07:13:37.511467 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.069598 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerStarted","Data":"4a4002dc99baad38e51c0f6fb55892c305049b301226c24fbb73e80a9aeef493"}
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.070436 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.070461 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.374465 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k6q9k"
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.423742 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-scripts\") pod \"6af1289c-26d4-4865-946f-891c126beb49\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") "
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.424267 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5km6x\" (UniqueName: \"kubernetes.io/projected/6af1289c-26d4-4865-946f-891c126beb49-kube-api-access-5km6x\") pod \"6af1289c-26d4-4865-946f-891c126beb49\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") "
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.424309 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-combined-ca-bundle\") pod \"6af1289c-26d4-4865-946f-891c126beb49\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") "
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.424374 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-config-data\") pod \"6af1289c-26d4-4865-946f-891c126beb49\" (UID: \"6af1289c-26d4-4865-946f-891c126beb49\") "
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.431011 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-scripts" (OuterVolumeSpecName: "scripts") pod "6af1289c-26d4-4865-946f-891c126beb49" (UID: "6af1289c-26d4-4865-946f-891c126beb49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.435877 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af1289c-26d4-4865-946f-891c126beb49-kube-api-access-5km6x" (OuterVolumeSpecName: "kube-api-access-5km6x") pod "6af1289c-26d4-4865-946f-891c126beb49" (UID: "6af1289c-26d4-4865-946f-891c126beb49"). InnerVolumeSpecName "kube-api-access-5km6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.456833 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6af1289c-26d4-4865-946f-891c126beb49" (UID: "6af1289c-26d4-4865-946f-891c126beb49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.459846 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-config-data" (OuterVolumeSpecName: "config-data") pod "6af1289c-26d4-4865-946f-891c126beb49" (UID: "6af1289c-26d4-4865-946f-891c126beb49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.525413 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.525448 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5km6x\" (UniqueName: \"kubernetes.io/projected/6af1289c-26d4-4865-946f-891c126beb49-kube-api-access-5km6x\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.525461 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:38 crc kubenswrapper[4971]: I0320 07:13:38.525470 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af1289c-26d4-4865-946f-891c126beb49-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.083338 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k6q9k" event={"ID":"6af1289c-26d4-4865-946f-891c126beb49","Type":"ContainerDied","Data":"4834e3334e1f3e3e04ea136febeadd41417d71f3db63193959b6aeff16ccb212"}
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.084919 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4834e3334e1f3e3e04ea136febeadd41417d71f3db63193959b6aeff16ccb212"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.085162 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k6q9k"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.089444 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerStarted","Data":"f700eba655e94a222afd3f2a732900514e38a9d9878a188d6fe2fd9eddfbdef6"}
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.320121 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 07:13:39 crc kubenswrapper[4971]: E0320 07:13:39.320648 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af1289c-26d4-4865-946f-891c126beb49" containerName="nova-cell0-conductor-db-sync"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.320670 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af1289c-26d4-4865-946f-891c126beb49" containerName="nova-cell0-conductor-db-sync"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.320908 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af1289c-26d4-4865-946f-891c126beb49" containerName="nova-cell0-conductor-db-sync"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.321678 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.323906 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-znqz8"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.325734 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.335208 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.473486 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.474523 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnf6t\" (UniqueName: \"kubernetes.io/projected/ebe7358b-84ed-4e08-8c12-234928beef49-kube-api-access-hnf6t\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.475037 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.577157 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnf6t\" (UniqueName: \"kubernetes.io/projected/ebe7358b-84ed-4e08-8c12-234928beef49-kube-api-access-hnf6t\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.577296 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.577345 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.583269 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.593695 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.596805 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnf6t\" (UniqueName: \"kubernetes.io/projected/ebe7358b-84ed-4e08-8c12-234928beef49-kube-api-access-hnf6t\") pod \"nova-cell0-conductor-0\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.638503 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:39 crc kubenswrapper[4971]: I0320 07:13:39.905129 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:40 crc kubenswrapper[4971]: I0320 07:13:40.002778 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 07:13:40 crc kubenswrapper[4971]: I0320 07:13:40.124868 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerStarted","Data":"5d4ccd50b8481a237c43d08c563ba64a032112f2167a89b5faa03f80f1ea868e"}
Mar 20 07:13:40 crc kubenswrapper[4971]: I0320 07:13:40.186498 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 07:13:40 crc kubenswrapper[4971]: W0320 07:13:40.197802 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebe7358b_84ed_4e08_8c12_234928beef49.slice/crio-57b59ca2298435e2062d2fe57884c71b9480b6088c3c5f43111cabbc879d79bd WatchSource:0}: Error finding container 57b59ca2298435e2062d2fe57884c71b9480b6088c3c5f43111cabbc879d79bd: Status 404 returned error can't find the container with id 57b59ca2298435e2062d2fe57884c71b9480b6088c3c5f43111cabbc879d79bd
Mar 20 07:13:41 crc kubenswrapper[4971]: I0320 07:13:41.136968 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ebe7358b-84ed-4e08-8c12-234928beef49","Type":"ContainerStarted","Data":"d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa"}
Mar 20 07:13:41 crc kubenswrapper[4971]: I0320 07:13:41.137268 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ebe7358b-84ed-4e08-8c12-234928beef49","Type":"ContainerStarted","Data":"57b59ca2298435e2062d2fe57884c71b9480b6088c3c5f43111cabbc879d79bd"}
Mar 20 07:13:41 crc kubenswrapper[4971]: I0320 07:13:41.137307 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:41 crc kubenswrapper[4971]: I0320 07:13:41.164353 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.164285209 podStartE2EDuration="2.164285209s" podCreationTimestamp="2026-03-20 07:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:13:41.156259431 +0000 UTC m=+1443.136133609" watchObservedRunningTime="2026-03-20 07:13:41.164285209 +0000 UTC m=+1443.144159357"
Mar 20 07:13:42 crc kubenswrapper[4971]: I0320 07:13:42.155678 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerStarted","Data":"ce3710cd7b793d078b3fcd6a5f4dd83bd05c10551a91c491fb6773d8d6ca25ef"}
Mar 20 07:13:42 crc kubenswrapper[4971]: I0320 07:13:42.156129 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 07:13:42 crc kubenswrapper[4971]: I0320 07:13:42.189351 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.780033398 podStartE2EDuration="6.189328035s" podCreationTimestamp="2026-03-20 07:13:36 +0000 UTC" firstStartedPulling="2026-03-20 07:13:36.818050669 +0000 UTC m=+1438.797924807" lastFinishedPulling="2026-03-20 07:13:41.227345306 +0000 UTC m=+1443.207219444" observedRunningTime="2026-03-20 07:13:42.187370365 +0000 UTC m=+1444.167244503" watchObservedRunningTime="2026-03-20 07:13:42.189328035 +0000 UTC m=+1444.169202174"
Mar 20 07:13:49 crc kubenswrapper[4971]: I0320 07:13:49.689590 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.363229 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jtmqm"]
Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.364956 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jtmqm"
Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.367827 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.371959 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.375372 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jtmqm"]
Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.515867 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-scripts\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm"
Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.515921 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm"
Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.515941 4971
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-config-data\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.515960 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxmds\" (UniqueName: \"kubernetes.io/projected/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-kube-api-access-bxmds\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.545299 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.548438 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.552573 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.558635 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.617139 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-scripts\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.617187 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.617205 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-config-data\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.617225 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxmds\" (UniqueName: \"kubernetes.io/projected/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-kube-api-access-bxmds\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.618447 4971 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.619980 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.622627 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.633817 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.636011 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-scripts\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.648630 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.657793 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-config-data\") pod \"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.661455 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmds\" (UniqueName: \"kubernetes.io/projected/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-kube-api-access-bxmds\") pod 
\"nova-cell0-cell-mapping-jtmqm\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.688531 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.721465 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.721548 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98dkq\" (UniqueName: \"kubernetes.io/projected/ae621917-3cce-4814-8b8c-9a30ff8bdd20-kube-api-access-98dkq\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.721623 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4c72d9-ed3f-4662-909e-07caccc657ba-logs\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.741735 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbfb8\" (UniqueName: \"kubernetes.io/projected/ac4c72d9-ed3f-4662-909e-07caccc657ba-kube-api-access-cbfb8\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.741810 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.741852 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-config-data\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.741902 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-config-data\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.807939 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.809634 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.812199 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.820527 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.821462 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.821594 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.822703 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.825385 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.860407 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69b4446475-xtj6r"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.863149 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.876150 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-xtj6r"] Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.882938 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmxxp\" (UniqueName: \"kubernetes.io/projected/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-kube-api-access-cmxxp\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.882998 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883032 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98dkq\" (UniqueName: \"kubernetes.io/projected/ae621917-3cce-4814-8b8c-9a30ff8bdd20-kube-api-access-98dkq\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " 
pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883087 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883125 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883201 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4c72d9-ed3f-4662-909e-07caccc657ba-logs\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883226 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-config-data\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883272 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pjff\" (UniqueName: \"kubernetes.io/projected/303bf07b-b266-451f-9ba4-9122c1d80109-kube-api-access-4pjff\") pod \"nova-cell1-novncproxy-0\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 
07:13:50.883346 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-logs\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883391 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbfb8\" (UniqueName: \"kubernetes.io/projected/ac4c72d9-ed3f-4662-909e-07caccc657ba-kube-api-access-cbfb8\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883409 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883431 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-config-data\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883449 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883482 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-config-data\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.883924 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4c72d9-ed3f-4662-909e-07caccc657ba-logs\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.889410 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-config-data\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.890498 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.901366 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.903951 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98dkq\" (UniqueName: \"kubernetes.io/projected/ae621917-3cce-4814-8b8c-9a30ff8bdd20-kube-api-access-98dkq\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.905738 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-config-data\") pod \"nova-scheduler-0\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " pod="openstack/nova-scheduler-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.910974 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbfb8\" (UniqueName: \"kubernetes.io/projected/ac4c72d9-ed3f-4662-909e-07caccc657ba-kube-api-access-cbfb8\") pod \"nova-api-0\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " pod="openstack/nova-api-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990134 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990188 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wqtk\" (UniqueName: \"kubernetes.io/projected/6c960029-95ef-4f0f-a277-9fb4a8e13198-kube-api-access-2wqtk\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990218 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-config\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990278 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmxxp\" 
(UniqueName: \"kubernetes.io/projected/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-kube-api-access-cmxxp\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990314 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990333 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-svc\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990350 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990389 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-config-data\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990413 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-sb\") pod 
\"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990433 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pjff\" (UniqueName: \"kubernetes.io/projected/303bf07b-b266-451f-9ba4-9122c1d80109-kube-api-access-4pjff\") pod \"nova-cell1-novncproxy-0\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990472 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-logs\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990495 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.990513 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.994075 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.997782 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:50 crc kubenswrapper[4971]: I0320 07:13:50.998162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-logs\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.006154 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.010553 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmxxp\" (UniqueName: \"kubernetes.io/projected/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-kube-api-access-cmxxp\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.013211 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pjff\" (UniqueName: \"kubernetes.io/projected/303bf07b-b266-451f-9ba4-9122c1d80109-kube-api-access-4pjff\") pod \"nova-cell1-novncproxy-0\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.021258 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-config-data\") pod \"nova-metadata-0\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " pod="openstack/nova-metadata-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.092592 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-svc\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.092694 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.092748 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.092771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.092792 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wqtk\" 
(UniqueName: \"kubernetes.io/projected/6c960029-95ef-4f0f-a277-9fb4a8e13198-kube-api-access-2wqtk\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.092815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-config\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.093817 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.093853 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.094440 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-svc\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.095034 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-config\") pod 
\"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.095441 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.107505 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.111237 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wqtk\" (UniqueName: \"kubernetes.io/projected/6c960029-95ef-4f0f-a277-9fb4a8e13198-kube-api-access-2wqtk\") pod \"dnsmasq-dns-69b4446475-xtj6r\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.166673 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.226035 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.254434 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.257142 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.321802 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jtmqm"] Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.376245 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5jkzb"] Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.378208 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.383983 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.384192 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.385312 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5jkzb"] Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.412913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jtmqm" event={"ID":"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027","Type":"ContainerStarted","Data":"7cf47dddbb101daab5b4385bbc62fd1353b1e87a8bc249b82bfce00dcd5950a3"} Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.513086 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.513156 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5rc59\" (UniqueName: \"kubernetes.io/projected/1be4310d-009b-4cca-8e17-2d392cad914b-kube-api-access-5rc59\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.513191 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-scripts\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.513224 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-config-data\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.614825 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.615154 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rc59\" (UniqueName: \"kubernetes.io/projected/1be4310d-009b-4cca-8e17-2d392cad914b-kube-api-access-5rc59\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.615193 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-scripts\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.615233 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-config-data\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.619474 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-scripts\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.619535 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-config-data\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.627877 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.633128 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " 
pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.633807 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rc59\" (UniqueName: \"kubernetes.io/projected/1be4310d-009b-4cca-8e17-2d392cad914b-kube-api-access-5rc59\") pod \"nova-cell1-conductor-db-sync-5jkzb\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: W0320 07:13:51.635243 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae621917_3cce_4814_8b8c_9a30ff8bdd20.slice/crio-c3861b0e880b71411cbfe17867aef1199f1a7541bf05ca140f1944a610c1cbb7 WatchSource:0}: Error finding container c3861b0e880b71411cbfe17867aef1199f1a7541bf05ca140f1944a610c1cbb7: Status 404 returned error can't find the container with id c3861b0e880b71411cbfe17867aef1199f1a7541bf05ca140f1944a610c1cbb7 Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.753500 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.927683 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:13:51 crc kubenswrapper[4971]: I0320 07:13:51.947168 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-xtj6r"] Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.019196 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:52 crc kubenswrapper[4971]: W0320 07:13:52.036639 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2f934ac_76df_4767_95d3_d7ba66f8d7c6.slice/crio-60ba9aa28b2469ed7a567c481ef1abe82649bda4ee9a7b53542d9440dfa303e8 WatchSource:0}: Error finding container 60ba9aa28b2469ed7a567c481ef1abe82649bda4ee9a7b53542d9440dfa303e8: Status 404 returned error can't find the container with id 60ba9aa28b2469ed7a567c481ef1abe82649bda4ee9a7b53542d9440dfa303e8 Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.051934 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.286860 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5jkzb"] Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.459059 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"303bf07b-b266-451f-9ba4-9122c1d80109","Type":"ContainerStarted","Data":"c9374e49741939d48446d52ac5d009c4a428a29a248a2d9d1b52e446c8b2614f"} Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.461894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"ae621917-3cce-4814-8b8c-9a30ff8bdd20","Type":"ContainerStarted","Data":"c3861b0e880b71411cbfe17867aef1199f1a7541bf05ca140f1944a610c1cbb7"} Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.469440 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" event={"ID":"1be4310d-009b-4cca-8e17-2d392cad914b","Type":"ContainerStarted","Data":"7172165a3b01bb8f687eab7947b0be06c88f3c2969c9c41c32f43a48a3fee7b2"} Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.472172 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac4c72d9-ed3f-4662-909e-07caccc657ba","Type":"ContainerStarted","Data":"52f0ebc017d0cd561e60a5b6f6fb8e7299ccee8ae9bea03e9b044775846156b7"} Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.475747 4971 generic.go:334] "Generic (PLEG): container finished" podID="6c960029-95ef-4f0f-a277-9fb4a8e13198" containerID="7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0" exitCode=0 Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.475799 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" event={"ID":"6c960029-95ef-4f0f-a277-9fb4a8e13198","Type":"ContainerDied","Data":"7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0"} Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.475819 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" event={"ID":"6c960029-95ef-4f0f-a277-9fb4a8e13198","Type":"ContainerStarted","Data":"71c21da65e6d8f5a8af0ef2989614fc4aee55ce536112d2c72725ba144509120"} Mar 20 07:13:52 crc kubenswrapper[4971]: I0320 07:13:52.493852 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f2f934ac-76df-4767-95d3-d7ba66f8d7c6","Type":"ContainerStarted","Data":"60ba9aa28b2469ed7a567c481ef1abe82649bda4ee9a7b53542d9440dfa303e8"} Mar 20 07:13:52 crc kubenswrapper[4971]: 
I0320 07:13:52.512381 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jtmqm" event={"ID":"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027","Type":"ContainerStarted","Data":"4410efaf3f9d8a87e04404e86f12371268c9d97f8691fe91ffd43229eced6dd3"} Mar 20 07:13:53 crc kubenswrapper[4971]: I0320 07:13:53.540885 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" event={"ID":"1be4310d-009b-4cca-8e17-2d392cad914b","Type":"ContainerStarted","Data":"10a37a5b0caa91deed8a9e36e1111c76d38c04830b8df96ba48ece28266de196"} Mar 20 07:13:53 crc kubenswrapper[4971]: I0320 07:13:53.546800 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" event={"ID":"6c960029-95ef-4f0f-a277-9fb4a8e13198","Type":"ContainerStarted","Data":"4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345"} Mar 20 07:13:53 crc kubenswrapper[4971]: I0320 07:13:53.547108 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:13:53 crc kubenswrapper[4971]: I0320 07:13:53.565840 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" podStartSLOduration=2.5658205020000002 podStartE2EDuration="2.565820502s" podCreationTimestamp="2026-03-20 07:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:13:53.561440698 +0000 UTC m=+1455.541314836" watchObservedRunningTime="2026-03-20 07:13:53.565820502 +0000 UTC m=+1455.545694640" Mar 20 07:13:53 crc kubenswrapper[4971]: I0320 07:13:53.568731 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jtmqm" podStartSLOduration=3.568713717 podStartE2EDuration="3.568713717s" podCreationTimestamp="2026-03-20 07:13:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:13:52.534644137 +0000 UTC m=+1454.514518275" watchObservedRunningTime="2026-03-20 07:13:53.568713717 +0000 UTC m=+1455.548587855" Mar 20 07:13:53 crc kubenswrapper[4971]: I0320 07:13:53.610514 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" podStartSLOduration=3.610487021 podStartE2EDuration="3.610487021s" podCreationTimestamp="2026-03-20 07:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:13:53.583537212 +0000 UTC m=+1455.563411370" watchObservedRunningTime="2026-03-20 07:13:53.610487021 +0000 UTC m=+1455.590361159" Mar 20 07:13:54 crc kubenswrapper[4971]: I0320 07:13:54.468318 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:54 crc kubenswrapper[4971]: I0320 07:13:54.476180 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.579149 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"303bf07b-b266-451f-9ba4-9122c1d80109","Type":"ContainerStarted","Data":"6196fd736045852e54d49d69c1722264de6f657acdbf76992e58582e67c832e2"} Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.579478 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="303bf07b-b266-451f-9ba4-9122c1d80109" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6196fd736045852e54d49d69c1722264de6f657acdbf76992e58582e67c832e2" gracePeriod=30 Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.583154 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"ae621917-3cce-4814-8b8c-9a30ff8bdd20","Type":"ContainerStarted","Data":"c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4"} Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.588240 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac4c72d9-ed3f-4662-909e-07caccc657ba","Type":"ContainerStarted","Data":"cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f"} Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.588276 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac4c72d9-ed3f-4662-909e-07caccc657ba","Type":"ContainerStarted","Data":"93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0"} Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.590923 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f2f934ac-76df-4767-95d3-d7ba66f8d7c6","Type":"ContainerStarted","Data":"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686"} Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.590978 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f2f934ac-76df-4767-95d3-d7ba66f8d7c6","Type":"ContainerStarted","Data":"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612"} Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.591045 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerName="nova-metadata-log" containerID="cri-o://8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612" gracePeriod=30 Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.591068 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerName="nova-metadata-metadata" 
containerID="cri-o://f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686" gracePeriod=30 Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.602410 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.341364095 podStartE2EDuration="6.602380348s" podCreationTimestamp="2026-03-20 07:13:50 +0000 UTC" firstStartedPulling="2026-03-20 07:13:51.934038527 +0000 UTC m=+1453.913912665" lastFinishedPulling="2026-03-20 07:13:55.19505477 +0000 UTC m=+1457.174928918" observedRunningTime="2026-03-20 07:13:56.60130833 +0000 UTC m=+1458.581182468" watchObservedRunningTime="2026-03-20 07:13:56.602380348 +0000 UTC m=+1458.582254496" Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.634590 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.08377451 podStartE2EDuration="6.634570764s" podCreationTimestamp="2026-03-20 07:13:50 +0000 UTC" firstStartedPulling="2026-03-20 07:13:51.639188164 +0000 UTC m=+1453.619062302" lastFinishedPulling="2026-03-20 07:13:55.189984418 +0000 UTC m=+1457.169858556" observedRunningTime="2026-03-20 07:13:56.627908281 +0000 UTC m=+1458.607782429" watchObservedRunningTime="2026-03-20 07:13:56.634570764 +0000 UTC m=+1458.614444902" Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 07:13:56.648198 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.5149990129999997 podStartE2EDuration="6.648179217s" podCreationTimestamp="2026-03-20 07:13:50 +0000 UTC" firstStartedPulling="2026-03-20 07:13:52.056838295 +0000 UTC m=+1454.036712433" lastFinishedPulling="2026-03-20 07:13:55.190018499 +0000 UTC m=+1457.169892637" observedRunningTime="2026-03-20 07:13:56.643451074 +0000 UTC m=+1458.623325252" watchObservedRunningTime="2026-03-20 07:13:56.648179217 +0000 UTC m=+1458.628053355" Mar 20 07:13:56 crc kubenswrapper[4971]: I0320 
07:13:56.662030 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.510747653 podStartE2EDuration="6.661993596s" podCreationTimestamp="2026-03-20 07:13:50 +0000 UTC" firstStartedPulling="2026-03-20 07:13:52.038704134 +0000 UTC m=+1454.018578272" lastFinishedPulling="2026-03-20 07:13:55.189950077 +0000 UTC m=+1457.169824215" observedRunningTime="2026-03-20 07:13:56.661503533 +0000 UTC m=+1458.641377671" watchObservedRunningTime="2026-03-20 07:13:56.661993596 +0000 UTC m=+1458.641867774" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.229680 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.367507 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmxxp\" (UniqueName: \"kubernetes.io/projected/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-kube-api-access-cmxxp\") pod \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.367585 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-logs\") pod \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.367634 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-config-data\") pod \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.367786 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-combined-ca-bundle\") pod \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\" (UID: \"f2f934ac-76df-4767-95d3-d7ba66f8d7c6\") " Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.368032 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-logs" (OuterVolumeSpecName: "logs") pod "f2f934ac-76df-4767-95d3-d7ba66f8d7c6" (UID: "f2f934ac-76df-4767-95d3-d7ba66f8d7c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.368461 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.382854 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-kube-api-access-cmxxp" (OuterVolumeSpecName: "kube-api-access-cmxxp") pod "f2f934ac-76df-4767-95d3-d7ba66f8d7c6" (UID: "f2f934ac-76df-4767-95d3-d7ba66f8d7c6"). InnerVolumeSpecName "kube-api-access-cmxxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.397900 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-config-data" (OuterVolumeSpecName: "config-data") pod "f2f934ac-76df-4767-95d3-d7ba66f8d7c6" (UID: "f2f934ac-76df-4767-95d3-d7ba66f8d7c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.398430 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2f934ac-76df-4767-95d3-d7ba66f8d7c6" (UID: "f2f934ac-76df-4767-95d3-d7ba66f8d7c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.470500 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.470913 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmxxp\" (UniqueName: \"kubernetes.io/projected/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-kube-api-access-cmxxp\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.470929 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f934ac-76df-4767-95d3-d7ba66f8d7c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.600942 4971 generic.go:334] "Generic (PLEG): container finished" podID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerID="f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686" exitCode=0 Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.600981 4971 generic.go:334] "Generic (PLEG): container finished" podID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerID="8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612" exitCode=143 Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.601119 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f2f934ac-76df-4767-95d3-d7ba66f8d7c6","Type":"ContainerDied","Data":"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686"} Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.601194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f2f934ac-76df-4767-95d3-d7ba66f8d7c6","Type":"ContainerDied","Data":"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612"} Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.601299 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f2f934ac-76df-4767-95d3-d7ba66f8d7c6","Type":"ContainerDied","Data":"60ba9aa28b2469ed7a567c481ef1abe82649bda4ee9a7b53542d9440dfa303e8"} Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.601334 4971 scope.go:117] "RemoveContainer" containerID="f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.602170 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.624789 4971 scope.go:117] "RemoveContainer" containerID="8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.647186 4971 scope.go:117] "RemoveContainer" containerID="f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686" Mar 20 07:13:57 crc kubenswrapper[4971]: E0320 07:13:57.647682 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686\": container with ID starting with f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686 not found: ID does not exist" containerID="f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.647728 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686"} err="failed to get container status \"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686\": rpc error: code = NotFound desc = could not find container \"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686\": container with ID starting with f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686 not found: ID does not exist" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.647770 4971 scope.go:117] "RemoveContainer" containerID="8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612" Mar 20 07:13:57 crc kubenswrapper[4971]: E0320 07:13:57.648897 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612\": container with ID starting with 
8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612 not found: ID does not exist" containerID="8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.648927 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612"} err="failed to get container status \"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612\": rpc error: code = NotFound desc = could not find container \"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612\": container with ID starting with 8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612 not found: ID does not exist" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.648947 4971 scope.go:117] "RemoveContainer" containerID="f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.649208 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686"} err="failed to get container status \"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686\": rpc error: code = NotFound desc = could not find container \"f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686\": container with ID starting with f6c2313185efe936176ebe865800a046b9f3f20a09b9e7dcb8c8d6a58676a686 not found: ID does not exist" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.649236 4971 scope.go:117] "RemoveContainer" containerID="8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.649436 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612"} err="failed to get container status 
\"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612\": rpc error: code = NotFound desc = could not find container \"8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612\": container with ID starting with 8b0b9278315a248e514f61bdcd12d19d781361527ce3f91888033bdba1bb2612 not found: ID does not exist" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.653744 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.709378 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.730023 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:57 crc kubenswrapper[4971]: E0320 07:13:57.731066 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerName="nova-metadata-metadata" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.731110 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerName="nova-metadata-metadata" Mar 20 07:13:57 crc kubenswrapper[4971]: E0320 07:13:57.731134 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerName="nova-metadata-log" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.731142 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerName="nova-metadata-log" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.731561 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" containerName="nova-metadata-metadata" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.731641 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" 
containerName="nova-metadata-log" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.739122 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.740820 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.743356 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.744560 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.883402 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.883457 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec353ef8-1da5-4ebe-8082-136699b3c38b-logs\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.883515 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgb7t\" (UniqueName: \"kubernetes.io/projected/ec353ef8-1da5-4ebe-8082-136699b3c38b-kube-api-access-rgb7t\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.883669 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-config-data\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.883687 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.985990 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.986113 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec353ef8-1da5-4ebe-8082-136699b3c38b-logs\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.986199 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgb7t\" (UniqueName: \"kubernetes.io/projected/ec353ef8-1da5-4ebe-8082-136699b3c38b-kube-api-access-rgb7t\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.986304 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-config-data\") pod \"nova-metadata-0\" (UID: 
\"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.986342 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.986657 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec353ef8-1da5-4ebe-8082-136699b3c38b-logs\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.991979 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.992304 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-config-data\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:57 crc kubenswrapper[4971]: I0320 07:13:57.998918 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:58 crc kubenswrapper[4971]: I0320 07:13:58.003860 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rgb7t\" (UniqueName: \"kubernetes.io/projected/ec353ef8-1da5-4ebe-8082-136699b3c38b-kube-api-access-rgb7t\") pod \"nova-metadata-0\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " pod="openstack/nova-metadata-0" Mar 20 07:13:58 crc kubenswrapper[4971]: I0320 07:13:58.058889 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:13:58 crc kubenswrapper[4971]: I0320 07:13:58.587141 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:13:58 crc kubenswrapper[4971]: I0320 07:13:58.613859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec353ef8-1da5-4ebe-8082-136699b3c38b","Type":"ContainerStarted","Data":"cb55c93cd7d893d265d5b52eb3996cf6701c5366623eb491785e66a8b7b70dc5"} Mar 20 07:13:58 crc kubenswrapper[4971]: I0320 07:13:58.743273 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f934ac-76df-4767-95d3-d7ba66f8d7c6" path="/var/lib/kubelet/pods/f2f934ac-76df-4767-95d3-d7ba66f8d7c6/volumes" Mar 20 07:13:59 crc kubenswrapper[4971]: I0320 07:13:59.624021 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec353ef8-1da5-4ebe-8082-136699b3c38b","Type":"ContainerStarted","Data":"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2"} Mar 20 07:13:59 crc kubenswrapper[4971]: I0320 07:13:59.624291 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec353ef8-1da5-4ebe-8082-136699b3c38b","Type":"ContainerStarted","Data":"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e"} Mar 20 07:13:59 crc kubenswrapper[4971]: I0320 07:13:59.625124 4971 generic.go:334] "Generic (PLEG): container finished" podID="e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" containerID="4410efaf3f9d8a87e04404e86f12371268c9d97f8691fe91ffd43229eced6dd3" exitCode=0 Mar 20 07:13:59 crc 
kubenswrapper[4971]: I0320 07:13:59.625168 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jtmqm" event={"ID":"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027","Type":"ContainerDied","Data":"4410efaf3f9d8a87e04404e86f12371268c9d97f8691fe91ffd43229eced6dd3"} Mar 20 07:13:59 crc kubenswrapper[4971]: I0320 07:13:59.652073 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.652056935 podStartE2EDuration="2.652056935s" podCreationTimestamp="2026-03-20 07:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:13:59.646883931 +0000 UTC m=+1461.626758069" watchObservedRunningTime="2026-03-20 07:13:59.652056935 +0000 UTC m=+1461.631931073" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.148540 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566514-7qv5f"] Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.153087 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566514-7qv5f" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.157955 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.158335 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.159135 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.166088 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566514-7qv5f"] Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.336872 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwfpv\" (UniqueName: \"kubernetes.io/projected/4ad43e1b-a41b-446b-8690-8fede9e1fc7f-kube-api-access-qwfpv\") pod \"auto-csr-approver-29566514-7qv5f\" (UID: \"4ad43e1b-a41b-446b-8690-8fede9e1fc7f\") " pod="openshift-infra/auto-csr-approver-29566514-7qv5f" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.439669 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwfpv\" (UniqueName: \"kubernetes.io/projected/4ad43e1b-a41b-446b-8690-8fede9e1fc7f-kube-api-access-qwfpv\") pod \"auto-csr-approver-29566514-7qv5f\" (UID: \"4ad43e1b-a41b-446b-8690-8fede9e1fc7f\") " pod="openshift-infra/auto-csr-approver-29566514-7qv5f" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.457853 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwfpv\" (UniqueName: \"kubernetes.io/projected/4ad43e1b-a41b-446b-8690-8fede9e1fc7f-kube-api-access-qwfpv\") pod \"auto-csr-approver-29566514-7qv5f\" (UID: \"4ad43e1b-a41b-446b-8690-8fede9e1fc7f\") " 
pod="openshift-infra/auto-csr-approver-29566514-7qv5f" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.486645 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566514-7qv5f" Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.640384 4971 generic.go:334] "Generic (PLEG): container finished" podID="1be4310d-009b-4cca-8e17-2d392cad914b" containerID="10a37a5b0caa91deed8a9e36e1111c76d38c04830b8df96ba48ece28266de196" exitCode=0 Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.640710 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" event={"ID":"1be4310d-009b-4cca-8e17-2d392cad914b","Type":"ContainerDied","Data":"10a37a5b0caa91deed8a9e36e1111c76d38c04830b8df96ba48ece28266de196"} Mar 20 07:14:00 crc kubenswrapper[4971]: I0320 07:14:00.998413 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566514-7qv5f"] Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.108042 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.108097 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.118569 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.151522 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.167860 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.168410 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.226330 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.256746 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.284925 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-config-data\") pod \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.285083 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-scripts\") pod \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.285121 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-combined-ca-bundle\") pod \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " Mar 20 07:14:01 crc 
kubenswrapper[4971]: I0320 07:14:01.285205 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxmds\" (UniqueName: \"kubernetes.io/projected/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-kube-api-access-bxmds\") pod \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\" (UID: \"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027\") " Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.291706 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-scripts" (OuterVolumeSpecName: "scripts") pod "e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" (UID: "e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.298362 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-kube-api-access-bxmds" (OuterVolumeSpecName: "kube-api-access-bxmds") pod "e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" (UID: "e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027"). InnerVolumeSpecName "kube-api-access-bxmds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.328813 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-config-data" (OuterVolumeSpecName: "config-data") pod "e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" (UID: "e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.354300 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-jklwc"] Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.354545 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" podUID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" containerName="dnsmasq-dns" containerID="cri-o://f2e9b46ae28fea0f5df486fd7db6b7a3f09d30d7d0d8a788febe86c6c026da09" gracePeriod=10 Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.367492 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" (UID: "e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.387473 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.388064 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.388205 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.388297 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxmds\" (UniqueName: 
\"kubernetes.io/projected/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027-kube-api-access-bxmds\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:01 crc kubenswrapper[4971]: E0320 07:14:01.616346 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbc35f7f_6e4d_492d_b82a_81c5ba5e3712.slice/crio-f2e9b46ae28fea0f5df486fd7db6b7a3f09d30d7d0d8a788febe86c6c026da09.scope\": RecentStats: unable to find data in memory cache]" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.665227 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566514-7qv5f" event={"ID":"4ad43e1b-a41b-446b-8690-8fede9e1fc7f","Type":"ContainerStarted","Data":"e84d2d49d6aa35c5c0f8dfd2b23ef071c65f56cb68d9de7a0dbb261dff73b6d3"} Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.669651 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jtmqm" event={"ID":"e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027","Type":"ContainerDied","Data":"7cf47dddbb101daab5b4385bbc62fd1353b1e87a8bc249b82bfce00dcd5950a3"} Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.669673 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cf47dddbb101daab5b4385bbc62fd1353b1e87a8bc249b82bfce00dcd5950a3" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.669724 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jtmqm" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.681774 4971 generic.go:334] "Generic (PLEG): container finished" podID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" containerID="f2e9b46ae28fea0f5df486fd7db6b7a3f09d30d7d0d8a788febe86c6c026da09" exitCode=0 Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.681922 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" event={"ID":"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712","Type":"ContainerDied","Data":"f2e9b46ae28fea0f5df486fd7db6b7a3f09d30d7d0d8a788febe86c6c026da09"} Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.738899 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.827896 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.933264 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.933557 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerName="nova-metadata-log" containerID="cri-o://fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e" gracePeriod=30 Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.934087 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerName="nova-metadata-metadata" containerID="cri-o://f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2" gracePeriod=30 Mar 20 07:14:01 crc kubenswrapper[4971]: I0320 07:14:01.941671 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.117705 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-swift-storage-0\") pod \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.117768 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z96xm\" (UniqueName: \"kubernetes.io/projected/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-kube-api-access-z96xm\") pod \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.117879 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-nb\") pod \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.117998 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-svc\") pod \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.118027 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-config\") pod \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.118064 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-sb\") pod \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\" (UID: \"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.130843 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-kube-api-access-z96xm" (OuterVolumeSpecName: "kube-api-access-z96xm") pod "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" (UID: "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712"). InnerVolumeSpecName "kube-api-access-z96xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.168152 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.186467 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" (UID: "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.196761 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.209870 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.216004 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" (UID: "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.220094 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.220120 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.220129 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z96xm\" (UniqueName: \"kubernetes.io/projected/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-kube-api-access-z96xm\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.222026 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" (UID: 
"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.240011 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-config" (OuterVolumeSpecName: "config") pod "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" (UID: "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.255724 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" (UID: "dbc35f7f-6e4d-492d-b82a-81c5ba5e3712"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.326251 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-combined-ca-bundle\") pod \"1be4310d-009b-4cca-8e17-2d392cad914b\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.326556 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-scripts\") pod \"1be4310d-009b-4cca-8e17-2d392cad914b\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.326583 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-config-data\") pod 
\"1be4310d-009b-4cca-8e17-2d392cad914b\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.326645 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rc59\" (UniqueName: \"kubernetes.io/projected/1be4310d-009b-4cca-8e17-2d392cad914b-kube-api-access-5rc59\") pod \"1be4310d-009b-4cca-8e17-2d392cad914b\" (UID: \"1be4310d-009b-4cca-8e17-2d392cad914b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.327328 4971 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.327344 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.327353 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.330527 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-scripts" (OuterVolumeSpecName: "scripts") pod "1be4310d-009b-4cca-8e17-2d392cad914b" (UID: "1be4310d-009b-4cca-8e17-2d392cad914b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.330696 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be4310d-009b-4cca-8e17-2d392cad914b-kube-api-access-5rc59" (OuterVolumeSpecName: "kube-api-access-5rc59") pod "1be4310d-009b-4cca-8e17-2d392cad914b" (UID: "1be4310d-009b-4cca-8e17-2d392cad914b"). InnerVolumeSpecName "kube-api-access-5rc59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.356087 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-config-data" (OuterVolumeSpecName: "config-data") pod "1be4310d-009b-4cca-8e17-2d392cad914b" (UID: "1be4310d-009b-4cca-8e17-2d392cad914b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.362910 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1be4310d-009b-4cca-8e17-2d392cad914b" (UID: "1be4310d-009b-4cca-8e17-2d392cad914b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.421971 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.430458 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.430484 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.430494 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be4310d-009b-4cca-8e17-2d392cad914b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.430503 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rc59\" (UniqueName: \"kubernetes.io/projected/1be4310d-009b-4cca-8e17-2d392cad914b-kube-api-access-5rc59\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.593951 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.716103 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" event={"ID":"dbc35f7f-6e4d-492d-b82a-81c5ba5e3712","Type":"ContainerDied","Data":"f197ab2a9394a98c0bb54f40df38a86fdabbe8ecbc5f680ae8db657141d358a0"} Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.716483 4971 scope.go:117] "RemoveContainer" containerID="f2e9b46ae28fea0f5df486fd7db6b7a3f09d30d7d0d8a788febe86c6c026da09" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.716701 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-jklwc" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.725745 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 07:14:02 crc kubenswrapper[4971]: E0320 07:14:02.726292 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" containerName="dnsmasq-dns" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726314 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" containerName="dnsmasq-dns" Mar 20 07:14:02 crc kubenswrapper[4971]: E0320 07:14:02.726341 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" containerName="nova-manage" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726350 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" containerName="nova-manage" Mar 20 07:14:02 crc kubenswrapper[4971]: E0320 07:14:02.726371 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerName="nova-metadata-log" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726379 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerName="nova-metadata-log" Mar 20 07:14:02 crc kubenswrapper[4971]: E0320 07:14:02.726413 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerName="nova-metadata-metadata" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726422 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerName="nova-metadata-metadata" Mar 20 07:14:02 crc kubenswrapper[4971]: E0320 07:14:02.726436 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be4310d-009b-4cca-8e17-2d392cad914b" containerName="nova-cell1-conductor-db-sync" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726444 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be4310d-009b-4cca-8e17-2d392cad914b" containerName="nova-cell1-conductor-db-sync" Mar 20 07:14:02 crc kubenswrapper[4971]: E0320 07:14:02.726457 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" containerName="init" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726465 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" containerName="init" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726693 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" containerName="dnsmasq-dns" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726709 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be4310d-009b-4cca-8e17-2d392cad914b" containerName="nova-cell1-conductor-db-sync" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726725 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" containerName="nova-manage" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726746 4971 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerName="nova-metadata-metadata" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.726764 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerName="nova-metadata-log" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.727566 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.728165 4971 generic.go:334] "Generic (PLEG): container finished" podID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerID="f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2" exitCode=0 Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.728193 4971 generic.go:334] "Generic (PLEG): container finished" podID="ec353ef8-1da5-4ebe-8082-136699b3c38b" containerID="fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e" exitCode=143 Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.728267 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec353ef8-1da5-4ebe-8082-136699b3c38b","Type":"ContainerDied","Data":"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2"} Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.728295 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec353ef8-1da5-4ebe-8082-136699b3c38b","Type":"ContainerDied","Data":"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e"} Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.728306 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec353ef8-1da5-4ebe-8082-136699b3c38b","Type":"ContainerDied","Data":"cb55c93cd7d893d265d5b52eb3996cf6701c5366623eb491785e66a8b7b70dc5"} Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.728369 4971 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.736172 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgb7t\" (UniqueName: \"kubernetes.io/projected/ec353ef8-1da5-4ebe-8082-136699b3c38b-kube-api-access-rgb7t\") pod \"ec353ef8-1da5-4ebe-8082-136699b3c38b\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.736213 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec353ef8-1da5-4ebe-8082-136699b3c38b-logs\") pod \"ec353ef8-1da5-4ebe-8082-136699b3c38b\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.736302 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-combined-ca-bundle\") pod \"ec353ef8-1da5-4ebe-8082-136699b3c38b\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.736321 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-config-data\") pod \"ec353ef8-1da5-4ebe-8082-136699b3c38b\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.736382 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-nova-metadata-tls-certs\") pod \"ec353ef8-1da5-4ebe-8082-136699b3c38b\" (UID: \"ec353ef8-1da5-4ebe-8082-136699b3c38b\") " Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.737152 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/ec353ef8-1da5-4ebe-8082-136699b3c38b-logs" (OuterVolumeSpecName: "logs") pod "ec353ef8-1da5-4ebe-8082-136699b3c38b" (UID: "ec353ef8-1da5-4ebe-8082-136699b3c38b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.737205 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-log" containerID="cri-o://93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0" gracePeriod=30 Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.737505 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.740488 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-api" containerID="cri-o://cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f" gracePeriod=30 Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.748528 4971 scope.go:117] "RemoveContainer" containerID="ea02c0a0041a03dd80a0b5d7c4b95d6e792ce5703b4753dab7ae3049e7733f06" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.767542 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec353ef8-1da5-4ebe-8082-136699b3c38b-kube-api-access-rgb7t" (OuterVolumeSpecName: "kube-api-access-rgb7t") pod "ec353ef8-1da5-4ebe-8082-136699b3c38b" (UID: "ec353ef8-1da5-4ebe-8082-136699b3c38b"). InnerVolumeSpecName "kube-api-access-rgb7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.830517 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ec353ef8-1da5-4ebe-8082-136699b3c38b" (UID: "ec353ef8-1da5-4ebe-8082-136699b3c38b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.838251 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.838372 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.838448 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crscl\" (UniqueName: \"kubernetes.io/projected/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-kube-api-access-crscl\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.838549 4971 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" 
Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.838561 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgb7t\" (UniqueName: \"kubernetes.io/projected/ec353ef8-1da5-4ebe-8082-136699b3c38b-kube-api-access-rgb7t\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.838571 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec353ef8-1da5-4ebe-8082-136699b3c38b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.844816 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec353ef8-1da5-4ebe-8082-136699b3c38b" (UID: "ec353ef8-1da5-4ebe-8082-136699b3c38b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.868764 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-config-data" (OuterVolumeSpecName: "config-data") pod "ec353ef8-1da5-4ebe-8082-136699b3c38b" (UID: "ec353ef8-1da5-4ebe-8082-136699b3c38b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.931829 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.931864 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5jkzb" event={"ID":"1be4310d-009b-4cca-8e17-2d392cad914b","Type":"ContainerDied","Data":"7172165a3b01bb8f687eab7947b0be06c88f3c2969c9c41c32f43a48a3fee7b2"} Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.931885 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7172165a3b01bb8f687eab7947b0be06c88f3c2969c9c41c32f43a48a3fee7b2" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.939754 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crscl\" (UniqueName: \"kubernetes.io/projected/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-kube-api-access-crscl\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.939872 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.939924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.939995 4971 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.940009 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec353ef8-1da5-4ebe-8082-136699b3c38b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.943446 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.948892 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.972808 4971 scope.go:117] "RemoveContainer" containerID="f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2" Mar 20 07:14:02 crc kubenswrapper[4971]: I0320 07:14:02.995207 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crscl\" (UniqueName: \"kubernetes.io/projected/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-kube-api-access-crscl\") pod \"nova-cell1-conductor-0\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") " pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.016508 4971 scope.go:117] "RemoveContainer" containerID="fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.022802 4971 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-jklwc"] Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.077156 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-jklwc"] Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.077769 4971 scope.go:117] "RemoveContainer" containerID="f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2" Mar 20 07:14:03 crc kubenswrapper[4971]: E0320 07:14:03.084823 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2\": container with ID starting with f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2 not found: ID does not exist" containerID="f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.084869 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2"} err="failed to get container status \"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2\": rpc error: code = NotFound desc = could not find container \"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2\": container with ID starting with f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2 not found: ID does not exist" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.084894 4971 scope.go:117] "RemoveContainer" containerID="fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e" Mar 20 07:14:03 crc kubenswrapper[4971]: E0320 07:14:03.101761 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e\": container with ID starting with fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e 
not found: ID does not exist" containerID="fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.101809 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e"} err="failed to get container status \"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e\": rpc error: code = NotFound desc = could not find container \"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e\": container with ID starting with fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e not found: ID does not exist" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.101841 4971 scope.go:117] "RemoveContainer" containerID="f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.108202 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2"} err="failed to get container status \"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2\": rpc error: code = NotFound desc = could not find container \"f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2\": container with ID starting with f75a060831d79391afeb2e25331b7816e39f6ef227ceb9e23beb153bee57b2e2 not found: ID does not exist" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.108267 4971 scope.go:117] "RemoveContainer" containerID="fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.114489 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e"} err="failed to get container status \"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e\": rpc 
error: code = NotFound desc = could not find container \"fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e\": container with ID starting with fe20a539b5624f48757ba2d8745f68a33a89b7b2b5dca6ba8ffe552eb699007e not found: ID does not exist" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.123685 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.131324 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.144625 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.146388 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.151209 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.160545 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.171205 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.242184 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.253287 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.253424 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fdc792-9230-4274-b23d-dd4a21bfc09b-logs\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.253483 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-config-data\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.253519 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8j2t\" (UniqueName: \"kubernetes.io/projected/77fdc792-9230-4274-b23d-dd4a21bfc09b-kube-api-access-x8j2t\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.253760 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc 
kubenswrapper[4971]: I0320 07:14:03.355377 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.355823 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fdc792-9230-4274-b23d-dd4a21bfc09b-logs\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.355882 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-config-data\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.355915 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8j2t\" (UniqueName: \"kubernetes.io/projected/77fdc792-9230-4274-b23d-dd4a21bfc09b-kube-api-access-x8j2t\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.355960 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.356802 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/77fdc792-9230-4274-b23d-dd4a21bfc09b-logs\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.361229 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.361535 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.363189 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-config-data\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.378405 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8j2t\" (UniqueName: \"kubernetes.io/projected/77fdc792-9230-4274-b23d-dd4a21bfc09b-kube-api-access-x8j2t\") pod \"nova-metadata-0\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.471213 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.688208 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.751188 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b9450f53-fcf0-4799-a8da-a9d0bc7016ac","Type":"ContainerStarted","Data":"ed45809ce534b67891f0d3da975655ba7a58aa3ba30b4df0d4f5c6f6028c0e02"} Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.757421 4971 generic.go:334] "Generic (PLEG): container finished" podID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerID="93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0" exitCode=143 Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.757479 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac4c72d9-ed3f-4662-909e-07caccc657ba","Type":"ContainerDied","Data":"93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0"} Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.759592 4971 generic.go:334] "Generic (PLEG): container finished" podID="4ad43e1b-a41b-446b-8690-8fede9e1fc7f" containerID="9382cb2feb2098aa15b65f8346c5de40c2576c9f1856eeea4da89cdb6f2581f4" exitCode=0 Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.759760 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ae621917-3cce-4814-8b8c-9a30ff8bdd20" containerName="nova-scheduler-scheduler" containerID="cri-o://c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4" gracePeriod=30 Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.760052 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566514-7qv5f" 
event={"ID":"4ad43e1b-a41b-446b-8690-8fede9e1fc7f","Type":"ContainerDied","Data":"9382cb2feb2098aa15b65f8346c5de40c2576c9f1856eeea4da89cdb6f2581f4"} Mar 20 07:14:03 crc kubenswrapper[4971]: W0320 07:14:03.949707 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77fdc792_9230_4274_b23d_dd4a21bfc09b.slice/crio-0314d5800618232183c3f0e47ca589eedb21be6a6dd41d0937460eb4a2bfc049 WatchSource:0}: Error finding container 0314d5800618232183c3f0e47ca589eedb21be6a6dd41d0937460eb4a2bfc049: Status 404 returned error can't find the container with id 0314d5800618232183c3f0e47ca589eedb21be6a6dd41d0937460eb4a2bfc049 Mar 20 07:14:03 crc kubenswrapper[4971]: I0320 07:14:03.954622 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.745677 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc35f7f-6e4d-492d-b82a-81c5ba5e3712" path="/var/lib/kubelet/pods/dbc35f7f-6e4d-492d-b82a-81c5ba5e3712/volumes" Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.746775 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec353ef8-1da5-4ebe-8082-136699b3c38b" path="/var/lib/kubelet/pods/ec353ef8-1da5-4ebe-8082-136699b3c38b/volumes" Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.782591 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b9450f53-fcf0-4799-a8da-a9d0bc7016ac","Type":"ContainerStarted","Data":"615ced8f9e0feb5ebdac89eb7107d8990b6103b1111ede4cffc99ba5cd482835"} Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.782765 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.786513 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"77fdc792-9230-4274-b23d-dd4a21bfc09b","Type":"ContainerStarted","Data":"065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275"} Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.786556 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77fdc792-9230-4274-b23d-dd4a21bfc09b","Type":"ContainerStarted","Data":"3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b"} Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.786573 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77fdc792-9230-4274-b23d-dd4a21bfc09b","Type":"ContainerStarted","Data":"0314d5800618232183c3f0e47ca589eedb21be6a6dd41d0937460eb4a2bfc049"} Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.812055 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.812034016 podStartE2EDuration="2.812034016s" podCreationTimestamp="2026-03-20 07:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:04.803588207 +0000 UTC m=+1466.783462345" watchObservedRunningTime="2026-03-20 07:14:04.812034016 +0000 UTC m=+1466.791908154" Mar 20 07:14:04 crc kubenswrapper[4971]: I0320 07:14:04.825742 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.825726392 podStartE2EDuration="1.825726392s" podCreationTimestamp="2026-03-20 07:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:04.824044878 +0000 UTC m=+1466.803919026" watchObservedRunningTime="2026-03-20 07:14:04.825726392 +0000 UTC m=+1466.805600530" Mar 20 07:14:05 crc kubenswrapper[4971]: I0320 07:14:05.139244 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566514-7qv5f" Mar 20 07:14:05 crc kubenswrapper[4971]: I0320 07:14:05.294980 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwfpv\" (UniqueName: \"kubernetes.io/projected/4ad43e1b-a41b-446b-8690-8fede9e1fc7f-kube-api-access-qwfpv\") pod \"4ad43e1b-a41b-446b-8690-8fede9e1fc7f\" (UID: \"4ad43e1b-a41b-446b-8690-8fede9e1fc7f\") " Mar 20 07:14:05 crc kubenswrapper[4971]: I0320 07:14:05.301644 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad43e1b-a41b-446b-8690-8fede9e1fc7f-kube-api-access-qwfpv" (OuterVolumeSpecName: "kube-api-access-qwfpv") pod "4ad43e1b-a41b-446b-8690-8fede9e1fc7f" (UID: "4ad43e1b-a41b-446b-8690-8fede9e1fc7f"). InnerVolumeSpecName "kube-api-access-qwfpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:05 crc kubenswrapper[4971]: I0320 07:14:05.398023 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwfpv\" (UniqueName: \"kubernetes.io/projected/4ad43e1b-a41b-446b-8690-8fede9e1fc7f-kube-api-access-qwfpv\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:05 crc kubenswrapper[4971]: I0320 07:14:05.803113 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566514-7qv5f" Mar 20 07:14:05 crc kubenswrapper[4971]: I0320 07:14:05.803897 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566514-7qv5f" event={"ID":"4ad43e1b-a41b-446b-8690-8fede9e1fc7f","Type":"ContainerDied","Data":"e84d2d49d6aa35c5c0f8dfd2b23ef071c65f56cb68d9de7a0dbb261dff73b6d3"} Mar 20 07:14:05 crc kubenswrapper[4971]: I0320 07:14:05.805457 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e84d2d49d6aa35c5c0f8dfd2b23ef071c65f56cb68d9de7a0dbb261dff73b6d3" Mar 20 07:14:06 crc kubenswrapper[4971]: E0320 07:14:06.110886 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:14:06 crc kubenswrapper[4971]: E0320 07:14:06.119136 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:14:06 crc kubenswrapper[4971]: E0320 07:14:06.124550 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:14:06 crc kubenswrapper[4971]: E0320 07:14:06.124840 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ae621917-3cce-4814-8b8c-9a30ff8bdd20" containerName="nova-scheduler-scheduler" Mar 20 07:14:06 crc kubenswrapper[4971]: I0320 07:14:06.231174 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566508-k9psq"] Mar 20 07:14:06 crc kubenswrapper[4971]: I0320 07:14:06.242282 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566508-k9psq"] Mar 20 07:14:06 crc kubenswrapper[4971]: I0320 07:14:06.510246 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 07:14:06 crc kubenswrapper[4971]: I0320 07:14:06.746713 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fef1c76-0cdd-413f-b290-dabfa3b59259" path="/var/lib/kubelet/pods/4fef1c76-0cdd-413f-b290-dabfa3b59259/volumes" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.413735 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.544843 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-combined-ca-bundle\") pod \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.544938 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-config-data\") pod \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.545008 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98dkq\" (UniqueName: \"kubernetes.io/projected/ae621917-3cce-4814-8b8c-9a30ff8bdd20-kube-api-access-98dkq\") pod \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\" (UID: \"ae621917-3cce-4814-8b8c-9a30ff8bdd20\") " Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.550823 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae621917-3cce-4814-8b8c-9a30ff8bdd20-kube-api-access-98dkq" (OuterVolumeSpecName: "kube-api-access-98dkq") pod "ae621917-3cce-4814-8b8c-9a30ff8bdd20" (UID: "ae621917-3cce-4814-8b8c-9a30ff8bdd20"). InnerVolumeSpecName "kube-api-access-98dkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.577623 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae621917-3cce-4814-8b8c-9a30ff8bdd20" (UID: "ae621917-3cce-4814-8b8c-9a30ff8bdd20"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.596734 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-config-data" (OuterVolumeSpecName: "config-data") pod "ae621917-3cce-4814-8b8c-9a30ff8bdd20" (UID: "ae621917-3cce-4814-8b8c-9a30ff8bdd20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.624632 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.650730 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.650792 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98dkq\" (UniqueName: \"kubernetes.io/projected/ae621917-3cce-4814-8b8c-9a30ff8bdd20-kube-api-access-98dkq\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.650805 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae621917-3cce-4814-8b8c-9a30ff8bdd20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.751706 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4c72d9-ed3f-4662-909e-07caccc657ba-logs\") pod \"ac4c72d9-ed3f-4662-909e-07caccc657ba\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.751808 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbfb8\" (UniqueName: 
\"kubernetes.io/projected/ac4c72d9-ed3f-4662-909e-07caccc657ba-kube-api-access-cbfb8\") pod \"ac4c72d9-ed3f-4662-909e-07caccc657ba\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.751958 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-config-data\") pod \"ac4c72d9-ed3f-4662-909e-07caccc657ba\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.752010 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-combined-ca-bundle\") pod \"ac4c72d9-ed3f-4662-909e-07caccc657ba\" (UID: \"ac4c72d9-ed3f-4662-909e-07caccc657ba\") " Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.752534 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4c72d9-ed3f-4662-909e-07caccc657ba-logs" (OuterVolumeSpecName: "logs") pod "ac4c72d9-ed3f-4662-909e-07caccc657ba" (UID: "ac4c72d9-ed3f-4662-909e-07caccc657ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.757173 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac4c72d9-ed3f-4662-909e-07caccc657ba-kube-api-access-cbfb8" (OuterVolumeSpecName: "kube-api-access-cbfb8") pod "ac4c72d9-ed3f-4662-909e-07caccc657ba" (UID: "ac4c72d9-ed3f-4662-909e-07caccc657ba"). InnerVolumeSpecName "kube-api-access-cbfb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.785724 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-config-data" (OuterVolumeSpecName: "config-data") pod "ac4c72d9-ed3f-4662-909e-07caccc657ba" (UID: "ac4c72d9-ed3f-4662-909e-07caccc657ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.790838 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac4c72d9-ed3f-4662-909e-07caccc657ba" (UID: "ac4c72d9-ed3f-4662-909e-07caccc657ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.827925 4971 generic.go:334] "Generic (PLEG): container finished" podID="ae621917-3cce-4814-8b8c-9a30ff8bdd20" containerID="c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4" exitCode=0 Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.828002 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ae621917-3cce-4814-8b8c-9a30ff8bdd20","Type":"ContainerDied","Data":"c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4"} Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.828038 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ae621917-3cce-4814-8b8c-9a30ff8bdd20","Type":"ContainerDied","Data":"c3861b0e880b71411cbfe17867aef1199f1a7541bf05ca140f1944a610c1cbb7"} Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.828037 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.828056 4971 scope.go:117] "RemoveContainer" containerID="c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.833246 4971 generic.go:334] "Generic (PLEG): container finished" podID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerID="cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f" exitCode=0 Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.833285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac4c72d9-ed3f-4662-909e-07caccc657ba","Type":"ContainerDied","Data":"cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f"} Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.833309 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac4c72d9-ed3f-4662-909e-07caccc657ba","Type":"ContainerDied","Data":"52f0ebc017d0cd561e60a5b6f6fb8e7299ccee8ae9bea03e9b044775846156b7"} Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.833315 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.855353 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.855841 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac4c72d9-ed3f-4662-909e-07caccc657ba-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.855869 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbfb8\" (UniqueName: \"kubernetes.io/projected/ac4c72d9-ed3f-4662-909e-07caccc657ba-kube-api-access-cbfb8\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.855936 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac4c72d9-ed3f-4662-909e-07caccc657ba-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.856636 4971 scope.go:117] "RemoveContainer" containerID="c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4" Mar 20 07:14:07 crc kubenswrapper[4971]: E0320 07:14:07.857262 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4\": container with ID starting with c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4 not found: ID does not exist" containerID="c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.857424 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4"} 
err="failed to get container status \"c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4\": rpc error: code = NotFound desc = could not find container \"c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4\": container with ID starting with c0c01250fed259451f47634f50318bd526a1fbc02772c3f46eeb9bf1a0facbc4 not found: ID does not exist" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.857522 4971 scope.go:117] "RemoveContainer" containerID="cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.883446 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.884819 4971 scope.go:117] "RemoveContainer" containerID="93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.898485 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.916961 4971 scope.go:117] "RemoveContainer" containerID="cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f" Mar 20 07:14:07 crc kubenswrapper[4971]: E0320 07:14:07.918614 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f\": container with ID starting with cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f not found: ID does not exist" containerID="cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.918642 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f"} err="failed to get container status 
\"cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f\": rpc error: code = NotFound desc = could not find container \"cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f\": container with ID starting with cacd8e180d09a0791bf4d41328b376e0d5f1aeb23ae7c05247daa14b953d750f not found: ID does not exist" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.918663 4971 scope.go:117] "RemoveContainer" containerID="93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0" Mar 20 07:14:07 crc kubenswrapper[4971]: E0320 07:14:07.919008 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0\": container with ID starting with 93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0 not found: ID does not exist" containerID="93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.919029 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0"} err="failed to get container status \"93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0\": rpc error: code = NotFound desc = could not find container \"93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0\": container with ID starting with 93c5b0e83a8c18ba03c2a92a4de3bebd7a9908cfa79fc6c26aba3e5cf43e27b0 not found: ID does not exist" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.919613 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.932862 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:07 crc kubenswrapper[4971]: E0320 07:14:07.933343 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4ad43e1b-a41b-446b-8690-8fede9e1fc7f" containerName="oc" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.933360 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad43e1b-a41b-446b-8690-8fede9e1fc7f" containerName="oc" Mar 20 07:14:07 crc kubenswrapper[4971]: E0320 07:14:07.933373 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-log" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.933379 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-log" Mar 20 07:14:07 crc kubenswrapper[4971]: E0320 07:14:07.933395 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae621917-3cce-4814-8b8c-9a30ff8bdd20" containerName="nova-scheduler-scheduler" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.933402 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae621917-3cce-4814-8b8c-9a30ff8bdd20" containerName="nova-scheduler-scheduler" Mar 20 07:14:07 crc kubenswrapper[4971]: E0320 07:14:07.933413 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-api" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.933419 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-api" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.933592 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-log" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.933635 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae621917-3cce-4814-8b8c-9a30ff8bdd20" containerName="nova-scheduler-scheduler" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.933652 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ad43e1b-a41b-446b-8690-8fede9e1fc7f" containerName="oc" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.933665 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" containerName="nova-api-api" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.934292 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.936144 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.942499 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.952293 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.968685 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.970290 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.972688 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 07:14:07 crc kubenswrapper[4971]: I0320 07:14:07.982714 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.059222 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6cpc\" (UniqueName: \"kubernetes.io/projected/65f4685d-7a73-4211-85e2-2bbcb7370be6-kube-api-access-m6cpc\") pod \"nova-scheduler-0\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.059280 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-config-data\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.059332 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkp8r\" (UniqueName: \"kubernetes.io/projected/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-kube-api-access-hkp8r\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.059363 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.059420 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.059436 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-config-data\") pod \"nova-scheduler-0\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.059476 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-logs\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.162057 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-config-data\") pod \"nova-scheduler-0\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.162516 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-logs\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.162677 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6cpc\" (UniqueName: \"kubernetes.io/projected/65f4685d-7a73-4211-85e2-2bbcb7370be6-kube-api-access-m6cpc\") pod \"nova-scheduler-0\" 
(UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.162774 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-config-data\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.162888 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkp8r\" (UniqueName: \"kubernetes.io/projected/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-kube-api-access-hkp8r\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.162987 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.163129 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.166018 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-logs\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.171423 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-config-data\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.171573 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.171896 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-config-data\") pod \"nova-scheduler-0\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.172086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.187571 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6cpc\" (UniqueName: \"kubernetes.io/projected/65f4685d-7a73-4211-85e2-2bbcb7370be6-kube-api-access-m6cpc\") pod \"nova-scheduler-0\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.189561 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkp8r\" (UniqueName: \"kubernetes.io/projected/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-kube-api-access-hkp8r\") pod \"nova-api-0\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " pod="openstack/nova-api-0" Mar 20 07:14:08 crc 
kubenswrapper[4971]: I0320 07:14:08.264529 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.274510 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.324280 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.742079 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac4c72d9-ed3f-4662-909e-07caccc657ba" path="/var/lib/kubelet/pods/ac4c72d9-ed3f-4662-909e-07caccc657ba/volumes" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.742859 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae621917-3cce-4814-8b8c-9a30ff8bdd20" path="/var/lib/kubelet/pods/ae621917-3cce-4814-8b8c-9a30ff8bdd20/volumes" Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.805390 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:08 crc kubenswrapper[4971]: W0320 07:14:08.811648 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f4685d_7a73_4211_85e2_2bbcb7370be6.slice/crio-62949cda5317360a782bb73e73c52e1295f770f3bcdc3cad056fa3ee609a5930 WatchSource:0}: Error finding container 62949cda5317360a782bb73e73c52e1295f770f3bcdc3cad056fa3ee609a5930: Status 404 returned error can't find the container with id 62949cda5317360a782bb73e73c52e1295f770f3bcdc3cad056fa3ee609a5930 Mar 20 07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.847785 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"65f4685d-7a73-4211-85e2-2bbcb7370be6","Type":"ContainerStarted","Data":"62949cda5317360a782bb73e73c52e1295f770f3bcdc3cad056fa3ee609a5930"} Mar 20 
07:14:08 crc kubenswrapper[4971]: I0320 07:14:08.864546 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:08 crc kubenswrapper[4971]: W0320 07:14:08.865475 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0faea8_1d06_4130_aaad_3ce56e8b9a69.slice/crio-e8c61ec03023816532649c1c3951e5d6dd4d65579c0fee0f7612f00020f49373 WatchSource:0}: Error finding container e8c61ec03023816532649c1c3951e5d6dd4d65579c0fee0f7612f00020f49373: Status 404 returned error can't find the container with id e8c61ec03023816532649c1c3951e5d6dd4d65579c0fee0f7612f00020f49373 Mar 20 07:14:09 crc kubenswrapper[4971]: I0320 07:14:09.876103 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b0faea8-1d06-4130-aaad-3ce56e8b9a69","Type":"ContainerStarted","Data":"81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff"} Mar 20 07:14:09 crc kubenswrapper[4971]: I0320 07:14:09.876384 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b0faea8-1d06-4130-aaad-3ce56e8b9a69","Type":"ContainerStarted","Data":"3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3"} Mar 20 07:14:09 crc kubenswrapper[4971]: I0320 07:14:09.876401 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b0faea8-1d06-4130-aaad-3ce56e8b9a69","Type":"ContainerStarted","Data":"e8c61ec03023816532649c1c3951e5d6dd4d65579c0fee0f7612f00020f49373"} Mar 20 07:14:09 crc kubenswrapper[4971]: I0320 07:14:09.885551 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"65f4685d-7a73-4211-85e2-2bbcb7370be6","Type":"ContainerStarted","Data":"90ef297c8d6d890cffeea2aece8e734d3092ce06296da538df1f4c52a196bb8f"} Mar 20 07:14:09 crc kubenswrapper[4971]: I0320 07:14:09.915479 4971 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-api-0" podStartSLOduration=2.91546321 podStartE2EDuration="2.91546321s" podCreationTimestamp="2026-03-20 07:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:09.91087015 +0000 UTC m=+1471.890744288" watchObservedRunningTime="2026-03-20 07:14:09.91546321 +0000 UTC m=+1471.895337348" Mar 20 07:14:09 crc kubenswrapper[4971]: I0320 07:14:09.962390 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.962354357 podStartE2EDuration="2.962354357s" podCreationTimestamp="2026-03-20 07:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:09.954942884 +0000 UTC m=+1471.934817022" watchObservedRunningTime="2026-03-20 07:14:09.962354357 +0000 UTC m=+1471.942228495" Mar 20 07:14:10 crc kubenswrapper[4971]: I0320 07:14:10.947281 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:14:10 crc kubenswrapper[4971]: I0320 07:14:10.947709 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="42480507-2b62-465a-9af4-07ea0ff3be81" containerName="kube-state-metrics" containerID="cri-o://156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61" gracePeriod=30 Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.452373 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.638969 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lfgd\" (UniqueName: \"kubernetes.io/projected/42480507-2b62-465a-9af4-07ea0ff3be81-kube-api-access-4lfgd\") pod \"42480507-2b62-465a-9af4-07ea0ff3be81\" (UID: \"42480507-2b62-465a-9af4-07ea0ff3be81\") " Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.649749 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42480507-2b62-465a-9af4-07ea0ff3be81-kube-api-access-4lfgd" (OuterVolumeSpecName: "kube-api-access-4lfgd") pod "42480507-2b62-465a-9af4-07ea0ff3be81" (UID: "42480507-2b62-465a-9af4-07ea0ff3be81"). InnerVolumeSpecName "kube-api-access-4lfgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.741309 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lfgd\" (UniqueName: \"kubernetes.io/projected/42480507-2b62-465a-9af4-07ea0ff3be81-kube-api-access-4lfgd\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.906281 4971 generic.go:334] "Generic (PLEG): container finished" podID="42480507-2b62-465a-9af4-07ea0ff3be81" containerID="156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61" exitCode=2 Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.906324 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"42480507-2b62-465a-9af4-07ea0ff3be81","Type":"ContainerDied","Data":"156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61"} Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.906351 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"42480507-2b62-465a-9af4-07ea0ff3be81","Type":"ContainerDied","Data":"71fd3b4acdf6be866d6c4a30022c4edf95fe5f53fd35bcd1954fe918a0609e83"} Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.906367 4971 scope.go:117] "RemoveContainer" containerID="156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.906368 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.925042 4971 scope.go:117] "RemoveContainer" containerID="156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61" Mar 20 07:14:11 crc kubenswrapper[4971]: E0320 07:14:11.925899 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61\": container with ID starting with 156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61 not found: ID does not exist" containerID="156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.925926 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61"} err="failed to get container status \"156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61\": rpc error: code = NotFound desc = could not find container \"156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61\": container with ID starting with 156030cee7f644b41b8984ee238b73db5b10fa0d94de13f13f94c03d61a72f61 not found: ID does not exist" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.960411 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.982941 4971 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.991819 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:14:11 crc kubenswrapper[4971]: E0320 07:14:11.992285 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42480507-2b62-465a-9af4-07ea0ff3be81" containerName="kube-state-metrics" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.992303 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="42480507-2b62-465a-9af4-07ea0ff3be81" containerName="kube-state-metrics" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.992463 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="42480507-2b62-465a-9af4-07ea0ff3be81" containerName="kube-state-metrics" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.993279 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.996882 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 07:14:11 crc kubenswrapper[4971]: I0320 07:14:11.997046 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.003356 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.150012 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m7n7\" (UniqueName: \"kubernetes.io/projected/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-api-access-7m7n7\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.150140 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.150256 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.150290 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.252368 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.252457 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc 
kubenswrapper[4971]: I0320 07:14:12.252549 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7n7\" (UniqueName: \"kubernetes.io/projected/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-api-access-7m7n7\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.252692 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.258741 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.261549 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.270982 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.277251 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7m7n7\" (UniqueName: \"kubernetes.io/projected/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-api-access-7m7n7\") pod \"kube-state-metrics-0\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.316395 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.741345 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42480507-2b62-465a-9af4-07ea0ff3be81" path="/var/lib/kubelet/pods/42480507-2b62-465a-9af4-07ea0ff3be81/volumes" Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.856070 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.918167 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b11d9f93-43db-45cf-9cba-75d0c9b50d55","Type":"ContainerStarted","Data":"6a57a00927cb37ccd2926fd817b96e82e4a82b2db40952f235ffa676b7b360f0"} Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.961706 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.961980 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="ceilometer-central-agent" containerID="cri-o://4a4002dc99baad38e51c0f6fb55892c305049b301226c24fbb73e80a9aeef493" gracePeriod=30 Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.962083 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="ceilometer-notification-agent" 
containerID="cri-o://f700eba655e94a222afd3f2a732900514e38a9d9878a188d6fe2fd9eddfbdef6" gracePeriod=30 Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.962075 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="sg-core" containerID="cri-o://5d4ccd50b8481a237c43d08c563ba64a032112f2167a89b5faa03f80f1ea868e" gracePeriod=30 Mar 20 07:14:12 crc kubenswrapper[4971]: I0320 07:14:12.962114 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="proxy-httpd" containerID="cri-o://ce3710cd7b793d078b3fcd6a5f4dd83bd05c10551a91c491fb6773d8d6ca25ef" gracePeriod=30 Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.264689 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.472685 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.473111 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.928986 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b11d9f93-43db-45cf-9cba-75d0c9b50d55","Type":"ContainerStarted","Data":"3e232b29d2dc5299cf551b92f2d9493a62313586ff5a609644e971c3988a31be"} Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.929058 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.932175 4971 generic.go:334] "Generic (PLEG): container finished" podID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" 
containerID="ce3710cd7b793d078b3fcd6a5f4dd83bd05c10551a91c491fb6773d8d6ca25ef" exitCode=0 Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.932219 4971 generic.go:334] "Generic (PLEG): container finished" podID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerID="5d4ccd50b8481a237c43d08c563ba64a032112f2167a89b5faa03f80f1ea868e" exitCode=2 Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.932230 4971 generic.go:334] "Generic (PLEG): container finished" podID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerID="4a4002dc99baad38e51c0f6fb55892c305049b301226c24fbb73e80a9aeef493" exitCode=0 Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.932258 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerDied","Data":"ce3710cd7b793d078b3fcd6a5f4dd83bd05c10551a91c491fb6773d8d6ca25ef"} Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.932308 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerDied","Data":"5d4ccd50b8481a237c43d08c563ba64a032112f2167a89b5faa03f80f1ea868e"} Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.932319 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerDied","Data":"4a4002dc99baad38e51c0f6fb55892c305049b301226c24fbb73e80a9aeef493"} Mar 20 07:14:13 crc kubenswrapper[4971]: I0320 07:14:13.949521 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.592121009 podStartE2EDuration="2.949500106s" podCreationTimestamp="2026-03-20 07:14:11 +0000 UTC" firstStartedPulling="2026-03-20 07:14:12.869942765 +0000 UTC m=+1474.849816913" lastFinishedPulling="2026-03-20 07:14:13.227321832 +0000 UTC m=+1475.207196010" observedRunningTime="2026-03-20 07:14:13.94345461 +0000 
UTC m=+1475.923328758" watchObservedRunningTime="2026-03-20 07:14:13.949500106 +0000 UTC m=+1475.929374254" Mar 20 07:14:14 crc kubenswrapper[4971]: I0320 07:14:14.488767 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 07:14:14 crc kubenswrapper[4971]: I0320 07:14:14.488766 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 07:14:14 crc kubenswrapper[4971]: I0320 07:14:14.947418 4971 generic.go:334] "Generic (PLEG): container finished" podID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerID="f700eba655e94a222afd3f2a732900514e38a9d9878a188d6fe2fd9eddfbdef6" exitCode=0 Mar 20 07:14:14 crc kubenswrapper[4971]: I0320 07:14:14.947546 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerDied","Data":"f700eba655e94a222afd3f2a732900514e38a9d9878a188d6fe2fd9eddfbdef6"} Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.293850 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.410741 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-log-httpd\") pod \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.410796 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-combined-ca-bundle\") pod \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.410948 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-run-httpd\") pod \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.410978 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-config-data\") pod \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.410992 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-scripts\") pod \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.411022 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ht8p\" (UniqueName: 
\"kubernetes.io/projected/66e085af-b20f-4a74-a2a2-f21a3057cb8a-kube-api-access-7ht8p\") pod \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.411048 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-sg-core-conf-yaml\") pod \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\" (UID: \"66e085af-b20f-4a74-a2a2-f21a3057cb8a\") " Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.411194 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "66e085af-b20f-4a74-a2a2-f21a3057cb8a" (UID: "66e085af-b20f-4a74-a2a2-f21a3057cb8a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.411542 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.415198 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "66e085af-b20f-4a74-a2a2-f21a3057cb8a" (UID: "66e085af-b20f-4a74-a2a2-f21a3057cb8a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.417817 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e085af-b20f-4a74-a2a2-f21a3057cb8a-kube-api-access-7ht8p" (OuterVolumeSpecName: "kube-api-access-7ht8p") pod "66e085af-b20f-4a74-a2a2-f21a3057cb8a" (UID: "66e085af-b20f-4a74-a2a2-f21a3057cb8a"). InnerVolumeSpecName "kube-api-access-7ht8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.421465 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-scripts" (OuterVolumeSpecName: "scripts") pod "66e085af-b20f-4a74-a2a2-f21a3057cb8a" (UID: "66e085af-b20f-4a74-a2a2-f21a3057cb8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.440471 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "66e085af-b20f-4a74-a2a2-f21a3057cb8a" (UID: "66e085af-b20f-4a74-a2a2-f21a3057cb8a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.498677 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66e085af-b20f-4a74-a2a2-f21a3057cb8a" (UID: "66e085af-b20f-4a74-a2a2-f21a3057cb8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.502902 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-config-data" (OuterVolumeSpecName: "config-data") pod "66e085af-b20f-4a74-a2a2-f21a3057cb8a" (UID: "66e085af-b20f-4a74-a2a2-f21a3057cb8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.513260 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66e085af-b20f-4a74-a2a2-f21a3057cb8a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.513284 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.513296 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.513306 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ht8p\" (UniqueName: \"kubernetes.io/projected/66e085af-b20f-4a74-a2a2-f21a3057cb8a-kube-api-access-7ht8p\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.513315 4971 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.513325 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/66e085af-b20f-4a74-a2a2-f21a3057cb8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.962802 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66e085af-b20f-4a74-a2a2-f21a3057cb8a","Type":"ContainerDied","Data":"7b5a63c9510ad8942dff7b5d1cf391518fd61ad00b5c4139909190b84eba1607"} Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.962891 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.963428 4971 scope.go:117] "RemoveContainer" containerID="ce3710cd7b793d078b3fcd6a5f4dd83bd05c10551a91c491fb6773d8d6ca25ef" Mar 20 07:14:15 crc kubenswrapper[4971]: I0320 07:14:15.999919 4971 scope.go:117] "RemoveContainer" containerID="5d4ccd50b8481a237c43d08c563ba64a032112f2167a89b5faa03f80f1ea868e" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.009897 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.026318 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.038577 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:16 crc kubenswrapper[4971]: E0320 07:14:16.039063 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="ceilometer-notification-agent" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.039080 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="ceilometer-notification-agent" Mar 20 07:14:16 crc kubenswrapper[4971]: E0320 07:14:16.039099 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" 
containerName="ceilometer-central-agent" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.039109 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="ceilometer-central-agent" Mar 20 07:14:16 crc kubenswrapper[4971]: E0320 07:14:16.039126 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="proxy-httpd" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.039132 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="proxy-httpd" Mar 20 07:14:16 crc kubenswrapper[4971]: E0320 07:14:16.039153 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="sg-core" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.039159 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="sg-core" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.039336 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="ceilometer-notification-agent" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.039351 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="sg-core" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.039361 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="proxy-httpd" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.039380 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" containerName="ceilometer-central-agent" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.041010 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.041907 4971 scope.go:117] "RemoveContainer" containerID="f700eba655e94a222afd3f2a732900514e38a9d9878a188d6fe2fd9eddfbdef6" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.048398 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.048407 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.048568 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.069089 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.070676 4971 scope.go:117] "RemoveContainer" containerID="4a4002dc99baad38e51c0f6fb55892c305049b301226c24fbb73e80a9aeef493" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.126065 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-log-httpd\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.126122 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.126157 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.126327 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wjjj\" (UniqueName: \"kubernetes.io/projected/a33b6ce4-cf95-445b-b26c-2446c082661b-kube-api-access-9wjjj\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.126400 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-config-data\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.126507 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-run-httpd\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.126664 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.126814 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-scripts\") pod \"ceilometer-0\" (UID: 
\"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.228068 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.228130 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.228163 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wjjj\" (UniqueName: \"kubernetes.io/projected/a33b6ce4-cf95-445b-b26c-2446c082661b-kube-api-access-9wjjj\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.228195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-config-data\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.228230 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-run-httpd\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.228273 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.228319 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-scripts\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.228397 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-log-httpd\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.229028 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-log-httpd\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.229135 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-run-httpd\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.232219 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 
07:14:16.232866 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-scripts\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.233278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-config-data\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.244396 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.246639 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.247293 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wjjj\" (UniqueName: \"kubernetes.io/projected/a33b6ce4-cf95-445b-b26c-2446c082661b-kube-api-access-9wjjj\") pod \"ceilometer-0\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.378383 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.744395 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e085af-b20f-4a74-a2a2-f21a3057cb8a" path="/var/lib/kubelet/pods/66e085af-b20f-4a74-a2a2-f21a3057cb8a/volumes" Mar 20 07:14:16 crc kubenswrapper[4971]: W0320 07:14:16.880907 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda33b6ce4_cf95_445b_b26c_2446c082661b.slice/crio-87d118f9184795c2be7f0856b658532a8eba3087a15158f36bcc733cf7119107 WatchSource:0}: Error finding container 87d118f9184795c2be7f0856b658532a8eba3087a15158f36bcc733cf7119107: Status 404 returned error can't find the container with id 87d118f9184795c2be7f0856b658532a8eba3087a15158f36bcc733cf7119107 Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.883992 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:16 crc kubenswrapper[4971]: I0320 07:14:16.977300 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerStarted","Data":"87d118f9184795c2be7f0856b658532a8eba3087a15158f36bcc733cf7119107"} Mar 20 07:14:17 crc kubenswrapper[4971]: I0320 07:14:17.988890 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerStarted","Data":"87bf779c50e91049f67f783e5685c4d83ca2286f5c20d7a7a08fb3fe90170211"} Mar 20 07:14:18 crc kubenswrapper[4971]: I0320 07:14:18.265039 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 07:14:18 crc kubenswrapper[4971]: I0320 07:14:18.292756 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 07:14:18 crc kubenswrapper[4971]: I0320 07:14:18.324870 
4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 07:14:18 crc kubenswrapper[4971]: I0320 07:14:18.324956 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 07:14:19 crc kubenswrapper[4971]: I0320 07:14:19.001271 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerStarted","Data":"2f9c77390592f570e73b6c5b8efd28d24f870f3f082a0fa40d164395941803fb"} Mar 20 07:14:19 crc kubenswrapper[4971]: I0320 07:14:19.035441 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 07:14:19 crc kubenswrapper[4971]: I0320 07:14:19.408757 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 07:14:19 crc kubenswrapper[4971]: I0320 07:14:19.408995 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 07:14:20 crc kubenswrapper[4971]: I0320 07:14:20.012614 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerStarted","Data":"c461729e0434f2d9d975a36a275935fcbf203fa6e029a31db7f37345e540cd8e"} Mar 20 07:14:21 crc kubenswrapper[4971]: I0320 07:14:21.472196 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 07:14:21 crc kubenswrapper[4971]: I0320 07:14:21.472582 4971 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 07:14:22 crc kubenswrapper[4971]: I0320 07:14:22.330998 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 07:14:23 crc kubenswrapper[4971]: I0320 07:14:23.039547 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerStarted","Data":"a91a24059971fcad7375ecfa127faf237ed0e6ff5f366650a1a8fd1aeb152969"} Mar 20 07:14:23 crc kubenswrapper[4971]: I0320 07:14:23.039977 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 07:14:23 crc kubenswrapper[4971]: I0320 07:14:23.075259 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6406651540000001 podStartE2EDuration="7.075233083s" podCreationTimestamp="2026-03-20 07:14:16 +0000 UTC" firstStartedPulling="2026-03-20 07:14:16.883355617 +0000 UTC m=+1478.863229765" lastFinishedPulling="2026-03-20 07:14:22.317923546 +0000 UTC m=+1484.297797694" observedRunningTime="2026-03-20 07:14:23.065746677 +0000 UTC m=+1485.045620825" watchObservedRunningTime="2026-03-20 07:14:23.075233083 +0000 UTC m=+1485.055107231" Mar 20 07:14:23 crc kubenswrapper[4971]: I0320 07:14:23.484049 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 07:14:23 crc kubenswrapper[4971]: I0320 07:14:23.496051 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 07:14:23 crc kubenswrapper[4971]: I0320 07:14:23.496723 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 07:14:24 crc kubenswrapper[4971]: I0320 07:14:24.060193 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 20 07:14:26 crc kubenswrapper[4971]: I0320 07:14:26.325041 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 07:14:26 crc kubenswrapper[4971]: I0320 07:14:26.325635 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.083647 4971 generic.go:334] "Generic (PLEG): container finished" podID="303bf07b-b266-451f-9ba4-9122c1d80109" containerID="6196fd736045852e54d49d69c1722264de6f657acdbf76992e58582e67c832e2" exitCode=137 Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.083684 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"303bf07b-b266-451f-9ba4-9122c1d80109","Type":"ContainerDied","Data":"6196fd736045852e54d49d69c1722264de6f657acdbf76992e58582e67c832e2"} Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.084051 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"303bf07b-b266-451f-9ba4-9122c1d80109","Type":"ContainerDied","Data":"c9374e49741939d48446d52ac5d009c4a428a29a248a2d9d1b52e446c8b2614f"} Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.084072 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9374e49741939d48446d52ac5d009c4a428a29a248a2d9d1b52e446c8b2614f" Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.125107 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.138717 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-combined-ca-bundle\") pod \"303bf07b-b266-451f-9ba4-9122c1d80109\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.138874 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pjff\" (UniqueName: \"kubernetes.io/projected/303bf07b-b266-451f-9ba4-9122c1d80109-kube-api-access-4pjff\") pod \"303bf07b-b266-451f-9ba4-9122c1d80109\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.138998 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-config-data\") pod \"303bf07b-b266-451f-9ba4-9122c1d80109\" (UID: \"303bf07b-b266-451f-9ba4-9122c1d80109\") " Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.146279 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303bf07b-b266-451f-9ba4-9122c1d80109-kube-api-access-4pjff" (OuterVolumeSpecName: "kube-api-access-4pjff") pod "303bf07b-b266-451f-9ba4-9122c1d80109" (UID: "303bf07b-b266-451f-9ba4-9122c1d80109"). InnerVolumeSpecName "kube-api-access-4pjff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.186627 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-config-data" (OuterVolumeSpecName: "config-data") pod "303bf07b-b266-451f-9ba4-9122c1d80109" (UID: "303bf07b-b266-451f-9ba4-9122c1d80109"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.200925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "303bf07b-b266-451f-9ba4-9122c1d80109" (UID: "303bf07b-b266-451f-9ba4-9122c1d80109"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.245012 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pjff\" (UniqueName: \"kubernetes.io/projected/303bf07b-b266-451f-9ba4-9122c1d80109-kube-api-access-4pjff\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.245057 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:27 crc kubenswrapper[4971]: I0320 07:14:27.245075 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303bf07b-b266-451f-9ba4-9122c1d80109-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.097491 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.144381 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.177869 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.204016 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:14:28 crc kubenswrapper[4971]: E0320 07:14:28.204820 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303bf07b-b266-451f-9ba4-9122c1d80109" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.204856 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="303bf07b-b266-451f-9ba4-9122c1d80109" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.205223 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="303bf07b-b266-451f-9ba4-9122c1d80109" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.206472 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.210182 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.210385 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.210567 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.220392 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.328932 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.330537 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.335977 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.365274 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.365355 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7h5z\" (UniqueName: \"kubernetes.io/projected/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-kube-api-access-l7h5z\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.366189 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.366394 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.366527 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.468470 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7h5z\" (UniqueName: \"kubernetes.io/projected/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-kube-api-access-l7h5z\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.468554 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.468684 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.468750 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.468817 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.475102 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.476777 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 
07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.477864 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.483530 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.498100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7h5z\" (UniqueName: \"kubernetes.io/projected/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-kube-api-access-l7h5z\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.531799 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:28 crc kubenswrapper[4971]: I0320 07:14:28.758729 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303bf07b-b266-451f-9ba4-9122c1d80109" path="/var/lib/kubelet/pods/303bf07b-b266-451f-9ba4-9122c1d80109/volumes" Mar 20 07:14:29 crc kubenswrapper[4971]: W0320 07:14:29.019239 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b4a1357_9313_4ebb_bb14_2aaf2785e17d.slice/crio-56e147ccdd24b108236de6f3992c7d0b4c6a7f524b5f17d126edce8490e0c2ce WatchSource:0}: Error finding container 56e147ccdd24b108236de6f3992c7d0b4c6a7f524b5f17d126edce8490e0c2ce: Status 404 returned error can't find the container with id 56e147ccdd24b108236de6f3992c7d0b4c6a7f524b5f17d126edce8490e0c2ce Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.022133 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.112220 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b4a1357-9313-4ebb-bb14-2aaf2785e17d","Type":"ContainerStarted","Data":"56e147ccdd24b108236de6f3992c7d0b4c6a7f524b5f17d126edce8490e0c2ce"} Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.117251 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.339124 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-msxs5"] Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.346860 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.400515 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-msxs5"] Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.503581 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2zp\" (UniqueName: \"kubernetes.io/projected/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-kube-api-access-ws2zp\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.503719 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.503794 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.503846 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-config\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.503882 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.503974 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.605780 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2zp\" (UniqueName: \"kubernetes.io/projected/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-kube-api-access-ws2zp\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.605846 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.605896 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.605939 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-config\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.605961 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.606020 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.607593 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.608068 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.608357 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-config\") pod 
\"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.609026 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.609535 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.624907 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws2zp\" (UniqueName: \"kubernetes.io/projected/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-kube-api-access-ws2zp\") pod \"dnsmasq-dns-fdb8f6449-msxs5\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:29 crc kubenswrapper[4971]: I0320 07:14:29.702316 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:30 crc kubenswrapper[4971]: I0320 07:14:30.120296 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b4a1357-9313-4ebb-bb14-2aaf2785e17d","Type":"ContainerStarted","Data":"a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250"} Mar 20 07:14:30 crc kubenswrapper[4971]: I0320 07:14:30.137865 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.137847019 podStartE2EDuration="2.137847019s" podCreationTimestamp="2026-03-20 07:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:30.136217887 +0000 UTC m=+1492.116092035" watchObservedRunningTime="2026-03-20 07:14:30.137847019 +0000 UTC m=+1492.117721157" Mar 20 07:14:30 crc kubenswrapper[4971]: I0320 07:14:30.159809 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-msxs5"] Mar 20 07:14:31 crc kubenswrapper[4971]: I0320 07:14:31.129594 4971 generic.go:334] "Generic (PLEG): container finished" podID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" containerID="4f2a5fe64a55f2870001f34a68956aa3f071975c91e61557fe4c9df8fb524220" exitCode=0 Mar 20 07:14:31 crc kubenswrapper[4971]: I0320 07:14:31.129928 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" event={"ID":"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67","Type":"ContainerDied","Data":"4f2a5fe64a55f2870001f34a68956aa3f071975c91e61557fe4c9df8fb524220"} Mar 20 07:14:31 crc kubenswrapper[4971]: I0320 07:14:31.129973 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" event={"ID":"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67","Type":"ContainerStarted","Data":"ef210e03fb6f197477310288719c131cb6f093ba40f8bc1095118cc1597a4905"} Mar 20 
07:14:31 crc kubenswrapper[4971]: I0320 07:14:31.640303 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:31 crc kubenswrapper[4971]: I0320 07:14:31.641006 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="ceilometer-central-agent" containerID="cri-o://87bf779c50e91049f67f783e5685c4d83ca2286f5c20d7a7a08fb3fe90170211" gracePeriod=30 Mar 20 07:14:31 crc kubenswrapper[4971]: I0320 07:14:31.641120 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="proxy-httpd" containerID="cri-o://a91a24059971fcad7375ecfa127faf237ed0e6ff5f366650a1a8fd1aeb152969" gracePeriod=30 Mar 20 07:14:31 crc kubenswrapper[4971]: I0320 07:14:31.641113 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="sg-core" containerID="cri-o://c461729e0434f2d9d975a36a275935fcbf203fa6e029a31db7f37345e540cd8e" gracePeriod=30 Mar 20 07:14:31 crc kubenswrapper[4971]: I0320 07:14:31.641129 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="ceilometer-notification-agent" containerID="cri-o://2f9c77390592f570e73b6c5b8efd28d24f870f3f082a0fa40d164395941803fb" gracePeriod=30 Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.152193 4971 generic.go:334] "Generic (PLEG): container finished" podID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerID="a91a24059971fcad7375ecfa127faf237ed0e6ff5f366650a1a8fd1aeb152969" exitCode=0 Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.152645 4971 generic.go:334] "Generic (PLEG): container finished" podID="a33b6ce4-cf95-445b-b26c-2446c082661b" 
containerID="c461729e0434f2d9d975a36a275935fcbf203fa6e029a31db7f37345e540cd8e" exitCode=2 Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.152656 4971 generic.go:334] "Generic (PLEG): container finished" podID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerID="87bf779c50e91049f67f783e5685c4d83ca2286f5c20d7a7a08fb3fe90170211" exitCode=0 Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.152718 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerDied","Data":"a91a24059971fcad7375ecfa127faf237ed0e6ff5f366650a1a8fd1aeb152969"} Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.152753 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerDied","Data":"c461729e0434f2d9d975a36a275935fcbf203fa6e029a31db7f37345e540cd8e"} Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.152767 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerDied","Data":"87bf779c50e91049f67f783e5685c4d83ca2286f5c20d7a7a08fb3fe90170211"} Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.156756 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" event={"ID":"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67","Type":"ContainerStarted","Data":"3a2e05f37831e00f2f4d153e39ac7c8308e64ddecc8706e5d6c45b52312a4203"} Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.157897 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.181019 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" podStartSLOduration=3.18100169 podStartE2EDuration="3.18100169s" 
podCreationTimestamp="2026-03-20 07:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:32.175244321 +0000 UTC m=+1494.155118459" watchObservedRunningTime="2026-03-20 07:14:32.18100169 +0000 UTC m=+1494.160875828" Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.279049 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.279274 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-log" containerID="cri-o://3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3" gracePeriod=30 Mar 20 07:14:32 crc kubenswrapper[4971]: I0320 07:14:32.279790 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-api" containerID="cri-o://81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff" gracePeriod=30 Mar 20 07:14:32 crc kubenswrapper[4971]: E0320 07:14:32.615316 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0faea8_1d06_4130_aaad_3ce56e8b9a69.slice/crio-3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3.scope\": RecentStats: unable to find data in memory cache]" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.174886 4971 generic.go:334] "Generic (PLEG): container finished" podID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerID="3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3" exitCode=143 Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.174973 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5b0faea8-1d06-4130-aaad-3ce56e8b9a69","Type":"ContainerDied","Data":"3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3"} Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.179323 4971 generic.go:334] "Generic (PLEG): container finished" podID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerID="2f9c77390592f570e73b6c5b8efd28d24f870f3f082a0fa40d164395941803fb" exitCode=0 Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.180347 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerDied","Data":"2f9c77390592f570e73b6c5b8efd28d24f870f3f082a0fa40d164395941803fb"} Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.338144 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.402436 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wjjj\" (UniqueName: \"kubernetes.io/projected/a33b6ce4-cf95-445b-b26c-2446c082661b-kube-api-access-9wjjj\") pod \"a33b6ce4-cf95-445b-b26c-2446c082661b\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.402848 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-ceilometer-tls-certs\") pod \"a33b6ce4-cf95-445b-b26c-2446c082661b\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.402893 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-scripts\") pod \"a33b6ce4-cf95-445b-b26c-2446c082661b\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 
07:14:33.402920 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-sg-core-conf-yaml\") pod \"a33b6ce4-cf95-445b-b26c-2446c082661b\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.402996 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-log-httpd\") pod \"a33b6ce4-cf95-445b-b26c-2446c082661b\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.403018 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-config-data\") pod \"a33b6ce4-cf95-445b-b26c-2446c082661b\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.403046 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-combined-ca-bundle\") pod \"a33b6ce4-cf95-445b-b26c-2446c082661b\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.403098 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-run-httpd\") pod \"a33b6ce4-cf95-445b-b26c-2446c082661b\" (UID: \"a33b6ce4-cf95-445b-b26c-2446c082661b\") " Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.403858 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a33b6ce4-cf95-445b-b26c-2446c082661b" (UID: 
"a33b6ce4-cf95-445b-b26c-2446c082661b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.404112 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a33b6ce4-cf95-445b-b26c-2446c082661b" (UID: "a33b6ce4-cf95-445b-b26c-2446c082661b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.409050 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33b6ce4-cf95-445b-b26c-2446c082661b-kube-api-access-9wjjj" (OuterVolumeSpecName: "kube-api-access-9wjjj") pod "a33b6ce4-cf95-445b-b26c-2446c082661b" (UID: "a33b6ce4-cf95-445b-b26c-2446c082661b"). InnerVolumeSpecName "kube-api-access-9wjjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.414070 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-scripts" (OuterVolumeSpecName: "scripts") pod "a33b6ce4-cf95-445b-b26c-2446c082661b" (UID: "a33b6ce4-cf95-445b-b26c-2446c082661b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.438550 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a33b6ce4-cf95-445b-b26c-2446c082661b" (UID: "a33b6ce4-cf95-445b-b26c-2446c082661b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.486338 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a33b6ce4-cf95-445b-b26c-2446c082661b" (UID: "a33b6ce4-cf95-445b-b26c-2446c082661b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.489802 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a33b6ce4-cf95-445b-b26c-2446c082661b" (UID: "a33b6ce4-cf95-445b-b26c-2446c082661b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.504323 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wjjj\" (UniqueName: \"kubernetes.io/projected/a33b6ce4-cf95-445b-b26c-2446c082661b-kube-api-access-9wjjj\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.504819 4971 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.504917 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.504997 4971 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.505070 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.505144 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.505270 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a33b6ce4-cf95-445b-b26c-2446c082661b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.528448 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-config-data" (OuterVolumeSpecName: "config-data") pod "a33b6ce4-cf95-445b-b26c-2446c082661b" (UID: "a33b6ce4-cf95-445b-b26c-2446c082661b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.532185 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:33 crc kubenswrapper[4971]: I0320 07:14:33.607436 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33b6ce4-cf95-445b-b26c-2446c082661b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.192900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a33b6ce4-cf95-445b-b26c-2446c082661b","Type":"ContainerDied","Data":"87d118f9184795c2be7f0856b658532a8eba3087a15158f36bcc733cf7119107"} Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.192961 4971 scope.go:117] "RemoveContainer" containerID="a91a24059971fcad7375ecfa127faf237ed0e6ff5f366650a1a8fd1aeb152969" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.192963 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.210761 4971 scope.go:117] "RemoveContainer" containerID="c461729e0434f2d9d975a36a275935fcbf203fa6e029a31db7f37345e540cd8e" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.235584 4971 scope.go:117] "RemoveContainer" containerID="2f9c77390592f570e73b6c5b8efd28d24f870f3f082a0fa40d164395941803fb" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.256815 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.270185 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.280948 4971 scope.go:117] "RemoveContainer" containerID="87bf779c50e91049f67f783e5685c4d83ca2286f5c20d7a7a08fb3fe90170211" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.281713 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:34 crc kubenswrapper[4971]: E0320 07:14:34.282545 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="proxy-httpd" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.282571 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="proxy-httpd" Mar 20 07:14:34 crc kubenswrapper[4971]: E0320 07:14:34.282594 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="ceilometer-notification-agent" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.282621 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="ceilometer-notification-agent" Mar 20 07:14:34 crc kubenswrapper[4971]: E0320 07:14:34.282634 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="sg-core" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.282653 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="sg-core" Mar 20 07:14:34 crc kubenswrapper[4971]: E0320 07:14:34.282672 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="ceilometer-central-agent" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.282680 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="ceilometer-central-agent" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.282902 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="ceilometer-notification-agent" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.282927 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="proxy-httpd" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.282941 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="ceilometer-central-agent" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.282957 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" containerName="sg-core" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.284902 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.287911 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.293298 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.293641 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.302021 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.420967 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggpjs\" (UniqueName: \"kubernetes.io/projected/1f2c826d-a350-4f4e-8204-ece96439fd0f-kube-api-access-ggpjs\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.421267 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.421328 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-run-httpd\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.421443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-config-data\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.421503 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-scripts\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.421590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-log-httpd\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.421783 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.421873 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.523322 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-scripts\") pod \"ceilometer-0\" (UID: 
\"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.523363 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-log-httpd\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.523408 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.523442 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.523465 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggpjs\" (UniqueName: \"kubernetes.io/projected/1f2c826d-a350-4f4e-8204-ece96439fd0f-kube-api-access-ggpjs\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.523496 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.523533 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-run-httpd\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.523584 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-config-data\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.526703 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-log-httpd\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.527039 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-run-httpd\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.530155 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.530313 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-config-data\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc 
kubenswrapper[4971]: I0320 07:14:34.534477 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.539345 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.540368 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-scripts\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.562033 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggpjs\" (UniqueName: \"kubernetes.io/projected/1f2c826d-a350-4f4e-8204-ece96439fd0f-kube-api-access-ggpjs\") pod \"ceilometer-0\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.618356 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:14:34 crc kubenswrapper[4971]: I0320 07:14:34.759155 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33b6ce4-cf95-445b-b26c-2446c082661b" path="/var/lib/kubelet/pods/a33b6ce4-cf95-445b-b26c-2446c082661b/volumes" Mar 20 07:14:35 crc kubenswrapper[4971]: I0320 07:14:35.161423 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:14:35 crc kubenswrapper[4971]: I0320 07:14:35.205422 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerStarted","Data":"992034b7645ab84c244a298116bc0d2391736c7b7dc3923d33394dd1c8a6bd68"} Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.008774 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.179087 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-logs\") pod \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.179249 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkp8r\" (UniqueName: \"kubernetes.io/projected/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-kube-api-access-hkp8r\") pod \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.179383 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-config-data\") pod \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " Mar 20 07:14:36 crc 
kubenswrapper[4971]: I0320 07:14:36.179450 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-combined-ca-bundle\") pod \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\" (UID: \"5b0faea8-1d06-4130-aaad-3ce56e8b9a69\") " Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.179816 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-logs" (OuterVolumeSpecName: "logs") pod "5b0faea8-1d06-4130-aaad-3ce56e8b9a69" (UID: "5b0faea8-1d06-4130-aaad-3ce56e8b9a69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.180315 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.184765 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-kube-api-access-hkp8r" (OuterVolumeSpecName: "kube-api-access-hkp8r") pod "5b0faea8-1d06-4130-aaad-3ce56e8b9a69" (UID: "5b0faea8-1d06-4130-aaad-3ce56e8b9a69"). InnerVolumeSpecName "kube-api-access-hkp8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.211773 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-config-data" (OuterVolumeSpecName: "config-data") pod "5b0faea8-1d06-4130-aaad-3ce56e8b9a69" (UID: "5b0faea8-1d06-4130-aaad-3ce56e8b9a69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.222293 4971 generic.go:334] "Generic (PLEG): container finished" podID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerID="81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff" exitCode=0 Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.222374 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b0faea8-1d06-4130-aaad-3ce56e8b9a69","Type":"ContainerDied","Data":"81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff"} Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.222412 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5b0faea8-1d06-4130-aaad-3ce56e8b9a69","Type":"ContainerDied","Data":"e8c61ec03023816532649c1c3951e5d6dd4d65579c0fee0f7612f00020f49373"} Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.222435 4971 scope.go:117] "RemoveContainer" containerID="81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.222622 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.232392 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerStarted","Data":"a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f"} Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.234265 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b0faea8-1d06-4130-aaad-3ce56e8b9a69" (UID: "5b0faea8-1d06-4130-aaad-3ce56e8b9a69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.253362 4971 scope.go:117] "RemoveContainer" containerID="3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.282417 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkp8r\" (UniqueName: \"kubernetes.io/projected/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-kube-api-access-hkp8r\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.282453 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.282465 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0faea8-1d06-4130-aaad-3ce56e8b9a69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.284298 4971 scope.go:117] "RemoveContainer" containerID="81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff" Mar 20 07:14:36 crc kubenswrapper[4971]: E0320 07:14:36.285168 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff\": container with ID starting with 81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff not found: ID does not exist" containerID="81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.285212 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff"} err="failed to get container status 
\"81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff\": rpc error: code = NotFound desc = could not find container \"81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff\": container with ID starting with 81256547309da7ad97c533caf37fd1dae06d248fadb2b84ef1fc7bf4b62493ff not found: ID does not exist" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.285242 4971 scope.go:117] "RemoveContainer" containerID="3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3" Mar 20 07:14:36 crc kubenswrapper[4971]: E0320 07:14:36.285809 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3\": container with ID starting with 3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3 not found: ID does not exist" containerID="3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.285842 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3"} err="failed to get container status \"3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3\": rpc error: code = NotFound desc = could not find container \"3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3\": container with ID starting with 3ee7a7dea82ef43279997d2d4e22fe388996415cb4326a588f530459fa93f9a3 not found: ID does not exist" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.558496 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.571508 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.586229 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] 
Mar 20 07:14:36 crc kubenswrapper[4971]: E0320 07:14:36.586802 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-log" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.586825 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-log" Mar 20 07:14:36 crc kubenswrapper[4971]: E0320 07:14:36.586856 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-api" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.586864 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-api" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.587031 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-api" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.587045 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" containerName="nova-api-log" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.588226 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.591032 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.591317 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.591829 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.611074 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.690448 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-config-data\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.690802 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-internal-tls-certs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.690960 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.690998 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pd2rf\" (UniqueName: \"kubernetes.io/projected/555d1d8c-972b-490a-b996-fa6586964070-kube-api-access-pd2rf\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.691015 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-public-tls-certs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.691051 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555d1d8c-972b-490a-b996-fa6586964070-logs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.743150 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0faea8-1d06-4130-aaad-3ce56e8b9a69" path="/var/lib/kubelet/pods/5b0faea8-1d06-4130-aaad-3ce56e8b9a69/volumes" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.793175 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-internal-tls-certs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.793345 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.793379 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd2rf\" (UniqueName: \"kubernetes.io/projected/555d1d8c-972b-490a-b996-fa6586964070-kube-api-access-pd2rf\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.793401 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-public-tls-certs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.793437 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555d1d8c-972b-490a-b996-fa6586964070-logs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.793513 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-config-data\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.794318 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555d1d8c-972b-490a-b996-fa6586964070-logs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.799238 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-public-tls-certs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " 
pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.799278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.799509 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-internal-tls-certs\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.801056 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-config-data\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.813539 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd2rf\" (UniqueName: \"kubernetes.io/projected/555d1d8c-972b-490a-b996-fa6586964070-kube-api-access-pd2rf\") pod \"nova-api-0\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " pod="openstack/nova-api-0" Mar 20 07:14:36 crc kubenswrapper[4971]: I0320 07:14:36.957967 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:37 crc kubenswrapper[4971]: I0320 07:14:37.259047 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerStarted","Data":"45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba"} Mar 20 07:14:37 crc kubenswrapper[4971]: I0320 07:14:37.492668 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:38 crc kubenswrapper[4971]: I0320 07:14:38.272645 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerStarted","Data":"267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4"} Mar 20 07:14:38 crc kubenswrapper[4971]: I0320 07:14:38.276339 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555d1d8c-972b-490a-b996-fa6586964070","Type":"ContainerStarted","Data":"370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca"} Mar 20 07:14:38 crc kubenswrapper[4971]: I0320 07:14:38.276400 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555d1d8c-972b-490a-b996-fa6586964070","Type":"ContainerStarted","Data":"e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6"} Mar 20 07:14:38 crc kubenswrapper[4971]: I0320 07:14:38.276411 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555d1d8c-972b-490a-b996-fa6586964070","Type":"ContainerStarted","Data":"f981cc2f456aa4bb0e97573f0be0e8b906fbb033fe995297227c71b68b8d8dc3"} Mar 20 07:14:38 crc kubenswrapper[4971]: I0320 07:14:38.302657 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.302634922 podStartE2EDuration="2.302634922s" podCreationTimestamp="2026-03-20 07:14:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:38.297123539 +0000 UTC m=+1500.276997687" watchObservedRunningTime="2026-03-20 07:14:38.302634922 +0000 UTC m=+1500.282509080" Mar 20 07:14:38 crc kubenswrapper[4971]: I0320 07:14:38.532158 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:38 crc kubenswrapper[4971]: I0320 07:14:38.551830 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.305095 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.483024 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-z7rjt"] Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.485736 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.489531 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.489539 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.491565 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z7rjt"] Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.666968 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.667385 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-config-data\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.667518 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-scripts\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.667557 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6tt5\" (UniqueName: 
\"kubernetes.io/projected/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-kube-api-access-g6tt5\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.704936 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.769757 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-config-data\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.769836 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-scripts\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.769864 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6tt5\" (UniqueName: \"kubernetes.io/projected/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-kube-api-access-g6tt5\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.769968 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc 
kubenswrapper[4971]: I0320 07:14:39.774427 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.779415 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-config-data\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.787554 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-scripts\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.788328 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-xtj6r"] Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.789370 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" podUID="6c960029-95ef-4f0f-a277-9fb4a8e13198" containerName="dnsmasq-dns" containerID="cri-o://4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345" gracePeriod=10 Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.814860 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6tt5\" (UniqueName: \"kubernetes.io/projected/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-kube-api-access-g6tt5\") pod \"nova-cell1-cell-mapping-z7rjt\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " 
pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:39 crc kubenswrapper[4971]: I0320 07:14:39.837390 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.287230 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.300085 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z7rjt"] Mar 20 07:14:40 crc kubenswrapper[4971]: W0320 07:14:40.300557 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda73473dc_dcc6_4c79_b04a_f6b3a7a3b7e2.slice/crio-b3592e210b2a88d39f066bb6eb15d62b536173f16816c47a0143478da10c5771 WatchSource:0}: Error finding container b3592e210b2a88d39f066bb6eb15d62b536173f16816c47a0143478da10c5771: Status 404 returned error can't find the container with id b3592e210b2a88d39f066bb6eb15d62b536173f16816c47a0143478da10c5771 Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.302213 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerStarted","Data":"ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74"} Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.303346 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.313750 4971 generic.go:334] "Generic (PLEG): container finished" podID="6c960029-95ef-4f0f-a277-9fb4a8e13198" containerID="4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345" exitCode=0 Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.313852 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-69b4446475-xtj6r" event={"ID":"6c960029-95ef-4f0f-a277-9fb4a8e13198","Type":"ContainerDied","Data":"4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345"} Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.313864 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.313911 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-xtj6r" event={"ID":"6c960029-95ef-4f0f-a277-9fb4a8e13198","Type":"ContainerDied","Data":"71c21da65e6d8f5a8af0ef2989614fc4aee55ce536112d2c72725ba144509120"} Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.313938 4971 scope.go:117] "RemoveContainer" containerID="4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.353459 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.17966535 podStartE2EDuration="6.353439533s" podCreationTimestamp="2026-03-20 07:14:34 +0000 UTC" firstStartedPulling="2026-03-20 07:14:35.161668086 +0000 UTC m=+1497.141542264" lastFinishedPulling="2026-03-20 07:14:39.335442309 +0000 UTC m=+1501.315316447" observedRunningTime="2026-03-20 07:14:40.340731383 +0000 UTC m=+1502.320605521" watchObservedRunningTime="2026-03-20 07:14:40.353439533 +0000 UTC m=+1502.333313671" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.358534 4971 scope.go:117] "RemoveContainer" containerID="7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.391233 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-nb\") pod \"6c960029-95ef-4f0f-a277-9fb4a8e13198\" (UID: 
\"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.391289 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-config\") pod \"6c960029-95ef-4f0f-a277-9fb4a8e13198\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.391470 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-sb\") pod \"6c960029-95ef-4f0f-a277-9fb4a8e13198\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.391542 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-swift-storage-0\") pod \"6c960029-95ef-4f0f-a277-9fb4a8e13198\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.391579 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wqtk\" (UniqueName: \"kubernetes.io/projected/6c960029-95ef-4f0f-a277-9fb4a8e13198-kube-api-access-2wqtk\") pod \"6c960029-95ef-4f0f-a277-9fb4a8e13198\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.391647 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-svc\") pod \"6c960029-95ef-4f0f-a277-9fb4a8e13198\" (UID: \"6c960029-95ef-4f0f-a277-9fb4a8e13198\") " Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.416789 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6c960029-95ef-4f0f-a277-9fb4a8e13198-kube-api-access-2wqtk" (OuterVolumeSpecName: "kube-api-access-2wqtk") pod "6c960029-95ef-4f0f-a277-9fb4a8e13198" (UID: "6c960029-95ef-4f0f-a277-9fb4a8e13198"). InnerVolumeSpecName "kube-api-access-2wqtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.450124 4971 scope.go:117] "RemoveContainer" containerID="4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345" Mar 20 07:14:40 crc kubenswrapper[4971]: E0320 07:14:40.453199 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345\": container with ID starting with 4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345 not found: ID does not exist" containerID="4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.453265 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345"} err="failed to get container status \"4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345\": rpc error: code = NotFound desc = could not find container \"4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345\": container with ID starting with 4f08043439bdae954fd302d7b812af50a1854d47487d2c8ac8c5aff2c1ad7345 not found: ID does not exist" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.453309 4971 scope.go:117] "RemoveContainer" containerID="7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0" Mar 20 07:14:40 crc kubenswrapper[4971]: E0320 07:14:40.454676 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0\": 
container with ID starting with 7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0 not found: ID does not exist" containerID="7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.454704 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0"} err="failed to get container status \"7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0\": rpc error: code = NotFound desc = could not find container \"7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0\": container with ID starting with 7c0a8ad9c3b445ea4e98dfbc74870cfe993a54f5997889b39998bf8e54eb8bf0 not found: ID does not exist" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.470960 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c960029-95ef-4f0f-a277-9fb4a8e13198" (UID: "6c960029-95ef-4f0f-a277-9fb4a8e13198"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.483020 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c960029-95ef-4f0f-a277-9fb4a8e13198" (UID: "6c960029-95ef-4f0f-a277-9fb4a8e13198"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.489200 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-config" (OuterVolumeSpecName: "config") pod "6c960029-95ef-4f0f-a277-9fb4a8e13198" (UID: "6c960029-95ef-4f0f-a277-9fb4a8e13198"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.490990 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c960029-95ef-4f0f-a277-9fb4a8e13198" (UID: "6c960029-95ef-4f0f-a277-9fb4a8e13198"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.495259 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wqtk\" (UniqueName: \"kubernetes.io/projected/6c960029-95ef-4f0f-a277-9fb4a8e13198-kube-api-access-2wqtk\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.495359 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.495431 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.495489 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.495547 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.514710 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c960029-95ef-4f0f-a277-9fb4a8e13198" (UID: "6c960029-95ef-4f0f-a277-9fb4a8e13198"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.597191 4971 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c960029-95ef-4f0f-a277-9fb4a8e13198-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.650628 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-xtj6r"] Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.661387 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-xtj6r"] Mar 20 07:14:40 crc kubenswrapper[4971]: I0320 07:14:40.744504 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c960029-95ef-4f0f-a277-9fb4a8e13198" path="/var/lib/kubelet/pods/6c960029-95ef-4f0f-a277-9fb4a8e13198/volumes" Mar 20 07:14:41 crc kubenswrapper[4971]: I0320 07:14:41.324808 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z7rjt" event={"ID":"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2","Type":"ContainerStarted","Data":"20bb8dee4f6eeefb1b1658b0eb47261977255f1a69893b7d980be7dacb8c4402"} Mar 20 07:14:41 crc kubenswrapper[4971]: I0320 07:14:41.324859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z7rjt" event={"ID":"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2","Type":"ContainerStarted","Data":"b3592e210b2a88d39f066bb6eb15d62b536173f16816c47a0143478da10c5771"} Mar 20 07:14:41 crc kubenswrapper[4971]: I0320 07:14:41.346991 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-z7rjt" 
podStartSLOduration=2.34696629 podStartE2EDuration="2.34696629s" podCreationTimestamp="2026-03-20 07:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:41.344715262 +0000 UTC m=+1503.324589400" watchObservedRunningTime="2026-03-20 07:14:41.34696629 +0000 UTC m=+1503.326840438" Mar 20 07:14:45 crc kubenswrapper[4971]: I0320 07:14:45.386779 4971 generic.go:334] "Generic (PLEG): container finished" podID="a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" containerID="20bb8dee4f6eeefb1b1658b0eb47261977255f1a69893b7d980be7dacb8c4402" exitCode=0 Mar 20 07:14:45 crc kubenswrapper[4971]: I0320 07:14:45.386896 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z7rjt" event={"ID":"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2","Type":"ContainerDied","Data":"20bb8dee4f6eeefb1b1658b0eb47261977255f1a69893b7d980be7dacb8c4402"} Mar 20 07:14:46 crc kubenswrapper[4971]: I0320 07:14:46.913147 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:46 crc kubenswrapper[4971]: I0320 07:14:46.959035 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 07:14:46 crc kubenswrapper[4971]: I0320 07:14:46.959132 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.032078 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-combined-ca-bundle\") pod \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.032180 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-scripts\") pod \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.032383 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6tt5\" (UniqueName: \"kubernetes.io/projected/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-kube-api-access-g6tt5\") pod \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.032452 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-config-data\") pod \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\" (UID: \"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2\") " Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.040743 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-scripts" (OuterVolumeSpecName: "scripts") pod "a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" (UID: "a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.040905 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-kube-api-access-g6tt5" (OuterVolumeSpecName: "kube-api-access-g6tt5") pod "a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" (UID: "a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2"). InnerVolumeSpecName "kube-api-access-g6tt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.066932 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" (UID: "a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.087106 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-config-data" (OuterVolumeSpecName: "config-data") pod "a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" (UID: "a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.135962 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6tt5\" (UniqueName: \"kubernetes.io/projected/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-kube-api-access-g6tt5\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.136112 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.136173 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.136266 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.413748 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z7rjt" event={"ID":"a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2","Type":"ContainerDied","Data":"b3592e210b2a88d39f066bb6eb15d62b536173f16816c47a0143478da10c5771"} Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.413808 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3592e210b2a88d39f066bb6eb15d62b536173f16816c47a0143478da10c5771" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.413888 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z7rjt" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.624960 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.625184 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-log" containerID="cri-o://e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6" gracePeriod=30 Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.625650 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-api" containerID="cri-o://370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca" gracePeriod=30 Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.639260 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": EOF" Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.642557 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.642805 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="65f4685d-7a73-4211-85e2-2bbcb7370be6" containerName="nova-scheduler-scheduler" containerID="cri-o://90ef297c8d6d890cffeea2aece8e734d3092ce06296da538df1f4c52a196bb8f" gracePeriod=30 Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.647320 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": EOF" Mar 
20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.673538 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.674884 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-log" containerID="cri-o://3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b" gracePeriod=30 Mar 20 07:14:47 crc kubenswrapper[4971]: I0320 07:14:47.675091 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-metadata" containerID="cri-o://065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275" gracePeriod=30 Mar 20 07:14:48 crc kubenswrapper[4971]: E0320 07:14:48.269737 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90ef297c8d6d890cffeea2aece8e734d3092ce06296da538df1f4c52a196bb8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:14:48 crc kubenswrapper[4971]: E0320 07:14:48.271532 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90ef297c8d6d890cffeea2aece8e734d3092ce06296da538df1f4c52a196bb8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:14:48 crc kubenswrapper[4971]: E0320 07:14:48.272778 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90ef297c8d6d890cffeea2aece8e734d3092ce06296da538df1f4c52a196bb8f" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:14:48 crc kubenswrapper[4971]: E0320 07:14:48.272933 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="65f4685d-7a73-4211-85e2-2bbcb7370be6" containerName="nova-scheduler-scheduler" Mar 20 07:14:48 crc kubenswrapper[4971]: I0320 07:14:48.427768 4971 generic.go:334] "Generic (PLEG): container finished" podID="555d1d8c-972b-490a-b996-fa6586964070" containerID="e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6" exitCode=143 Mar 20 07:14:48 crc kubenswrapper[4971]: I0320 07:14:48.427854 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555d1d8c-972b-490a-b996-fa6586964070","Type":"ContainerDied","Data":"e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6"} Mar 20 07:14:48 crc kubenswrapper[4971]: I0320 07:14:48.430091 4971 generic.go:334] "Generic (PLEG): container finished" podID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerID="3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b" exitCode=143 Mar 20 07:14:48 crc kubenswrapper[4971]: I0320 07:14:48.430154 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77fdc792-9230-4274-b23d-dd4a21bfc09b","Type":"ContainerDied","Data":"3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b"} Mar 20 07:14:50 crc kubenswrapper[4971]: I0320 07:14:50.481978 4971 scope.go:117] "RemoveContainer" containerID="ae8d76fe41c2f044d059190b95de529e5d48b241720e74b694699ce489d97a88" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.298264 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.430870 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8j2t\" (UniqueName: \"kubernetes.io/projected/77fdc792-9230-4274-b23d-dd4a21bfc09b-kube-api-access-x8j2t\") pod \"77fdc792-9230-4274-b23d-dd4a21bfc09b\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.431109 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-nova-metadata-tls-certs\") pod \"77fdc792-9230-4274-b23d-dd4a21bfc09b\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.431133 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-combined-ca-bundle\") pod \"77fdc792-9230-4274-b23d-dd4a21bfc09b\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.431161 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fdc792-9230-4274-b23d-dd4a21bfc09b-logs\") pod \"77fdc792-9230-4274-b23d-dd4a21bfc09b\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.431236 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-config-data\") pod \"77fdc792-9230-4274-b23d-dd4a21bfc09b\" (UID: \"77fdc792-9230-4274-b23d-dd4a21bfc09b\") " Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.432058 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/77fdc792-9230-4274-b23d-dd4a21bfc09b-logs" (OuterVolumeSpecName: "logs") pod "77fdc792-9230-4274-b23d-dd4a21bfc09b" (UID: "77fdc792-9230-4274-b23d-dd4a21bfc09b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.443861 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77fdc792-9230-4274-b23d-dd4a21bfc09b-kube-api-access-x8j2t" (OuterVolumeSpecName: "kube-api-access-x8j2t") pod "77fdc792-9230-4274-b23d-dd4a21bfc09b" (UID: "77fdc792-9230-4274-b23d-dd4a21bfc09b"). InnerVolumeSpecName "kube-api-access-x8j2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.459210 4971 generic.go:334] "Generic (PLEG): container finished" podID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerID="065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275" exitCode=0 Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.459259 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77fdc792-9230-4274-b23d-dd4a21bfc09b","Type":"ContainerDied","Data":"065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275"} Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.459291 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77fdc792-9230-4274-b23d-dd4a21bfc09b","Type":"ContainerDied","Data":"0314d5800618232183c3f0e47ca589eedb21be6a6dd41d0937460eb4a2bfc049"} Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.459315 4971 scope.go:117] "RemoveContainer" containerID="065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.459323 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.503456 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "77fdc792-9230-4274-b23d-dd4a21bfc09b" (UID: "77fdc792-9230-4274-b23d-dd4a21bfc09b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.511428 4971 scope.go:117] "RemoveContainer" containerID="3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.514194 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-config-data" (OuterVolumeSpecName: "config-data") pod "77fdc792-9230-4274-b23d-dd4a21bfc09b" (UID: "77fdc792-9230-4274-b23d-dd4a21bfc09b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.519649 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77fdc792-9230-4274-b23d-dd4a21bfc09b" (UID: "77fdc792-9230-4274-b23d-dd4a21bfc09b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.533717 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fdc792-9230-4274-b23d-dd4a21bfc09b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.533763 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.533780 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8j2t\" (UniqueName: \"kubernetes.io/projected/77fdc792-9230-4274-b23d-dd4a21bfc09b-kube-api-access-x8j2t\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.533795 4971 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.533808 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fdc792-9230-4274-b23d-dd4a21bfc09b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.597699 4971 scope.go:117] "RemoveContainer" containerID="065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275" Mar 20 07:14:51 crc kubenswrapper[4971]: E0320 07:14:51.598180 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275\": container with ID starting with 065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275 not found: ID does not exist" 
containerID="065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.598278 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275"} err="failed to get container status \"065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275\": rpc error: code = NotFound desc = could not find container \"065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275\": container with ID starting with 065161cfb87f99249a5df1c0c2d94817d5e0618c45b0be4c86426d88b6928275 not found: ID does not exist" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.598358 4971 scope.go:117] "RemoveContainer" containerID="3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b" Mar 20 07:14:51 crc kubenswrapper[4971]: E0320 07:14:51.598791 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b\": container with ID starting with 3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b not found: ID does not exist" containerID="3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.598838 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b"} err="failed to get container status \"3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b\": rpc error: code = NotFound desc = could not find container \"3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b\": container with ID starting with 3392b6ffcbd08668973077d2e5d10b97ae6f9e13d024c2dbbcd6432a3f3afd2b not found: ID does not exist" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.797844 4971 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.812407 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821038 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:51 crc kubenswrapper[4971]: E0320 07:14:51.821428 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c960029-95ef-4f0f-a277-9fb4a8e13198" containerName="dnsmasq-dns" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821446 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c960029-95ef-4f0f-a277-9fb4a8e13198" containerName="dnsmasq-dns" Mar 20 07:14:51 crc kubenswrapper[4971]: E0320 07:14:51.821459 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c960029-95ef-4f0f-a277-9fb4a8e13198" containerName="init" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821466 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c960029-95ef-4f0f-a277-9fb4a8e13198" containerName="init" Mar 20 07:14:51 crc kubenswrapper[4971]: E0320 07:14:51.821474 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" containerName="nova-manage" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821481 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" containerName="nova-manage" Mar 20 07:14:51 crc kubenswrapper[4971]: E0320 07:14:51.821496 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-metadata" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821503 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-metadata" Mar 20 07:14:51 crc kubenswrapper[4971]: E0320 07:14:51.821518 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-log" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821523 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-log" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821705 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-log" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821728 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" containerName="nova-manage" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821735 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c960029-95ef-4f0f-a277-9fb4a8e13198" containerName="dnsmasq-dns" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.821746 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" containerName="nova-metadata-metadata" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.822715 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.825256 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.826133 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.833157 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.943523 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.943632 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.943665 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-logs\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.943745 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-config-data\") pod \"nova-metadata-0\" (UID: 
\"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:51 crc kubenswrapper[4971]: I0320 07:14:51.943766 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfdt9\" (UniqueName: \"kubernetes.io/projected/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-kube-api-access-dfdt9\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.046330 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.046670 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.047050 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-logs\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.047315 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-config-data\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.047814 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfdt9\" (UniqueName: \"kubernetes.io/projected/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-kube-api-access-dfdt9\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.047442 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-logs\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.050531 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-config-data\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.050857 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.070111 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfdt9\" (UniqueName: \"kubernetes.io/projected/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-kube-api-access-dfdt9\") pod \"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.071187 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.141566 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.476485 4971 generic.go:334] "Generic (PLEG): container finished" podID="65f4685d-7a73-4211-85e2-2bbcb7370be6" containerID="90ef297c8d6d890cffeea2aece8e734d3092ce06296da538df1f4c52a196bb8f" exitCode=0 Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.476567 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"65f4685d-7a73-4211-85e2-2bbcb7370be6","Type":"ContainerDied","Data":"90ef297c8d6d890cffeea2aece8e734d3092ce06296da538df1f4c52a196bb8f"} Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.652452 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.764129 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-combined-ca-bundle\") pod \"65f4685d-7a73-4211-85e2-2bbcb7370be6\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.764570 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6cpc\" (UniqueName: \"kubernetes.io/projected/65f4685d-7a73-4211-85e2-2bbcb7370be6-kube-api-access-m6cpc\") pod \"65f4685d-7a73-4211-85e2-2bbcb7370be6\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.764700 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-config-data\") pod 
\"65f4685d-7a73-4211-85e2-2bbcb7370be6\" (UID: \"65f4685d-7a73-4211-85e2-2bbcb7370be6\") " Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.778003 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77fdc792-9230-4274-b23d-dd4a21bfc09b" path="/var/lib/kubelet/pods/77fdc792-9230-4274-b23d-dd4a21bfc09b/volumes" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.781900 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f4685d-7a73-4211-85e2-2bbcb7370be6-kube-api-access-m6cpc" (OuterVolumeSpecName: "kube-api-access-m6cpc") pod "65f4685d-7a73-4211-85e2-2bbcb7370be6" (UID: "65f4685d-7a73-4211-85e2-2bbcb7370be6"). InnerVolumeSpecName "kube-api-access-m6cpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.794062 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-config-data" (OuterVolumeSpecName: "config-data") pod "65f4685d-7a73-4211-85e2-2bbcb7370be6" (UID: "65f4685d-7a73-4211-85e2-2bbcb7370be6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.820007 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65f4685d-7a73-4211-85e2-2bbcb7370be6" (UID: "65f4685d-7a73-4211-85e2-2bbcb7370be6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.836409 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.867159 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6cpc\" (UniqueName: \"kubernetes.io/projected/65f4685d-7a73-4211-85e2-2bbcb7370be6-kube-api-access-m6cpc\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.867195 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:52 crc kubenswrapper[4971]: I0320 07:14:52.867206 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f4685d-7a73-4211-85e2-2bbcb7370be6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:53 crc kubenswrapper[4971]: E0320 07:14:53.115014 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod555d1d8c_972b_490a_b996_fa6586964070.slice/crio-370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca.scope\": RecentStats: unable to find data in memory cache]" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.469308 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.496501 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"65f4685d-7a73-4211-85e2-2bbcb7370be6","Type":"ContainerDied","Data":"62949cda5317360a782bb73e73c52e1295f770f3bcdc3cad056fa3ee609a5930"} Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.496558 4971 scope.go:117] "RemoveContainer" containerID="90ef297c8d6d890cffeea2aece8e734d3092ce06296da538df1f4c52a196bb8f" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.496724 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.507976 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a","Type":"ContainerStarted","Data":"e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2"} Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.508019 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a","Type":"ContainerStarted","Data":"a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e"} Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.508034 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a","Type":"ContainerStarted","Data":"4a9e9a5daa6a85c30810fb5af7b3c83cb19f48669d1b896a371a8e890b99d600"} Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.521065 4971 generic.go:334] "Generic (PLEG): container finished" podID="555d1d8c-972b-490a-b996-fa6586964070" containerID="370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca" exitCode=0 Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.521192 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.521182 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555d1d8c-972b-490a-b996-fa6586964070","Type":"ContainerDied","Data":"370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca"} Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.521780 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555d1d8c-972b-490a-b996-fa6586964070","Type":"ContainerDied","Data":"f981cc2f456aa4bb0e97573f0be0e8b906fbb033fe995297227c71b68b8d8dc3"} Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.538667 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.538626745 podStartE2EDuration="2.538626745s" podCreationTimestamp="2026-03-20 07:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:53.526713196 +0000 UTC m=+1515.506587364" watchObservedRunningTime="2026-03-20 07:14:53.538626745 +0000 UTC m=+1515.518500883" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.578280 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-config-data\") pod \"555d1d8c-972b-490a-b996-fa6586964070\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.578349 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555d1d8c-972b-490a-b996-fa6586964070-logs\") pod \"555d1d8c-972b-490a-b996-fa6586964070\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.578393 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pd2rf\" (UniqueName: \"kubernetes.io/projected/555d1d8c-972b-490a-b996-fa6586964070-kube-api-access-pd2rf\") pod \"555d1d8c-972b-490a-b996-fa6586964070\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.578526 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-combined-ca-bundle\") pod \"555d1d8c-972b-490a-b996-fa6586964070\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.578666 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-public-tls-certs\") pod \"555d1d8c-972b-490a-b996-fa6586964070\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.578960 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-internal-tls-certs\") pod \"555d1d8c-972b-490a-b996-fa6586964070\" (UID: \"555d1d8c-972b-490a-b996-fa6586964070\") " Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.579936 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555d1d8c-972b-490a-b996-fa6586964070-logs" (OuterVolumeSpecName: "logs") pod "555d1d8c-972b-490a-b996-fa6586964070" (UID: "555d1d8c-972b-490a-b996-fa6586964070"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.603923 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555d1d8c-972b-490a-b996-fa6586964070-kube-api-access-pd2rf" (OuterVolumeSpecName: "kube-api-access-pd2rf") pod "555d1d8c-972b-490a-b996-fa6586964070" (UID: "555d1d8c-972b-490a-b996-fa6586964070"). InnerVolumeSpecName "kube-api-access-pd2rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.621017 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.621575 4971 scope.go:117] "RemoveContainer" containerID="370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.657942 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "555d1d8c-972b-490a-b996-fa6586964070" (UID: "555d1d8c-972b-490a-b996-fa6586964070"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.660523 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-config-data" (OuterVolumeSpecName: "config-data") pod "555d1d8c-972b-490a-b996-fa6586964070" (UID: "555d1d8c-972b-490a-b996-fa6586964070"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.667765 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.672933 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "555d1d8c-972b-490a-b996-fa6586964070" (UID: "555d1d8c-972b-490a-b996-fa6586964070"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.674853 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "555d1d8c-972b-490a-b996-fa6586964070" (UID: "555d1d8c-972b-490a-b996-fa6586964070"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.677008 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:14:54 crc kubenswrapper[4971]: E0320 07:14:53.677954 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-log" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.677975 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-log" Mar 20 07:14:54 crc kubenswrapper[4971]: E0320 07:14:53.677994 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f4685d-7a73-4211-85e2-2bbcb7370be6" containerName="nova-scheduler-scheduler" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.678000 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f4685d-7a73-4211-85e2-2bbcb7370be6" containerName="nova-scheduler-scheduler" Mar 20 07:14:54 crc kubenswrapper[4971]: E0320 07:14:53.678050 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-api" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.678058 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-api" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.678286 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-api" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.678305 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="555d1d8c-972b-490a-b996-fa6586964070" containerName="nova-api-log" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.678319 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f4685d-7a73-4211-85e2-2bbcb7370be6" 
containerName="nova-scheduler-scheduler" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.680107 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.681339 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.681365 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.681377 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555d1d8c-972b-490a-b996-fa6586964070-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.681388 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd2rf\" (UniqueName: \"kubernetes.io/projected/555d1d8c-972b-490a-b996-fa6586964070-kube-api-access-pd2rf\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.681400 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.681408 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d1d8c-972b-490a-b996-fa6586964070-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.682924 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 
07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.686550 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.762080 4971 scope.go:117] "RemoveContainer" containerID="e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.781428 4971 scope.go:117] "RemoveContainer" containerID="370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca"
Mar 20 07:14:54 crc kubenswrapper[4971]: E0320 07:14:53.781777 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca\": container with ID starting with 370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca not found: ID does not exist" containerID="370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.781812 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca"} err="failed to get container status \"370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca\": rpc error: code = NotFound desc = could not find container \"370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca\": container with ID starting with 370f7604091e5db7ea849130874b92f0f92ba819810dffbb247925f41784c1ca not found: ID does not exist"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.781839 4971 scope.go:117] "RemoveContainer" containerID="e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6"
Mar 20 07:14:54 crc kubenswrapper[4971]: E0320 07:14:53.782176 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6\": container with ID starting with e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6 not found: ID does not exist" containerID="e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.782196 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6"} err="failed to get container status \"e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6\": rpc error: code = NotFound desc = could not find container \"e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6\": container with ID starting with e4a09276e25eb37776abbfa177385da65bb15ed85f948cf6eec39806cbdbcaf6 not found: ID does not exist"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.783258 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw6vb\" (UniqueName: \"kubernetes.io/projected/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-kube-api-access-mw6vb\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.783444 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-config-data\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.783477 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.863007 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.881799 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.884955 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-config-data\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.885001 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.885139 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw6vb\" (UniqueName: \"kubernetes.io/projected/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-kube-api-access-mw6vb\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.892709 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.895643 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.898458 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.898758 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.899201 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-config-data\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.902115 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.903479 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.906845 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.912729 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw6vb\" (UniqueName: \"kubernetes.io/projected/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-kube-api-access-mw6vb\") pod \"nova-scheduler-0\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") " pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.987388 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9g8q\" (UniqueName: \"kubernetes.io/projected/e98717ba-3768-4817-a29a-09114e106105-kube-api-access-z9g8q\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.987439 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.987473 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-public-tls-certs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.987490 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-config-data\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.987510 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:53.987554 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e98717ba-3768-4817-a29a-09114e106105-logs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.063024 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.089415 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9g8q\" (UniqueName: \"kubernetes.io/projected/e98717ba-3768-4817-a29a-09114e106105-kube-api-access-z9g8q\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.089455 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.089485 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-public-tls-certs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.089505 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-config-data\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.089526 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.089578 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e98717ba-3768-4817-a29a-09114e106105-logs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.089965 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e98717ba-3768-4817-a29a-09114e106105-logs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.094529 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.094851 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-config-data\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.095394 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.095837 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-public-tls-certs\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.105988 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9g8q\" (UniqueName: \"kubernetes.io/projected/e98717ba-3768-4817-a29a-09114e106105-kube-api-access-z9g8q\") pod \"nova-api-0\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.221407 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.512148 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 07:14:54 crc kubenswrapper[4971]: W0320 07:14:54.520954 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7136000_46aa_4fe0_acaf_5ad1ed94e4b3.slice/crio-bf89cadf2cef0cae3f0b34593796e6584df0ae662b23a7e3e30070db3236da04 WatchSource:0}: Error finding container bf89cadf2cef0cae3f0b34593796e6584df0ae662b23a7e3e30070db3236da04: Status 404 returned error can't find the container with id bf89cadf2cef0cae3f0b34593796e6584df0ae662b23a7e3e30070db3236da04
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.693864 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.744746 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555d1d8c-972b-490a-b996-fa6586964070" path="/var/lib/kubelet/pods/555d1d8c-972b-490a-b996-fa6586964070/volumes"
Mar 20 07:14:54 crc kubenswrapper[4971]: I0320 07:14:54.745675 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f4685d-7a73-4211-85e2-2bbcb7370be6" path="/var/lib/kubelet/pods/65f4685d-7a73-4211-85e2-2bbcb7370be6/volumes"
Mar 20 07:14:55 crc kubenswrapper[4971]: I0320 07:14:55.551682 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e98717ba-3768-4817-a29a-09114e106105","Type":"ContainerStarted","Data":"ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101"}
Mar 20 07:14:55 crc kubenswrapper[4971]: I0320 07:14:55.552040 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e98717ba-3768-4817-a29a-09114e106105","Type":"ContainerStarted","Data":"b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903"}
Mar 20 07:14:55 crc kubenswrapper[4971]: I0320 07:14:55.552055 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e98717ba-3768-4817-a29a-09114e106105","Type":"ContainerStarted","Data":"05fece7797660c922c649a9f7d64fec23f399b3319ea84ff06c20b0824cdc2c7"}
Mar 20 07:14:55 crc kubenswrapper[4971]: I0320 07:14:55.556129 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3","Type":"ContainerStarted","Data":"9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24"}
Mar 20 07:14:55 crc kubenswrapper[4971]: I0320 07:14:55.556175 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3","Type":"ContainerStarted","Data":"bf89cadf2cef0cae3f0b34593796e6584df0ae662b23a7e3e30070db3236da04"}
Mar 20 07:14:55 crc kubenswrapper[4971]: I0320 07:14:55.593468 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.593448789 podStartE2EDuration="2.593448789s" podCreationTimestamp="2026-03-20 07:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:55.586710114 +0000 UTC m=+1517.566584352" watchObservedRunningTime="2026-03-20 07:14:55.593448789 +0000 UTC m=+1517.573322917"
Mar 20 07:14:55 crc kubenswrapper[4971]: I0320 07:14:55.628462 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.628437267 podStartE2EDuration="2.628437267s" podCreationTimestamp="2026-03-20 07:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:14:55.611338934 +0000 UTC m=+1517.591213072" watchObservedRunningTime="2026-03-20 07:15:55.628437267 +0000 UTC m=+1517.608311435"
Mar 20 07:14:59 crc kubenswrapper[4971]: I0320 07:14:59.063517 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.145824 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"]
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.147364 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.151812 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.152275 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.163323 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"]
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.253183 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bkql\" (UniqueName: \"kubernetes.io/projected/4516e191-9ff2-4b88-a20e-66edfa04a8d7-kube-api-access-2bkql\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.253259 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4516e191-9ff2-4b88-a20e-66edfa04a8d7-secret-volume\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.253361 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4516e191-9ff2-4b88-a20e-66edfa04a8d7-config-volume\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.355178 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4516e191-9ff2-4b88-a20e-66edfa04a8d7-config-volume\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.355748 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bkql\" (UniqueName: \"kubernetes.io/projected/4516e191-9ff2-4b88-a20e-66edfa04a8d7-kube-api-access-2bkql\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.355975 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4516e191-9ff2-4b88-a20e-66edfa04a8d7-secret-volume\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.356859 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4516e191-9ff2-4b88-a20e-66edfa04a8d7-config-volume\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.366084 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4516e191-9ff2-4b88-a20e-66edfa04a8d7-secret-volume\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.389730 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bkql\" (UniqueName: \"kubernetes.io/projected/4516e191-9ff2-4b88-a20e-66edfa04a8d7-kube-api-access-2bkql\") pod \"collect-profiles-29566515-qz9wn\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.480235 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:00 crc kubenswrapper[4971]: I0320 07:15:00.845531 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"]
Mar 20 07:15:01 crc kubenswrapper[4971]: I0320 07:15:01.641860 4971 generic.go:334] "Generic (PLEG): container finished" podID="4516e191-9ff2-4b88-a20e-66edfa04a8d7" containerID="ea87d8d39a379069f57550ef31343612a37996e2f76e0e2acef4d11d633aabe4" exitCode=0
Mar 20 07:15:01 crc kubenswrapper[4971]: I0320 07:15:01.642064 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn" event={"ID":"4516e191-9ff2-4b88-a20e-66edfa04a8d7","Type":"ContainerDied","Data":"ea87d8d39a379069f57550ef31343612a37996e2f76e0e2acef4d11d633aabe4"}
Mar 20 07:15:01 crc kubenswrapper[4971]: I0320 07:15:01.643122 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn" event={"ID":"4516e191-9ff2-4b88-a20e-66edfa04a8d7","Type":"ContainerStarted","Data":"06c8b921b957d7cbdaa9813fc4af1d840e3ced991ab635a7c25352c88ad4705f"}
Mar 20 07:15:02 crc kubenswrapper[4971]: I0320 07:15:02.142221 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 07:15:02 crc kubenswrapper[4971]: I0320 07:15:02.143065 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.009350 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.109764 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bkql\" (UniqueName: \"kubernetes.io/projected/4516e191-9ff2-4b88-a20e-66edfa04a8d7-kube-api-access-2bkql\") pod \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") "
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.111355 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4516e191-9ff2-4b88-a20e-66edfa04a8d7-secret-volume\") pod \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") "
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.111471 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4516e191-9ff2-4b88-a20e-66edfa04a8d7-config-volume\") pod \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\" (UID: \"4516e191-9ff2-4b88-a20e-66edfa04a8d7\") "
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.112397 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4516e191-9ff2-4b88-a20e-66edfa04a8d7-config-volume" (OuterVolumeSpecName: "config-volume") pod "4516e191-9ff2-4b88-a20e-66edfa04a8d7" (UID: "4516e191-9ff2-4b88-a20e-66edfa04a8d7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.116051 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4516e191-9ff2-4b88-a20e-66edfa04a8d7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4516e191-9ff2-4b88-a20e-66edfa04a8d7" (UID: "4516e191-9ff2-4b88-a20e-66edfa04a8d7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.120447 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4516e191-9ff2-4b88-a20e-66edfa04a8d7-kube-api-access-2bkql" (OuterVolumeSpecName: "kube-api-access-2bkql") pod "4516e191-9ff2-4b88-a20e-66edfa04a8d7" (UID: "4516e191-9ff2-4b88-a20e-66edfa04a8d7"). InnerVolumeSpecName "kube-api-access-2bkql". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.159812 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.159817 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.218949 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bkql\" (UniqueName: \"kubernetes.io/projected/4516e191-9ff2-4b88-a20e-66edfa04a8d7-kube-api-access-2bkql\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.218989 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4516e191-9ff2-4b88-a20e-66edfa04a8d7-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.219003 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4516e191-9ff2-4b88-a20e-66edfa04a8d7-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.665829 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn" event={"ID":"4516e191-9ff2-4b88-a20e-66edfa04a8d7","Type":"ContainerDied","Data":"06c8b921b957d7cbdaa9813fc4af1d840e3ced991ab635a7c25352c88ad4705f"}
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.666297 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06c8b921b957d7cbdaa9813fc4af1d840e3ced991ab635a7c25352c88ad4705f"
Mar 20 07:15:03 crc kubenswrapper[4971]: I0320 07:15:03.665910 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"
Mar 20 07:15:04 crc kubenswrapper[4971]: I0320 07:15:04.063959 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 07:15:04 crc kubenswrapper[4971]: I0320 07:15:04.095725 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 07:15:04 crc kubenswrapper[4971]: I0320 07:15:04.222307 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 07:15:04 crc kubenswrapper[4971]: I0320 07:15:04.223950 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 07:15:04 crc kubenswrapper[4971]: I0320 07:15:04.630717 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 07:15:04 crc kubenswrapper[4971]: I0320 07:15:04.750908 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 07:15:05 crc kubenswrapper[4971]: I0320 07:15:05.235929 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 07:15:05 crc kubenswrapper[4971]: I0320 07:15:05.235993 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 07:15:10 crc kubenswrapper[4971]: I0320 07:15:10.142757 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 07:15:10 crc kubenswrapper[4971]: I0320 07:15:10.143121 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 07:15:12 crc kubenswrapper[4971]: I0320 07:15:12.147965 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 07:15:12 crc kubenswrapper[4971]: I0320 07:15:12.152487 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 07:15:12 crc kubenswrapper[4971]: I0320 07:15:12.153368 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 07:15:12 crc kubenswrapper[4971]: I0320 07:15:12.221960 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 07:15:12 crc kubenswrapper[4971]: I0320 07:15:12.222072 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 07:15:12 crc kubenswrapper[4971]: I0320 07:15:12.811081 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 07:15:14 crc kubenswrapper[4971]: I0320 07:15:14.234993 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 07:15:14 crc kubenswrapper[4971]: I0320 07:15:14.236197 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 07:15:14 crc kubenswrapper[4971]: I0320 07:15:14.244905 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 07:15:14 crc kubenswrapper[4971]: I0320 07:15:14.829846 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 07:15:20 crc kubenswrapper[4971]: I0320 07:15:20.162428 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:15:20 crc kubenswrapper[4971]: I0320 07:15:20.163253 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.844065 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-t4xw9"]
Mar 20 07:15:31 crc kubenswrapper[4971]: E0320 07:15:31.845532 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4516e191-9ff2-4b88-a20e-66edfa04a8d7" containerName="collect-profiles"
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.845549 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4516e191-9ff2-4b88-a20e-66edfa04a8d7" containerName="collect-profiles"
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.845765 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4516e191-9ff2-4b88-a20e-66edfa04a8d7" containerName="collect-profiles"
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.846413 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-t4xw9"
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.855972 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.880359 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-t4xw9"]
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.955877 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7dwd5"]
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.977962 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 20 07:15:31 crc kubenswrapper[4971]: I0320 07:15:31.978484 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b7c8d202-039a-426e-b5e5-916bb257e441" containerName="openstackclient" containerID="cri-o://c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df" gracePeriod=2
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.003790 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm9mw\" (UniqueName: \"kubernetes.io/projected/c3c07d77-e97d-4251-a86d-c0c67c52be23-kube-api-access-lm9mw\") pod \"root-account-create-update-t4xw9\" (UID: \"c3c07d77-e97d-4251-a86d-c0c67c52be23\") " pod="openstack/root-account-create-update-t4xw9"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.004043 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts\") pod \"root-account-create-update-t4xw9\" (UID: \"c3c07d77-e97d-4251-a86d-c0c67c52be23\") " pod="openstack/root-account-create-update-t4xw9"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.028124 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.043120 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-08bb-account-create-update-tqhvf"]
Mar 20 07:15:32 crc kubenswrapper[4971]: E0320 07:15:32.043500 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c8d202-039a-426e-b5e5-916bb257e441" containerName="openstackclient"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.043511 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c8d202-039a-426e-b5e5-916bb257e441" containerName="openstackclient"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.043728 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7c8d202-039a-426e-b5e5-916bb257e441" containerName="openstackclient"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.044309 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08bb-account-create-update-tqhvf"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.066331 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.087886 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7dwd5"]
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.104809 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-08bb-account-create-update-tqhvf"]
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.105675 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm9mw\" (UniqueName: \"kubernetes.io/projected/c3c07d77-e97d-4251-a86d-c0c67c52be23-kube-api-access-lm9mw\") pod \"root-account-create-update-t4xw9\" (UID: \"c3c07d77-e97d-4251-a86d-c0c67c52be23\") " pod="openstack/root-account-create-update-t4xw9"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.105723 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts\") pod \"root-account-create-update-t4xw9\" (UID: \"c3c07d77-e97d-4251-a86d-c0c67c52be23\") " pod="openstack/root-account-create-update-t4xw9"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.106500 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts\") pod \"root-account-create-update-t4xw9\" (UID: \"c3c07d77-e97d-4251-a86d-c0c67c52be23\") " pod="openstack/root-account-create-update-t4xw9"
Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.207685 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-08bb-account-create-update-hw4s7"]
Mar 20 07:15:32 crc 
kubenswrapper[4971]: I0320 07:15:32.208863 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19961c12-d700-4796-a618-4e625c12649b-operator-scripts\") pod \"placement-08bb-account-create-update-tqhvf\" (UID: \"19961c12-d700-4796-a618-4e625c12649b\") " pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.208939 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh5td\" (UniqueName: \"kubernetes.io/projected/19961c12-d700-4796-a618-4e625c12649b-kube-api-access-rh5td\") pod \"placement-08bb-account-create-update-tqhvf\" (UID: \"19961c12-d700-4796-a618-4e625c12649b\") " pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.257400 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm9mw\" (UniqueName: \"kubernetes.io/projected/c3c07d77-e97d-4251-a86d-c0c67c52be23-kube-api-access-lm9mw\") pod \"root-account-create-update-t4xw9\" (UID: \"c3c07d77-e97d-4251-a86d-c0c67c52be23\") " pod="openstack/root-account-create-update-t4xw9" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.276672 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-08bb-account-create-update-hw4s7"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.294674 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2836-account-create-update-zn4bs"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.295922 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2836-account-create-update-zn4bs" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.303387 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-60a3-account-create-update-ck92h"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.304801 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a3-account-create-update-ck92h" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.312972 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.319265 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19961c12-d700-4796-a618-4e625c12649b-operator-scripts\") pod \"placement-08bb-account-create-update-tqhvf\" (UID: \"19961c12-d700-4796-a618-4e625c12649b\") " pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.319447 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh5td\" (UniqueName: \"kubernetes.io/projected/19961c12-d700-4796-a618-4e625c12649b-kube-api-access-rh5td\") pod \"placement-08bb-account-create-update-tqhvf\" (UID: \"19961c12-d700-4796-a618-4e625c12649b\") " pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.323861 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19961c12-d700-4796-a618-4e625c12649b-operator-scripts\") pod \"placement-08bb-account-create-update-tqhvf\" (UID: \"19961c12-d700-4796-a618-4e625c12649b\") " pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.334944 4971 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"neutron-db-secret" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.356810 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-60a3-account-create-update-ck92h"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.367172 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh5td\" (UniqueName: \"kubernetes.io/projected/19961c12-d700-4796-a618-4e625c12649b-kube-api-access-rh5td\") pod \"placement-08bb-account-create-update-tqhvf\" (UID: \"19961c12-d700-4796-a618-4e625c12649b\") " pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.440676 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2836-account-create-update-zn4bs"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.441115 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.441327 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54576591-c271-46a6-8b8d-8973dbe4bb8e-operator-scripts\") pod \"barbican-2836-account-create-update-zn4bs\" (UID: \"54576591-c271-46a6-8b8d-8973dbe4bb8e\") " pod="openstack/barbican-2836-account-create-update-zn4bs" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.441392 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36683fcd-086d-4202-8e1d-5ae325ea95b1-operator-scripts\") pod \"neutron-60a3-account-create-update-ck92h\" (UID: \"36683fcd-086d-4202-8e1d-5ae325ea95b1\") " pod="openstack/neutron-60a3-account-create-update-ck92h" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.447441 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvrcb\" (UniqueName: \"kubernetes.io/projected/36683fcd-086d-4202-8e1d-5ae325ea95b1-kube-api-access-gvrcb\") pod \"neutron-60a3-account-create-update-ck92h\" (UID: \"36683fcd-086d-4202-8e1d-5ae325ea95b1\") " pod="openstack/neutron-60a3-account-create-update-ck92h" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.447501 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsdtk\" (UniqueName: \"kubernetes.io/projected/54576591-c271-46a6-8b8d-8973dbe4bb8e-kube-api-access-dsdtk\") pod \"barbican-2836-account-create-update-zn4bs\" (UID: \"54576591-c271-46a6-8b8d-8973dbe4bb8e\") " pod="openstack/barbican-2836-account-create-update-zn4bs" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.473042 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-t4xw9" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.503119 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.503393 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="ovn-northd" containerID="cri-o://3ca1eeff736f7dc8000796f67df497ff16c309981beedb9789a0e4a334c2f180" gracePeriod=30 Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.503871 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="openstack-network-exporter" containerID="cri-o://78ce6f75fdf4b8134a001b00c9380d6a9b0e6b28e02c6c557c6a2c9097b03dab" gracePeriod=30 Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.533206 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 
20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.564475 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54576591-c271-46a6-8b8d-8973dbe4bb8e-operator-scripts\") pod \"barbican-2836-account-create-update-zn4bs\" (UID: \"54576591-c271-46a6-8b8d-8973dbe4bb8e\") " pod="openstack/barbican-2836-account-create-update-zn4bs" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.564530 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36683fcd-086d-4202-8e1d-5ae325ea95b1-operator-scripts\") pod \"neutron-60a3-account-create-update-ck92h\" (UID: \"36683fcd-086d-4202-8e1d-5ae325ea95b1\") " pod="openstack/neutron-60a3-account-create-update-ck92h" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.564567 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvrcb\" (UniqueName: \"kubernetes.io/projected/36683fcd-086d-4202-8e1d-5ae325ea95b1-kube-api-access-gvrcb\") pod \"neutron-60a3-account-create-update-ck92h\" (UID: \"36683fcd-086d-4202-8e1d-5ae325ea95b1\") " pod="openstack/neutron-60a3-account-create-update-ck92h" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.564586 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsdtk\" (UniqueName: \"kubernetes.io/projected/54576591-c271-46a6-8b8d-8973dbe4bb8e-kube-api-access-dsdtk\") pod \"barbican-2836-account-create-update-zn4bs\" (UID: \"54576591-c271-46a6-8b8d-8973dbe4bb8e\") " pod="openstack/barbican-2836-account-create-update-zn4bs" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.565305 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54576591-c271-46a6-8b8d-8973dbe4bb8e-operator-scripts\") pod \"barbican-2836-account-create-update-zn4bs\" (UID: 
\"54576591-c271-46a6-8b8d-8973dbe4bb8e\") " pod="openstack/barbican-2836-account-create-update-zn4bs" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.565497 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36683fcd-086d-4202-8e1d-5ae325ea95b1-operator-scripts\") pod \"neutron-60a3-account-create-update-ck92h\" (UID: \"36683fcd-086d-4202-8e1d-5ae325ea95b1\") " pod="openstack/neutron-60a3-account-create-update-ck92h" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.624480 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvrcb\" (UniqueName: \"kubernetes.io/projected/36683fcd-086d-4202-8e1d-5ae325ea95b1-kube-api-access-gvrcb\") pod \"neutron-60a3-account-create-update-ck92h\" (UID: \"36683fcd-086d-4202-8e1d-5ae325ea95b1\") " pod="openstack/neutron-60a3-account-create-update-ck92h" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.649249 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsdtk\" (UniqueName: \"kubernetes.io/projected/54576591-c271-46a6-8b8d-8973dbe4bb8e-kube-api-access-dsdtk\") pod \"barbican-2836-account-create-update-zn4bs\" (UID: \"54576591-c271-46a6-8b8d-8973dbe4bb8e\") " pod="openstack/barbican-2836-account-create-update-zn4bs" Mar 20 07:15:32 crc kubenswrapper[4971]: E0320 07:15:32.674287 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 07:15:32 crc kubenswrapper[4971]: E0320 07:15:32.674341 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data podName:1023ed0c-3abe-4cff-987f-52544b885696 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:33.174325062 +0000 UTC m=+1555.154199200 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data") pod "rabbitmq-cell1-server-0" (UID: "1023ed0c-3abe-4cff-987f-52544b885696") : configmap "rabbitmq-cell1-config-data" not found Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.751838 4971 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/swift-storage-0" secret="" err="secret \"swift-swift-dockercfg-hchrl\" not found" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.765677 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2836-account-create-update-zn4bs" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.800852 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bbd968-9965-427c-b176-00754eb9887c" path="/var/lib/kubelet/pods/38bbd968-9965-427c-b176-00754eb9887c/volumes" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.819098 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-60a3-account-create-update-ck92h" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.832437 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543e9b24-4327-4fcf-8d2c-9316d2cc7b8e" path="/var/lib/kubelet/pods/543e9b24-4327-4fcf-8d2c-9316d2cc7b8e/volumes" Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.833121 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2836-account-create-update-mzz6z"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.840472 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2836-account-create-update-mzz6z"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.840645 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gdzww"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.840718 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-60a3-account-create-update-5zftq"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.840796 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-60a3-account-create-update-5zftq"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.840903 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gdzww"] Mar 20 07:15:32 crc kubenswrapper[4971]: I0320 07:15:32.840975 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-p9zph"] Mar 20 07:15:32 crc kubenswrapper[4971]: E0320 07:15:32.925122 4971 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 07:15:32 crc kubenswrapper[4971]: E0320 07:15:32.925591 4971 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 07:15:32 crc kubenswrapper[4971]: E0320 07:15:32.925620 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: 
configmap "swift-ring-files" not found Mar 20 07:15:32 crc kubenswrapper[4971]: E0320 07:15:32.925636 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:32 crc kubenswrapper[4971]: E0320 07:15:32.925794 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:33.425774459 +0000 UTC m=+1555.405648597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.086258 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-p9zph"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.162169 4971 generic.go:334] "Generic (PLEG): container finished" podID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerID="78ce6f75fdf4b8134a001b00c9380d6a9b0e6b28e02c6c557c6a2c9097b03dab" exitCode=2 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.162220 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9064c65d-26e4-4d53-a1c1-b88f932b76be","Type":"ContainerDied","Data":"78ce6f75fdf4b8134a001b00c9380d6a9b0e6b28e02c6c557c6a2c9097b03dab"} Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.173215 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.173746 4971 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerName="openstack-network-exporter" containerID="cri-o://2055269c243f22fc79e97f8f94001b54b54ef95b9b6b344a99916b8ea073e623" gracePeriod=300 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.210147 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-30f2-account-create-update-v9jm4"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.220750 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-30f2-account-create-update-v9jm4"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.237477 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7vflp"] Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.240951 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.241021 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data podName:1023ed0c-3abe-4cff-987f-52544b885696 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:34.241005321 +0000 UTC m=+1556.220879459 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data") pod "rabbitmq-cell1-server-0" (UID: "1023ed0c-3abe-4cff-987f-52544b885696") : configmap "rabbitmq-cell1-config-data" not found Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.246676 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7vflp"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.252844 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.253453 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerName="openstack-network-exporter" containerID="cri-o://46bf584ea3433e9ac46772bd8db9e28130ad0da039469060e5e16331e01c70aa" gracePeriod=300 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.262402 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-msxs5"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.262831 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" podUID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" containerName="dnsmasq-dns" containerID="cri-o://3a2e05f37831e00f2f4d153e39ac7c8308e64ddecc8706e5d6c45b52312a4203" gracePeriod=10 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.307317 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-30f2-account-create-update-p2jpv"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.310716 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.316370 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.337950 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.342643 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76584044-a0af-481c-961f-1c5351cb8f06-operator-scripts\") pod \"nova-api-30f2-account-create-update-p2jpv\" (UID: \"76584044-a0af-481c-961f-1c5351cb8f06\") " pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.342708 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb4m6\" (UniqueName: \"kubernetes.io/projected/76584044-a0af-481c-961f-1c5351cb8f06-kube-api-access-kb4m6\") pod \"nova-api-30f2-account-create-update-p2jpv\" (UID: \"76584044-a0af-481c-961f-1c5351cb8f06\") " pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.364169 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30f2-account-create-update-p2jpv"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.396682 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fqqjk"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.413059 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerName="ovsdbserver-nb" containerID="cri-o://808a7c9f249b97193158020c6f0533d2ec2b303417e0c3080125fc1cc2867777" gracePeriod=300 Mar 20 07:15:33 crc 
kubenswrapper[4971]: I0320 07:15:33.413948 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fqqjk"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.424166 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-88l8f"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.424403 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-88l8f" podUID="852c9ecc-4ab1-4142-8a63-e73c265c3866" containerName="openstack-network-exporter" containerID="cri-o://cc1977222a2687951e3071cbfe9eecf3bbf33195805469e1592c465e2505d12a" gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.453012 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-njjzt"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.460573 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pfhnr"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.466979 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76584044-a0af-481c-961f-1c5351cb8f06-operator-scripts\") pod \"nova-api-30f2-account-create-update-p2jpv\" (UID: \"76584044-a0af-481c-961f-1c5351cb8f06\") " pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.467054 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb4m6\" (UniqueName: \"kubernetes.io/projected/76584044-a0af-481c-961f-1c5351cb8f06-kube-api-access-kb4m6\") pod \"nova-api-30f2-account-create-update-p2jpv\" (UID: \"76584044-a0af-481c-961f-1c5351cb8f06\") " pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.468605 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/76584044-a0af-481c-961f-1c5351cb8f06-operator-scripts\") pod \"nova-api-30f2-account-create-update-p2jpv\" (UID: \"76584044-a0af-481c-961f-1c5351cb8f06\") " pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.468725 4971 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.468742 4971 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.468750 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.468761 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.468805 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:34.468789914 +0000 UTC m=+1556.448664052 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.468841 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.468862 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data podName:a71f8d2d-729c-4b7c-89c7-a06bd2216978 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:33.968854585 +0000 UTC m=+1555.948728723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data") pod "rabbitmq-server-0" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978") : configmap "rabbitmq-config-data" not found Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.488649 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerName="ovsdbserver-sb" containerID="cri-o://68ce7b00a908c9710adb361927cb192e2b39212d020dcbf9508b46cd83c87f9b" gracePeriod=300 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.509366 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb4m6\" (UniqueName: \"kubernetes.io/projected/76584044-a0af-481c-961f-1c5351cb8f06-kube-api-access-kb4m6\") pod \"nova-api-30f2-account-create-update-p2jpv\" (UID: \"76584044-a0af-481c-961f-1c5351cb8f06\") " pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.532297 4971 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-gvx7m"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.561924 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-gvx7m"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.611503 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-bde9-account-create-update-5hn59"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.667752 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rjxlr"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.703378 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-bde9-account-create-update-5hn59"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.714478 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.718764 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-rjxlr"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.730905 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.731189 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerName="glance-log" containerID="cri-o://87f82d49936752bb6c84997b496c4f49bd0051001c3fe0ed7aff5c22606bc913" gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.731473 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerName="glance-httpd" containerID="cri-o://8e933255e2bcc27abf971627e50a388be00c68eb771f36803f6707174021661b" 
gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.799407 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.799702 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerName="cinder-scheduler" containerID="cri-o://f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874" gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.799836 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerName="probe" containerID="cri-o://31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c" gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.847946 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.848213 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerName="glance-log" containerID="cri-o://c5035eee8b5df80277aacaae4c435bb925d2c563411e00ccd53787c9f32033d9" gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.848671 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerName="glance-httpd" containerID="cri-o://68ae6ce1a0d2429fc4431d1c48cbd5670a70a1125fc1820cb1e6abfe59df9ea9" gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.873679 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f0fc-account-create-update-tl2hm"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.889064 4971 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f0fc-account-create-update-tl2hm"] Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.893707 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:33 crc kubenswrapper[4971]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:33 crc kubenswrapper[4971]: Mar 20 07:15:33 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:33 crc kubenswrapper[4971]: Mar 20 07:15:33 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:33 crc kubenswrapper[4971]: Mar 20 07:15:33 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:33 crc kubenswrapper[4971]: Mar 20 07:15:33 crc kubenswrapper[4971]: if [ -n "" ]; then Mar 20 07:15:33 crc kubenswrapper[4971]: GRANT_DATABASE="" Mar 20 07:15:33 crc kubenswrapper[4971]: else Mar 20 07:15:33 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:33 crc kubenswrapper[4971]: fi Mar 20 07:15:33 crc kubenswrapper[4971]: Mar 20 07:15:33 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:33 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:33 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:33 crc kubenswrapper[4971]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:33 crc kubenswrapper[4971]: # support updates Mar 20 07:15:33 crc kubenswrapper[4971]: Mar 20 07:15:33 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:33 crc kubenswrapper[4971]: E0320 07:15:33.895177 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-t4xw9" podUID="c3c07d77-e97d-4251-a86d-c0c67c52be23" Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.960427 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.960732 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api-log" containerID="cri-o://8862358b81fede9115c50c47e167a72f0bd5aec1ea665447f7122a28b9e9ceea" gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.960813 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api" containerID="cri-o://2d28427254b93eecd5e3f48d2294727ab000a8f924161e1e67d604b6de308a34" gracePeriod=30 Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.986053 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68ff6b454b-v87kx"] Mar 20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.986367 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68ff6b454b-v87kx" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerName="placement-log" containerID="cri-o://a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119" gracePeriod=30 Mar 
20 07:15:33 crc kubenswrapper[4971]: I0320 07:15:33.986487 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68ff6b454b-v87kx" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerName="placement-api" containerID="cri-o://a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.002878 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.002928 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data podName:a71f8d2d-729c-4b7c-89c7-a06bd2216978 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:35.002915207 +0000 UTC m=+1556.982789345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data") pod "rabbitmq-server-0" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978") : configmap "rabbitmq-config-data" not found Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.044072 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 20 07:15:34 crc kubenswrapper[4971]: W0320 07:15:34.052882 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19961c12_d700_4796_a618_4e625c12649b.slice/crio-0521f2c683147e43a2de06c0ba71818a2ceb6b7adf55031ae58f06be236b0b66 WatchSource:0}: Error finding container 0521f2c683147e43a2de06c0ba71818a2ceb6b7adf55031ae58f06be236b0b66: Status 404 returned error can't find the container with id 0521f2c683147e43a2de06c0ba71818a2ceb6b7adf55031ae58f06be236b0b66 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.062119 4971 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-server" containerID="cri-o://1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.062581 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="swift-recon-cron" containerID="cri-o://991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.062650 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="rsync" containerID="cri-o://208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.062691 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-expirer" containerID="cri-o://1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.062746 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-updater" containerID="cri-o://956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.062784 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-auditor" containerID="cri-o://86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: 
I0320 07:15:34.063039 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-replicator" containerID="cri-o://606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.063120 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-replicator" containerID="cri-o://5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.063155 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-server" containerID="cri-o://c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.063185 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-updater" containerID="cri-o://6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.063214 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-auditor" containerID="cri-o://3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.063261 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-auditor" 
containerID="cri-o://0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.063296 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-server" containerID="cri-o://f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.063327 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-reaper" containerID="cri-o://d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.063364 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-replicator" containerID="cri-o://7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.077718 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-bl874"] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.087597 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:34 crc kubenswrapper[4971]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a 
DatabasePassword variable."} Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: if [ -n "placement" ]; then Mar 20 07:15:34 crc kubenswrapper[4971]: GRANT_DATABASE="placement" Mar 20 07:15:34 crc kubenswrapper[4971]: else Mar 20 07:15:34 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:34 crc kubenswrapper[4971]: fi Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:34 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:34 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:34 crc kubenswrapper[4971]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:34 crc kubenswrapper[4971]: # support updates Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.088877 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-08bb-account-create-update-tqhvf" podUID="19961c12-d700-4796-a618-4e625c12649b" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.106872 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-bl874"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.157784 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jtmqm"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.256204 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_9e53ff8e-9df7-411c-b2ad-e4c32c0209c6/ovsdbserver-sb/0.log" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.256249 4971 generic.go:334] "Generic (PLEG): container finished" podID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerID="46bf584ea3433e9ac46772bd8db9e28130ad0da039469060e5e16331e01c70aa" exitCode=2 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.256267 4971 generic.go:334] "Generic (PLEG): container finished" podID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerID="68ce7b00a908c9710adb361927cb192e2b39212d020dcbf9508b46cd83c87f9b" exitCode=143 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.257451 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6","Type":"ContainerDied","Data":"46bf584ea3433e9ac46772bd8db9e28130ad0da039469060e5e16331e01c70aa"} Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.257483 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6","Type":"ContainerDied","Data":"68ce7b00a908c9710adb361927cb192e2b39212d020dcbf9508b46cd83c87f9b"} Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.262156 4971 generic.go:334] "Generic (PLEG): container finished" podID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" containerID="3a2e05f37831e00f2f4d153e39ac7c8308e64ddecc8706e5d6c45b52312a4203" exitCode=0 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.261430 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jtmqm"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.263146 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" event={"ID":"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67","Type":"ContainerDied","Data":"3a2e05f37831e00f2f4d153e39ac7c8308e64ddecc8706e5d6c45b52312a4203"} Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 
07:15:34.285870 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-88l8f_852c9ecc-4ab1-4142-8a63-e73c265c3866/openstack-network-exporter/0.log" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.285928 4971 generic.go:334] "Generic (PLEG): container finished" podID="852c9ecc-4ab1-4142-8a63-e73c265c3866" containerID="cc1977222a2687951e3071cbfe9eecf3bbf33195805469e1592c465e2505d12a" exitCode=2 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.286448 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-88l8f" event={"ID":"852c9ecc-4ab1-4142-8a63-e73c265c3866","Type":"ContainerDied","Data":"cc1977222a2687951e3071cbfe9eecf3bbf33195805469e1592c465e2505d12a"} Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.287034 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.287986 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data podName:1023ed0c-3abe-4cff-987f-52544b885696 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:36.287959896 +0000 UTC m=+1558.267834054 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data") pod "rabbitmq-cell1-server-0" (UID: "1023ed0c-3abe-4cff-987f-52544b885696") : configmap "rabbitmq-cell1-config-data" not found Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.302576 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-z7rjt"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.312648 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-t4xw9" event={"ID":"c3c07d77-e97d-4251-a86d-c0c67c52be23","Type":"ContainerStarted","Data":"1821c6ace33350bd6b530bdfa4038e0f769b05d128e199f9d4a4865651eb3c01"} Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.313365 4971 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-t4xw9" secret="" err="secret \"galera-openstack-cell1-dockercfg-llftp\" not found" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.327024 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-z7rjt"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.331845 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63654bb6-f2be-42d2-bb83-0b96bb2ca8cb/ovsdbserver-nb/0.log" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.331887 4971 generic.go:334] "Generic (PLEG): container finished" podID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerID="2055269c243f22fc79e97f8f94001b54b54ef95b9b6b344a99916b8ea073e623" exitCode=2 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.331903 4971 generic.go:334] "Generic (PLEG): container finished" podID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerID="808a7c9f249b97193158020c6f0533d2ec2b303417e0c3080125fc1cc2867777" exitCode=143 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.331949 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb","Type":"ContainerDied","Data":"2055269c243f22fc79e97f8f94001b54b54ef95b9b6b344a99916b8ea073e623"} Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.331974 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb","Type":"ContainerDied","Data":"808a7c9f249b97193158020c6f0533d2ec2b303417e0c3080125fc1cc2867777"} Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.341085 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-t4xw9"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.347990 4971 generic.go:334] "Generic (PLEG): container finished" podID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerID="c5035eee8b5df80277aacaae4c435bb925d2c563411e00ccd53787c9f32033d9" exitCode=143 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.348075 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa332536-984c-4942-b1fb-1ffde7eb465d","Type":"ContainerDied","Data":"c5035eee8b5df80277aacaae4c435bb925d2c563411e00ccd53787c9f32033d9"} Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.388099 4971 generic.go:334] "Generic (PLEG): container finished" podID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerID="87f82d49936752bb6c84997b496c4f49bd0051001c3fe0ed7aff5c22606bc913" exitCode=143 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.389060 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c37921-0f32-4a99-bf40-e960c400a9ac","Type":"ContainerDied","Data":"87f82d49936752bb6c84997b496c4f49bd0051001c3fe0ed7aff5c22606bc913"} Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.390522 4971 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap 
"openstack-cell1-scripts" not found Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.390919 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts podName:c3c07d77-e97d-4251-a86d-c0c67c52be23 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:34.890899338 +0000 UTC m=+1556.870773466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts") pod "root-account-create-update-t4xw9" (UID: "c3c07d77-e97d-4251-a86d-c0c67c52be23") : configmap "openstack-cell1-scripts" not found Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.391809 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0b9a-account-create-update-66sh8"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.402741 4971 generic.go:334] "Generic (PLEG): container finished" podID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerID="a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119" exitCode=143 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.402822 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff6b454b-v87kx" event={"ID":"e641b3d5-ee18-41b6-b38c-098564591a1b","Type":"ContainerDied","Data":"a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119"} Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.427289 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08bb-account-create-update-tqhvf" event={"ID":"19961c12-d700-4796-a618-4e625c12649b","Type":"ContainerStarted","Data":"0521f2c683147e43a2de06c0ba71818a2ceb6b7adf55031ae58f06be236b0b66"} Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.429233 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:34 crc kubenswrapper[4971]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: if [ -n "" ]; then Mar 20 07:15:34 crc kubenswrapper[4971]: GRANT_DATABASE="" Mar 20 07:15:34 crc kubenswrapper[4971]: else Mar 20 07:15:34 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:34 crc kubenswrapper[4971]: fi Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:34 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:34 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:34 crc kubenswrapper[4971]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:34 crc kubenswrapper[4971]: # support updates Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.430665 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-t4xw9" podUID="c3c07d77-e97d-4251-a86d-c0c67c52be23" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.468168 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-v9s7n"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.478407 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0b9a-account-create-update-66sh8"] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.488774 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ca1eeff736f7dc8000796f67df497ff16c309981beedb9789a0e4a334c2f180" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.489659 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:34 crc kubenswrapper[4971]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: export 
DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: if [ -n "placement" ]; then Mar 20 07:15:34 crc kubenswrapper[4971]: GRANT_DATABASE="placement" Mar 20 07:15:34 crc kubenswrapper[4971]: else Mar 20 07:15:34 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:34 crc kubenswrapper[4971]: fi Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:34 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:34 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:34 crc kubenswrapper[4971]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:34 crc kubenswrapper[4971]: # support updates Mar 20 07:15:34 crc kubenswrapper[4971]: Mar 20 07:15:34 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.489889 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-v9s7n"] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.490954 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-08bb-account-create-update-tqhvf" podUID="19961c12-d700-4796-a618-4e625c12649b" Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.493560 4971 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.493630 4971 projected.go:263] Couldn't get secret 
openstack/swift-conf: secret "swift-conf" not found Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.493644 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.493658 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.493695 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:36.493682386 +0000 UTC m=+1558.473556524 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.505820 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ca1eeff736f7dc8000796f67df497ff16c309981beedb9789a0e4a334c2f180" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.507750 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4da9-account-create-update-297t8"] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.518594 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7c8d202_039a_426e_b5e5_916bb257e441.slice/crio-conmon-c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-conmon-6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-conmon-3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-conmon-1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-conmon-5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode641b3d5_ee18_41b6_b38c_098564591a1b.slice/crio-a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode641b3d5_ee18_41b6_b38c_098564591a1b.slice/crio-conmon-a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08b125a7_da7e_4ab1_b594_2f45f0b5c529.slice/crio-8862358b81fede9115c50c47e167a72f0bd5aec1ea665447f7122a28b9e9ceea.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0608b4_f344_45db_a952_d6bc328083a2.slice/crio-3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca.scope\": RecentStats: unable to find data in memory cache]" Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.521339 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ca1eeff736f7dc8000796f67df497ff16c309981beedb9789a0e4a334c2f180" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.521398 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="ovn-northd" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.589849 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d45446b77-6x8tp"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.590071 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d45446b77-6x8tp" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-api" containerID="cri-o://35447db00a76cd29e230251c5953d04cbe80e78a51340cb69f31c155161f5b6a" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.590474 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d45446b77-6x8tp" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-httpd" containerID="cri-o://de8268f31ad44f7855e9d376812885c5bc010017a21fab351f990ce733d64a51" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.609193 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4da9-account-create-update-297t8"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.613244 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.621256 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-08bb-account-create-update-tqhvf"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.638227 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rtj87"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.651818 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rtj87"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.671688 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-08bb-account-create-update-tqhvf"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.680663 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.687089 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-x6b4m"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.696250 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-swift-storage-0\") pod \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.696370 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-config\") pod \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.697019 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-sb\") pod \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.697324 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-nb\") pod \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.697848 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws2zp\" (UniqueName: \"kubernetes.io/projected/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-kube-api-access-ws2zp\") pod \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.697940 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-svc\") pod \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\" (UID: \"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67\") " Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.698820 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2836-account-create-update-zn4bs"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.706814 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-x6b4m"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.707653 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-kube-api-access-ws2zp" (OuterVolumeSpecName: "kube-api-access-ws2zp") pod "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" (UID: "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67"). InnerVolumeSpecName "kube-api-access-ws2zp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.725574 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-f5ndp"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.745323 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1023ed0c-3abe-4cff-987f-52544b885696" containerName="rabbitmq" containerID="cri-o://51b4cb77e4da22feec9a5960ca52b1a8182d5f4678ea7a93350ed77aaf3336c3" gracePeriod=604800 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.751028 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06926146-e70b-4ee7-bf24-d629cb17a509" path="/var/lib/kubelet/pods/06926146-e70b-4ee7-bf24-d629cb17a509/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.753285 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2bca3a-2949-4c33-8716-28a6e38b84ef" path="/var/lib/kubelet/pods/1f2bca3a-2949-4c33-8716-28a6e38b84ef/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.754687 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d83042-4484-4d62-8417-1e7fcebc4220" path="/var/lib/kubelet/pods/30d83042-4484-4d62-8417-1e7fcebc4220/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.768218 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35492989-bd44-45b4-85a7-1acc9569270e" path="/var/lib/kubelet/pods/35492989-bd44-45b4-85a7-1acc9569270e/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.772481 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed6dac6-e0d0-403d-8e7d-6a5d921379fb" path="/var/lib/kubelet/pods/3ed6dac6-e0d0-403d-8e7d-6a5d921379fb/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.776124 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c" 
path="/var/lib/kubelet/pods/4b457fd8-aec5-4cfc-bbfd-a78eaf993f1c/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.783814 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" containerID="cri-o://68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" gracePeriod=29 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.789347 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72464f1f-f227-4801-8ad0-6a81aaba7081" path="/var/lib/kubelet/pods/72464f1f-f227-4801-8ad0-6a81aaba7081/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.790303 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3565c3-ef3e-4d98-9bca-211a538032c9" path="/var/lib/kubelet/pods/7c3565c3-ef3e-4d98-9bca-211a538032c9/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.791293 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff3bb59-445d-44fd-927b-50bff286c5bf" path="/var/lib/kubelet/pods/7ff3bb59-445d-44fd-927b-50bff286c5bf/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.792333 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f7dd3f-532f-4e24-bf20-99aa46f1fdda" path="/var/lib/kubelet/pods/89f7dd3f-532f-4e24-bf20-99aa46f1fdda/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.793250 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94267126-5e6b-4b8f-9b54-425d893afb6b" path="/var/lib/kubelet/pods/94267126-5e6b-4b8f-9b54-425d893afb6b/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.793886 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e77b03c-9fbb-4f84-a85a-fd7771301436" path="/var/lib/kubelet/pods/9e77b03c-9fbb-4f84-a85a-fd7771301436/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.794931 4971 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d13e0d-08bb-495b-abfc-6edcc8f01890" path="/var/lib/kubelet/pods/a4d13e0d-08bb-495b-abfc-6edcc8f01890/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.795917 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2" path="/var/lib/kubelet/pods/a73473dc-dcc6-4c79-b04a-f6b3a7a3b7e2/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.796992 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ae2ef8-20ef-4f86-958e-212c18ae1701" path="/var/lib/kubelet/pods/b1ae2ef8-20ef-4f86-958e-212c18ae1701/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.797550 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b44d8fdd-a462-4146-9a5e-6b84864cd490" path="/var/lib/kubelet/pods/b44d8fdd-a462-4146-9a5e-6b84864cd490/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.798944 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027" path="/var/lib/kubelet/pods/e0e7c0cc-cee7-4e9b-8c9c-53e1c83f8027/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.799604 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe4aafd-3ffe-406f-a0dc-faebf5eeddda" path="/var/lib/kubelet/pods/ebe4aafd-3ffe-406f-a0dc-faebf5eeddda/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.800880 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe3475f-214d-4782-b998-a579e40723dc" path="/var/lib/kubelet/pods/ffe3475f-214d-4782-b998-a579e40723dc/volumes" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.805576 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws2zp\" (UniqueName: \"kubernetes.io/projected/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-kube-api-access-ws2zp\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:34 crc 
kubenswrapper[4971]: I0320 07:15:34.806723 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-f5ndp"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.806755 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-60a3-account-create-update-ck92h"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.820015 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.820261 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-log" containerID="cri-o://b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.820717 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-api" containerID="cri-o://ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.840875 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.841261 4971 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 20 07:15:34 crc kubenswrapper[4971]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 20 07:15:34 crc kubenswrapper[4971]: + source /usr/local/bin/container-scripts/functions Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNBridge=br-int Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNRemote=tcp:localhost:6642 Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNEncapType=geneve Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNAvailabilityZones= Mar 20 
07:15:34 crc kubenswrapper[4971]: ++ EnableChassisAsGateway=true Mar 20 07:15:34 crc kubenswrapper[4971]: ++ PhysicalNetworks= Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNHostName= Mar 20 07:15:34 crc kubenswrapper[4971]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 20 07:15:34 crc kubenswrapper[4971]: ++ ovs_dir=/var/lib/openvswitch Mar 20 07:15:34 crc kubenswrapper[4971]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 20 07:15:34 crc kubenswrapper[4971]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 20 07:15:34 crc kubenswrapper[4971]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + sleep 0.5 Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + sleep 0.5 Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + cleanup_ovsdb_server_semaphore Mar 20 07:15:34 crc kubenswrapper[4971]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 07:15:34 crc kubenswrapper[4971]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 20 07:15:34 crc kubenswrapper[4971]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-njjzt" message=< Mar 20 07:15:34 crc kubenswrapper[4971]: Exiting ovsdb-server (5) [ OK ] Mar 20 07:15:34 crc kubenswrapper[4971]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 20 07:15:34 crc kubenswrapper[4971]: + source /usr/local/bin/container-scripts/functions Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNBridge=br-int Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNRemote=tcp:localhost:6642 Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNEncapType=geneve Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNAvailabilityZones= Mar 20 07:15:34 crc kubenswrapper[4971]: ++ EnableChassisAsGateway=true Mar 20 07:15:34 crc kubenswrapper[4971]: ++ PhysicalNetworks= Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNHostName= Mar 20 07:15:34 crc kubenswrapper[4971]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 20 07:15:34 crc kubenswrapper[4971]: ++ ovs_dir=/var/lib/openvswitch Mar 20 07:15:34 crc kubenswrapper[4971]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 20 07:15:34 crc kubenswrapper[4971]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 20 07:15:34 crc kubenswrapper[4971]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + sleep 0.5 Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + sleep 0.5 Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + cleanup_ovsdb_server_semaphore Mar 20 07:15:34 crc kubenswrapper[4971]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 07:15:34 crc kubenswrapper[4971]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 20 07:15:34 crc kubenswrapper[4971]: > Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.841303 4971 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 20 07:15:34 crc kubenswrapper[4971]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 20 07:15:34 crc kubenswrapper[4971]: + source /usr/local/bin/container-scripts/functions Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNBridge=br-int Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNRemote=tcp:localhost:6642 Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNEncapType=geneve Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNAvailabilityZones= Mar 20 07:15:34 crc kubenswrapper[4971]: ++ EnableChassisAsGateway=true Mar 20 07:15:34 crc kubenswrapper[4971]: ++ PhysicalNetworks= Mar 20 07:15:34 crc kubenswrapper[4971]: ++ OVNHostName= Mar 20 07:15:34 crc kubenswrapper[4971]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 20 07:15:34 crc kubenswrapper[4971]: ++ ovs_dir=/var/lib/openvswitch Mar 20 07:15:34 crc kubenswrapper[4971]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 20 07:15:34 crc kubenswrapper[4971]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 20 07:15:34 crc kubenswrapper[4971]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + sleep 0.5 Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + sleep 0.5 Mar 20 07:15:34 crc kubenswrapper[4971]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 07:15:34 crc kubenswrapper[4971]: + cleanup_ovsdb_server_semaphore Mar 20 07:15:34 crc kubenswrapper[4971]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 07:15:34 crc kubenswrapper[4971]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 20 07:15:34 crc kubenswrapper[4971]: > pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" containerID="cri-o://0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.841342 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" containerID="cri-o://0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" gracePeriod=29 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.862795 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-nv8gl"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.870916 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-nv8gl"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.872426 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" (UID: "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.872468 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" (UID: "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.886682 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zpvmx"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.899153 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zpvmx"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.902791 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.903061 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-log" containerID="cri-o://a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.903486 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-metadata" containerID="cri-o://e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.908373 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-30f2-account-create-update-p2jpv"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.909413 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.909439 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.909494 4971 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 20 07:15:34 crc kubenswrapper[4971]: E0320 07:15:34.909535 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts podName:c3c07d77-e97d-4251-a86d-c0c67c52be23 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:35.909523268 +0000 UTC m=+1557.889397406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts") pod "root-account-create-update-t4xw9" (UID: "c3c07d77-e97d-4251-a86d-c0c67c52be23") : configmap "openstack-cell1-scripts" not found Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.917138 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9csfr"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.931090 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9e53ff8e-9df7-411c-b2ad-e4c32c0209c6/ovsdbserver-sb/0.log" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.931098 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-9csfr"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.931169 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.943928 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-88l8f_852c9ecc-4ab1-4142-8a63-e73c265c3866/openstack-network-exporter/0.log" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.944130 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.966485 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" (UID: "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.966589 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5999d797d7-b2qlx"] Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.966959 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5999d797d7-b2qlx" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" containerName="barbican-worker-log" containerID="cri-o://bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.967138 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5999d797d7-b2qlx" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" containerName="barbican-worker" containerID="cri-o://4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c" gracePeriod=30 Mar 20 07:15:34 crc kubenswrapper[4971]: I0320 07:15:34.992931 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-config" (OuterVolumeSpecName: "config") pod "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" (UID: "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.011289 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdbserver-sb-tls-certs\") pod \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.011452 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdb-rundir\") pod \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.011590 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-metrics-certs-tls-certs\") pod \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.011692 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.011772 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovs-rundir\") pod \"852c9ecc-4ab1-4142-8a63-e73c265c3866\" 
(UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.011862 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/852c9ecc-4ab1-4142-8a63-e73c265c3866-config\") pod \"852c9ecc-4ab1-4142-8a63-e73c265c3866\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.012016 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-combined-ca-bundle\") pod \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.012152 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovn-rundir\") pod \"852c9ecc-4ab1-4142-8a63-e73c265c3866\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.012220 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mxwg\" (UniqueName: \"kubernetes.io/projected/852c9ecc-4ab1-4142-8a63-e73c265c3866-kube-api-access-8mxwg\") pod \"852c9ecc-4ab1-4142-8a63-e73c265c3866\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.012330 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-combined-ca-bundle\") pod \"852c9ecc-4ab1-4142-8a63-e73c265c3866\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.012400 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-metrics-certs-tls-certs\") pod \"852c9ecc-4ab1-4142-8a63-e73c265c3866\" (UID: \"852c9ecc-4ab1-4142-8a63-e73c265c3866\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.012475 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw7bj\" (UniqueName: \"kubernetes.io/projected/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-kube-api-access-cw7bj\") pod \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.013740 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-scripts\") pod \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.013849 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-config\") pod \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\" (UID: \"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.014012 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "852c9ecc-4ab1-4142-8a63-e73c265c3866" (UID: "852c9ecc-4ab1-4142-8a63-e73c265c3866"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.014700 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-scripts" (OuterVolumeSpecName: "scripts") pod "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" (UID: "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.014928 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-config" (OuterVolumeSpecName: "config") pod "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" (UID: "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.016874 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.017054 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.017079 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.017094 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.017106 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.017116 4971 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.017183 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data podName:a71f8d2d-729c-4b7c-89c7-a06bd2216978 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:37.017166972 +0000 UTC m=+1558.997041110 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data") pod "rabbitmq-server-0" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978") : configmap "rabbitmq-config-data" not found Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.017847 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" (UID: "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.018163 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "852c9ecc-4ab1-4142-8a63-e73c265c3866" (UID: "852c9ecc-4ab1-4142-8a63-e73c265c3866"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.018205 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" (UID: "2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.021732 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" (UID: "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.023673 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/852c9ecc-4ab1-4142-8a63-e73c265c3866-config" (OuterVolumeSpecName: "config") pod "852c9ecc-4ab1-4142-8a63-e73c265c3866" (UID: "852c9ecc-4ab1-4142-8a63-e73c265c3866"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.031766 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852c9ecc-4ab1-4142-8a63-e73c265c3866-kube-api-access-8mxwg" (OuterVolumeSpecName: "kube-api-access-8mxwg") pod "852c9ecc-4ab1-4142-8a63-e73c265c3866" (UID: "852c9ecc-4ab1-4142-8a63-e73c265c3866"). InnerVolumeSpecName "kube-api-access-8mxwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.034045 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c864cc4fb-bwl6r"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.034755 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c864cc4fb-bwl6r" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" containerName="barbican-api" containerID="cri-o://f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.035232 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c864cc4fb-bwl6r" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" containerName="barbican-api-log" containerID="cri-o://f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.041814 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63654bb6-f2be-42d2-bb83-0b96bb2ca8cb/ovsdbserver-nb/0.log" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.041882 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.044705 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-594fd7bbdb-tf55d"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.044915 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerName="barbican-keystone-listener-log" containerID="cri-o://6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.045040 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerName="barbican-keystone-listener" containerID="cri-o://65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.054487 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-kube-api-access-cw7bj" (OuterVolumeSpecName: "kube-api-access-cw7bj") pod "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" (UID: "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6"). InnerVolumeSpecName "kube-api-access-cw7bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.058644 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.058859 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6b4a1357-9313-4ebb-bb14-2aaf2785e17d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.069912 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.085313 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-t4xw9"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.116355 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128236 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdvj9\" (UniqueName: \"kubernetes.io/projected/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-kube-api-access-bdvj9\") pod \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128308 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdb-rundir\") pod \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128363 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-combined-ca-bundle\") pod \"b7c8d202-039a-426e-b5e5-916bb257e441\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128394 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-scripts\") pod \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128423 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-combined-ca-bundle\") pod \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128443 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128495 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdbserver-nb-tls-certs\") pod \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128519 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config-secret\") pod \"b7c8d202-039a-426e-b5e5-916bb257e441\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 
07:15:35.128563 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q7vl\" (UniqueName: \"kubernetes.io/projected/b7c8d202-039a-426e-b5e5-916bb257e441-kube-api-access-6q7vl\") pod \"b7c8d202-039a-426e-b5e5-916bb257e441\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128645 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config\") pod \"b7c8d202-039a-426e-b5e5-916bb257e441\" (UID: \"b7c8d202-039a-426e-b5e5-916bb257e441\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128699 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-config\") pod \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.128752 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-metrics-certs-tls-certs\") pod \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\" (UID: \"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb\") " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.129441 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.129470 4971 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/852c9ecc-4ab1-4142-8a63-e73c265c3866-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.129484 4971 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/852c9ecc-4ab1-4142-8a63-e73c265c3866-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.129495 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mxwg\" (UniqueName: \"kubernetes.io/projected/852c9ecc-4ab1-4142-8a63-e73c265c3866-kube-api-access-8mxwg\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.129507 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw7bj\" (UniqueName: \"kubernetes.io/projected/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-kube-api-access-cw7bj\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.129519 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.129530 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.130853 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-config" (OuterVolumeSpecName: "config") pod "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" (UID: "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.134793 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" (UID: "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.145959 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-scripts" (OuterVolumeSpecName: "scripts") pod "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" (UID: "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.148027 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.148286 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" containerName="nova-scheduler-scheduler" containerID="cri-o://9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.153058 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7c8d202-039a-426e-b5e5-916bb257e441-kube-api-access-6q7vl" (OuterVolumeSpecName: "kube-api-access-6q7vl") pod "b7c8d202-039a-426e-b5e5-916bb257e441" (UID: "b7c8d202-039a-426e-b5e5-916bb257e441"). InnerVolumeSpecName "kube-api-access-6q7vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.162067 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-kube-api-access-bdvj9" (OuterVolumeSpecName: "kube-api-access-bdvj9") pod "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" (UID: "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb"). InnerVolumeSpecName "kube-api-access-bdvj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.178279 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1023ed0c-3abe-4cff-987f-52544b885696" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.197367 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" (UID: "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.205093 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" (UID: "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.219278 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "852c9ecc-4ab1-4142-8a63-e73c265c3866" (UID: "852c9ecc-4ab1-4142-8a63-e73c265c3866"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.231097 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.231130 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.231138 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdvj9\" (UniqueName: \"kubernetes.io/projected/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-kube-api-access-bdvj9\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.231147 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.231155 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.231164 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.231186 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.231195 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q7vl\" (UniqueName: \"kubernetes.io/projected/b7c8d202-039a-426e-b5e5-916bb257e441-kube-api-access-6q7vl\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.268920 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" containerName="rabbitmq" containerID="cri-o://7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac" gracePeriod=604800 Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.276784 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:35 crc kubenswrapper[4971]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: if [ -n "neutron" ]; then Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="neutron" Mar 20 
07:15:35 crc kubenswrapper[4971]: else Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:35 crc kubenswrapper[4971]: fi Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:35 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:35 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:35 crc kubenswrapper[4971]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:35 crc kubenswrapper[4971]: # support updates Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.277407 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:35 crc kubenswrapper[4971]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: if [ -n "barbican" ]; then Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="barbican" Mar 20 07:15:35 crc kubenswrapper[4971]: else Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:35 crc kubenswrapper[4971]: fi Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc 
kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:35 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:35 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:35 crc kubenswrapper[4971]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:35 crc kubenswrapper[4971]: # support updates Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.279094 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-2836-account-create-update-zn4bs" podUID="54576591-c271-46a6-8b8d-8973dbe4bb8e" Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.279137 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-60a3-account-create-update-ck92h" podUID="36683fcd-086d-4202-8e1d-5ae325ea95b1" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.299772 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7c8d202-039a-426e-b5e5-916bb257e441" (UID: "b7c8d202-039a-426e-b5e5-916bb257e441"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.308666 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5jkzb"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.315167 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "852c9ecc-4ab1-4142-8a63-e73c265c3866" (UID: "852c9ecc-4ab1-4142-8a63-e73c265c3866"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.332565 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c9ecc-4ab1-4142-8a63-e73c265c3866-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.332598 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.334770 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.342862 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="5496b78b-549b-4076-a150-783bbba8896c" containerName="galera" containerID="cri-o://a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.359260 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b7c8d202-039a-426e-b5e5-916bb257e441" (UID: "b7c8d202-039a-426e-b5e5-916bb257e441"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.369666 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.369926 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="b9450f53-fcf0-4799-a8da-a9d0bc7016ac" containerName="nova-cell1-conductor-conductor" containerID="cri-o://615ced8f9e0feb5ebdac89eb7107d8990b6103b1111ede4cffc99ba5cd482835" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.372059 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5jkzb"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.379633 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6q9k"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.382351 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.386027 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.386644 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="ebe7358b-84ed-4e08-8c12-234928beef49" containerName="nova-cell0-conductor-conductor" containerID="cri-o://d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 
07:15:35.401049 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6q9k"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.404728 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" (UID: "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.412289 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-60a3-account-create-update-ck92h"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.429790 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2836-account-create-update-zn4bs"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.435890 4971 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.435921 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.435933 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.435944 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: 
I0320 07:15:35.446187 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-30f2-account-create-update-p2jpv"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.457625 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-dff7bb765-jsscs"] Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.457848 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-dff7bb765-jsscs" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerName="proxy-httpd" containerID="cri-o://5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.458214 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-dff7bb765-jsscs" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerName="proxy-server" containerID="cri-o://f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70" gracePeriod=30 Mar 20 07:15:35 crc kubenswrapper[4971]: W0320 07:15:35.471108 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76584044_a0af_481c_961f_1c5351cb8f06.slice/crio-857a14976a0ce54d583fdf28890134b9224ef8a4d553f86c2d29ac9f93c53151 WatchSource:0}: Error finding container 857a14976a0ce54d583fdf28890134b9224ef8a4d553f86c2d29ac9f93c53151: Status 404 returned error can't find the container with id 857a14976a0ce54d583fdf28890134b9224ef8a4d553f86c2d29ac9f93c53151 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.488713 4971 generic.go:334] "Generic (PLEG): container finished" podID="a83b09bb-e1ad-4b10-9b14-538998197621" containerID="de8268f31ad44f7855e9d376812885c5bc010017a21fab351f990ce733d64a51" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.488751 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:35 crc kubenswrapper[4971]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: if [ -n "nova_api" ]; then Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="nova_api" Mar 20 07:15:35 crc kubenswrapper[4971]: else Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:35 crc kubenswrapper[4971]: fi Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:35 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:35 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:35 crc kubenswrapper[4971]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:35 crc kubenswrapper[4971]: # support updates Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.488777 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d45446b77-6x8tp" event={"ID":"a83b09bb-e1ad-4b10-9b14-538998197621","Type":"ContainerDied","Data":"de8268f31ad44f7855e9d376812885c5bc010017a21fab351f990ce733d64a51"} Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.490198 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-30f2-account-create-update-p2jpv" podUID="76584044-a0af-481c-961f-1c5351cb8f06" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.503444 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" (UID: "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.503805 4971 generic.go:334] "Generic (PLEG): container finished" podID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerID="8862358b81fede9115c50c47e167a72f0bd5aec1ea665447f7122a28b9e9ceea" exitCode=143 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.503908 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08b125a7-da7e-4ab1-b594-2f45f0b5c529","Type":"ContainerDied","Data":"8862358b81fede9115c50c47e167a72f0bd5aec1ea665447f7122a28b9e9ceea"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.537914 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b7c8d202-039a-426e-b5e5-916bb257e441" (UID: "b7c8d202-039a-426e-b5e5-916bb257e441"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.539127 4971 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7c8d202-039a-426e-b5e5-916bb257e441-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.539158 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.544769 4971 generic.go:334] "Generic (PLEG): container finished" podID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerID="6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507" exitCode=143 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.544845 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" event={"ID":"6149d926-b8a4-4d76-998c-0f721a1a2ef1","Type":"ContainerDied","Data":"6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.562217 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63654bb6-f2be-42d2-bb83-0b96bb2ca8cb/ovsdbserver-nb/0.log" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.562359 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.563043 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63654bb6-f2be-42d2-bb83-0b96bb2ca8cb","Type":"ContainerDied","Data":"1bf54ed1e41b5d8cdfd46ea34b84caf9dd6b6efb0a8ec24da65c995e729fc21b"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.563120 4971 scope.go:117] "RemoveContainer" containerID="2055269c243f22fc79e97f8f94001b54b54ef95b9b6b344a99916b8ea073e623" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.576123 4971 generic.go:334] "Generic (PLEG): container finished" podID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerID="31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.576188 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2656777c-a362-4e3e-92aa-98a8f9d2cf34","Type":"ContainerDied","Data":"31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.578608 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" (UID: "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.583980 4971 generic.go:334] "Generic (PLEG): container finished" podID="e98717ba-3768-4817-a29a-09114e106105" containerID="b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903" exitCode=143 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.584098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e98717ba-3768-4817-a29a-09114e106105","Type":"ContainerDied","Data":"b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.598808 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" (UID: "63654bb6-f2be-42d2-bb83-0b96bb2ca8cb"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634883 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634914 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634922 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634928 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634934 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634940 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634951 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634957 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634963 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634969 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634975 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634981 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634987 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.634993 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635031 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a"} Mar 20 07:15:35 crc 
kubenswrapper[4971]: I0320 07:15:35.635057 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635066 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635076 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635085 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635093 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635101 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635109 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635117 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635125 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635136 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635144 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635152 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.635160 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 
07:15:35.641047 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.641069 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.641273 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.641664 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-msxs5" event={"ID":"2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67","Type":"ContainerDied","Data":"ef210e03fb6f197477310288719c131cb6f093ba40f8bc1095118cc1597a4905"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.644772 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" (UID: "9e53ff8e-9df7-411c-b2ad-e4c32c0209c6"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.654898 4971 scope.go:117] "RemoveContainer" containerID="808a7c9f249b97193158020c6f0533d2ec2b303417e0c3080125fc1cc2867777" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.655007 4971 generic.go:334] "Generic (PLEG): container finished" podID="4290efff-c110-4085-b0a7-3e402507b5a0" containerID="bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f" exitCode=143 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.655111 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5999d797d7-b2qlx" event={"ID":"4290efff-c110-4085-b0a7-3e402507b5a0","Type":"ContainerDied","Data":"bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.661562 4971 generic.go:334] "Generic (PLEG): container finished" podID="2bb32512-db70-4db8-a867-0bda48f318eb" containerID="f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5" exitCode=143 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.661659 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c864cc4fb-bwl6r" event={"ID":"2bb32512-db70-4db8-a867-0bda48f318eb","Type":"ContainerDied","Data":"f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.671340 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-88l8f_852c9ecc-4ab1-4142-8a63-e73c265c3866/openstack-network-exporter/0.log" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.671410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-88l8f" event={"ID":"852c9ecc-4ab1-4142-8a63-e73c265c3866","Type":"ContainerDied","Data":"7ed0122d8f7e0f61eb9b8a0727605e88c6a7ab618c703c44156d996640ea8b91"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.671505 4971 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-88l8f" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.679261 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2836-account-create-update-zn4bs" event={"ID":"54576591-c271-46a6-8b8d-8973dbe4bb8e","Type":"ContainerStarted","Data":"ec33de4ceddee512f0753287a721953507555f52f681f0c38ff9437080be4736"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.686078 4971 generic.go:334] "Generic (PLEG): container finished" podID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" exitCode=0 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.686148 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-njjzt" event={"ID":"cabe7ab2-f272-4434-8295-641c334a0ae2","Type":"ContainerDied","Data":"0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.688665 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9e53ff8e-9df7-411c-b2ad-e4c32c0209c6/ovsdbserver-sb/0.log" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.688719 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9e53ff8e-9df7-411c-b2ad-e4c32c0209c6","Type":"ContainerDied","Data":"0c1148305d758a96d6060d3bb931b807ce2bfaf17d241d6f072f2ed34c0128f5"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.688803 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.703167 4971 generic.go:334] "Generic (PLEG): container finished" podID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerID="a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e" exitCode=143 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.703206 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a","Type":"ContainerDied","Data":"a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.707278 4971 generic.go:334] "Generic (PLEG): container finished" podID="b7c8d202-039a-426e-b5e5-916bb257e441" containerID="c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df" exitCode=137 Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.707410 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.711181 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-60a3-account-create-update-ck92h" event={"ID":"36683fcd-086d-4202-8e1d-5ae325ea95b1","Type":"ContainerStarted","Data":"467334f5f668702bb1aea6848ac62e14624b6df0629e2e3d010440990f1dd70d"} Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.711568 4971 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-t4xw9" secret="" err="secret \"galera-openstack-cell1-dockercfg-llftp\" not found" Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.743604 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.750997 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:35 crc kubenswrapper[4971]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: if [ -n "placement" ]; then Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="placement" Mar 20 07:15:35 crc kubenswrapper[4971]: else Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:35 crc kubenswrapper[4971]: fi Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:35 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:35 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:35 crc kubenswrapper[4971]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:35 crc kubenswrapper[4971]: # support updates Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.751559 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:35 crc kubenswrapper[4971]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: if [ -n "" ]; then Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="" Mar 20 07:15:35 crc kubenswrapper[4971]: else Mar 20 07:15:35 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:35 crc kubenswrapper[4971]: fi Mar 20 07:15:35 crc kubenswrapper[4971]: Mar 20 07:15:35 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:35 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:35 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:35 crc kubenswrapper[4971]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to
Mar 20 07:15:35 crc kubenswrapper[4971]: # support updates
Mar 20 07:15:35 crc kubenswrapper[4971]:
Mar 20 07:15:35 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError"
Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.752599 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-t4xw9" podUID="c3c07d77-e97d-4251-a86d-c0c67c52be23"
Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.752656 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-08bb-account-create-update-tqhvf" podUID="19961c12-d700-4796-a618-4e625c12649b"
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.838879 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-88l8f"]
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.855567 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-88l8f"]
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.864441 4971 scope.go:117] "RemoveContainer" containerID="3a2e05f37831e00f2f4d153e39ac7c8308e64ddecc8706e5d6c45b52312a4203"
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.873118 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-msxs5"]
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.883365 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-msxs5"]
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.895769 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.911182 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.917212 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.931403 4971 scope.go:117] "RemoveContainer" containerID="4f2a5fe64a55f2870001f34a68956aa3f071975c91e61557fe4c9df8fb524220"
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.938887 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.951813 4971 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 20 07:15:35 crc kubenswrapper[4971]: E0320 07:15:35.951905 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts podName:c3c07d77-e97d-4251-a86d-c0c67c52be23 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:37.951888264 +0000 UTC m=+1559.931762402 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts") pod "root-account-create-update-t4xw9" (UID: "c3c07d77-e97d-4251-a86d-c0c67c52be23") : configmap "openstack-cell1-scripts" not found
Mar 20 07:15:35 crc kubenswrapper[4971]: I0320 07:15:35.966377 4971 scope.go:117] "RemoveContainer" containerID="cc1977222a2687951e3071cbfe9eecf3bbf33195805469e1592c465e2505d12a"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.002525 4971 scope.go:117] "RemoveContainer" containerID="46bf584ea3433e9ac46772bd8db9e28130ad0da039469060e5e16331e01c70aa"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.020011 4971 scope.go:117] "RemoveContainer" containerID="68ce7b00a908c9710adb361927cb192e2b39212d020dcbf9508b46cd83c87f9b"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.032551 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2836-account-create-update-zn4bs"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.048010 4971 scope.go:117] "RemoveContainer" containerID="c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.068588 4971 scope.go:117] "RemoveContainer" containerID="c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df"
Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.069167 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df\": container with ID starting with c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df not found: ID does not exist" containerID="c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.069207 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df"} err="failed to get container status \"c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df\": rpc error: code = NotFound desc = could not find container \"c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df\": container with ID starting with c498a7de47947355c2dc10307b9c3cbd5aee665f320179ba74333dccaba346df not found: ID does not exist"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.155395 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsdtk\" (UniqueName: \"kubernetes.io/projected/54576591-c271-46a6-8b8d-8973dbe4bb8e-kube-api-access-dsdtk\") pod \"54576591-c271-46a6-8b8d-8973dbe4bb8e\" (UID: \"54576591-c271-46a6-8b8d-8973dbe4bb8e\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.155541 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54576591-c271-46a6-8b8d-8973dbe4bb8e-operator-scripts\") pod \"54576591-c271-46a6-8b8d-8973dbe4bb8e\" (UID: \"54576591-c271-46a6-8b8d-8973dbe4bb8e\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.162185 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54576591-c271-46a6-8b8d-8973dbe4bb8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54576591-c271-46a6-8b8d-8973dbe4bb8e" (UID: "54576591-c271-46a6-8b8d-8973dbe4bb8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.195167 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54576591-c271-46a6-8b8d-8973dbe4bb8e-kube-api-access-dsdtk" (OuterVolumeSpecName: "kube-api-access-dsdtk") pod "54576591-c271-46a6-8b8d-8973dbe4bb8e" (UID: "54576591-c271-46a6-8b8d-8973dbe4bb8e"). InnerVolumeSpecName "kube-api-access-dsdtk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.258900 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsdtk\" (UniqueName: \"kubernetes.io/projected/54576591-c271-46a6-8b8d-8973dbe4bb8e-kube-api-access-dsdtk\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.258933 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54576591-c271-46a6-8b8d-8973dbe4bb8e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.362789 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.362845 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data podName:1023ed0c-3abe-4cff-987f-52544b885696 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:40.362832631 +0000 UTC m=+1562.342706769 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data") pod "rabbitmq-cell1-server-0" (UID: "1023ed0c-3abe-4cff-987f-52544b885696") : configmap "rabbitmq-cell1-config-data" not found
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.408767 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.430041 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.454236 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5999d797d7-b2qlx"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.454387 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a3-account-create-update-ck92h"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.464997 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-combined-ca-bundle\") pod \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465085 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6149d926-b8a4-4d76-998c-0f721a1a2ef1-logs\") pod \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465140 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-combined-ca-bundle\") pod \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465236 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data\") pod \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465266 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data-custom\") pod \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465360 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-config-data\") pod \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465386 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrx75\" (UniqueName: \"kubernetes.io/projected/6149d926-b8a4-4d76-998c-0f721a1a2ef1-kube-api-access-wrx75\") pod \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\" (UID: \"6149d926-b8a4-4d76-998c-0f721a1a2ef1\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465406 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7h5z\" (UniqueName: \"kubernetes.io/projected/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-kube-api-access-l7h5z\") pod \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465442 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-nova-novncproxy-tls-certs\") pod \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.465499 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-vencrypt-tls-certs\") pod \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\" (UID: \"6b4a1357-9313-4ebb-bb14-2aaf2785e17d\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.467396 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6149d926-b8a4-4d76-998c-0f721a1a2ef1-logs" (OuterVolumeSpecName: "logs") pod "6149d926-b8a4-4d76-998c-0f721a1a2ef1" (UID: "6149d926-b8a4-4d76-998c-0f721a1a2ef1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.489903 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6149d926-b8a4-4d76-998c-0f721a1a2ef1-kube-api-access-wrx75" (OuterVolumeSpecName: "kube-api-access-wrx75") pod "6149d926-b8a4-4d76-998c-0f721a1a2ef1" (UID: "6149d926-b8a4-4d76-998c-0f721a1a2ef1"). InnerVolumeSpecName "kube-api-access-wrx75". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.491642 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6149d926-b8a4-4d76-998c-0f721a1a2ef1" (UID: "6149d926-b8a4-4d76-998c-0f721a1a2ef1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.494876 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-kube-api-access-l7h5z" (OuterVolumeSpecName: "kube-api-access-l7h5z") pod "6b4a1357-9313-4ebb-bb14-2aaf2785e17d" (UID: "6b4a1357-9313-4ebb-bb14-2aaf2785e17d"). InnerVolumeSpecName "kube-api-access-l7h5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.537960 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6149d926-b8a4-4d76-998c-0f721a1a2ef1" (UID: "6149d926-b8a4-4d76-998c-0f721a1a2ef1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.548208 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b4a1357-9313-4ebb-bb14-2aaf2785e17d" (UID: "6b4a1357-9313-4ebb-bb14-2aaf2785e17d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.577597 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7kn5\" (UniqueName: \"kubernetes.io/projected/4290efff-c110-4085-b0a7-3e402507b5a0-kube-api-access-q7kn5\") pod \"4290efff-c110-4085-b0a7-3e402507b5a0\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.578153 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4290efff-c110-4085-b0a7-3e402507b5a0-logs\") pod \"4290efff-c110-4085-b0a7-3e402507b5a0\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.578189 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-combined-ca-bundle\") pod \"4290efff-c110-4085-b0a7-3e402507b5a0\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.578949 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data\") pod \"4290efff-c110-4085-b0a7-3e402507b5a0\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.579258 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36683fcd-086d-4202-8e1d-5ae325ea95b1-operator-scripts\") pod \"36683fcd-086d-4202-8e1d-5ae325ea95b1\" (UID: \"36683fcd-086d-4202-8e1d-5ae325ea95b1\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.579744 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvrcb\" (UniqueName: \"kubernetes.io/projected/36683fcd-086d-4202-8e1d-5ae325ea95b1-kube-api-access-gvrcb\") pod \"36683fcd-086d-4202-8e1d-5ae325ea95b1\" (UID: \"36683fcd-086d-4202-8e1d-5ae325ea95b1\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.579764 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data-custom\") pod \"4290efff-c110-4085-b0a7-3e402507b5a0\" (UID: \"4290efff-c110-4085-b0a7-3e402507b5a0\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.581480 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.581496 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrx75\" (UniqueName: \"kubernetes.io/projected/6149d926-b8a4-4d76-998c-0f721a1a2ef1-kube-api-access-wrx75\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.581507 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7h5z\" (UniqueName: \"kubernetes.io/projected/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-kube-api-access-l7h5z\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.581516 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.582243 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36683fcd-086d-4202-8e1d-5ae325ea95b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36683fcd-086d-4202-8e1d-5ae325ea95b1" (UID: "36683fcd-086d-4202-8e1d-5ae325ea95b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.582284 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4290efff-c110-4085-b0a7-3e402507b5a0-kube-api-access-q7kn5" (OuterVolumeSpecName: "kube-api-access-q7kn5") pod "4290efff-c110-4085-b0a7-3e402507b5a0" (UID: "4290efff-c110-4085-b0a7-3e402507b5a0"). InnerVolumeSpecName "kube-api-access-q7kn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.582471 4971 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found
Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.582486 4971 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found
Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.582494 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.582505 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found]
Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.582549 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:40.582532973 +0000 UTC m=+1562.562407111 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found]
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.583551 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6149d926-b8a4-4d76-998c-0f721a1a2ef1-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.583591 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.588752 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36683fcd-086d-4202-8e1d-5ae325ea95b1-kube-api-access-gvrcb" (OuterVolumeSpecName: "kube-api-access-gvrcb") pod "36683fcd-086d-4202-8e1d-5ae325ea95b1" (UID: "36683fcd-086d-4202-8e1d-5ae325ea95b1"). InnerVolumeSpecName "kube-api-access-gvrcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.589222 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4290efff-c110-4085-b0a7-3e402507b5a0-logs" (OuterVolumeSpecName: "logs") pod "4290efff-c110-4085-b0a7-3e402507b5a0" (UID: "4290efff-c110-4085-b0a7-3e402507b5a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.591041 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data" (OuterVolumeSpecName: "config-data") pod "6149d926-b8a4-4d76-998c-0f721a1a2ef1" (UID: "6149d926-b8a4-4d76-998c-0f721a1a2ef1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.598278 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-config-data" (OuterVolumeSpecName: "config-data") pod "6b4a1357-9313-4ebb-bb14-2aaf2785e17d" (UID: "6b4a1357-9313-4ebb-bb14-2aaf2785e17d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.603112 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4290efff-c110-4085-b0a7-3e402507b5a0" (UID: "4290efff-c110-4085-b0a7-3e402507b5a0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.626898 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.643610 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "6b4a1357-9313-4ebb-bb14-2aaf2785e17d" (UID: "6b4a1357-9313-4ebb-bb14-2aaf2785e17d"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.647674 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4290efff-c110-4085-b0a7-3e402507b5a0" (UID: "4290efff-c110-4085-b0a7-3e402507b5a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.663549 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data" (OuterVolumeSpecName: "config-data") pod "4290efff-c110-4085-b0a7-3e402507b5a0" (UID: "4290efff-c110-4085-b0a7-3e402507b5a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.664125 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "6b4a1357-9313-4ebb-bb14-2aaf2785e17d" (UID: "6b4a1357-9313-4ebb-bb14-2aaf2785e17d"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.686232 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-kolla-config\") pod \"5496b78b-549b-4076-a150-783bbba8896c\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.686345 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-operator-scripts\") pod \"5496b78b-549b-4076-a150-783bbba8896c\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.686416 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-config-data-default\") pod \"5496b78b-549b-4076-a150-783bbba8896c\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.686444 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5496b78b-549b-4076-a150-783bbba8896c-config-data-generated\") pod \"5496b78b-549b-4076-a150-783bbba8896c\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.686477 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"5496b78b-549b-4076-a150-783bbba8896c\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.686516 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-galera-tls-certs\") pod \"5496b78b-549b-4076-a150-783bbba8896c\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.686542 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4kt9\" (UniqueName: \"kubernetes.io/projected/5496b78b-549b-4076-a150-783bbba8896c-kube-api-access-c4kt9\") pod \"5496b78b-549b-4076-a150-783bbba8896c\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.687197 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5496b78b-549b-4076-a150-783bbba8896c" (UID: "5496b78b-549b-4076-a150-783bbba8896c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.687425 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "5496b78b-549b-4076-a150-783bbba8896c" (UID: "5496b78b-549b-4076-a150-783bbba8896c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.688039 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5496b78b-549b-4076-a150-783bbba8896c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "5496b78b-549b-4076-a150-783bbba8896c" (UID: "5496b78b-549b-4076-a150-783bbba8896c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.688265 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-combined-ca-bundle\") pod \"5496b78b-549b-4076-a150-783bbba8896c\" (UID: \"5496b78b-549b-4076-a150-783bbba8896c\") "
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690470 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690487 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-config-data-default\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690497 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5496b78b-549b-4076-a150-783bbba8896c-config-data-generated\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690506 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690514 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36683fcd-086d-4202-8e1d-5ae325ea95b1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690523 4971 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690531 4971 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4a1357-9313-4ebb-bb14-2aaf2785e17d-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690539 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvrcb\" (UniqueName: \"kubernetes.io/projected/36683fcd-086d-4202-8e1d-5ae325ea95b1-kube-api-access-gvrcb\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690548 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690556 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7kn5\" (UniqueName: \"kubernetes.io/projected/4290efff-c110-4085-b0a7-3e402507b5a0-kube-api-access-q7kn5\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690563 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4290efff-c110-4085-b0a7-3e402507b5a0-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690571 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4290efff-c110-4085-b0a7-3e402507b5a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690579 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6149d926-b8a4-4d76-998c-0f721a1a2ef1-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.690586 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.692455 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5496b78b-549b-4076-a150-783bbba8896c" (UID: "5496b78b-549b-4076-a150-783bbba8896c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.697055 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5496b78b-549b-4076-a150-783bbba8896c-kube-api-access-c4kt9" (OuterVolumeSpecName: "kube-api-access-c4kt9") pod "5496b78b-549b-4076-a150-783bbba8896c" (UID: "5496b78b-549b-4076-a150-783bbba8896c"). InnerVolumeSpecName "kube-api-access-c4kt9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.719953 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "5496b78b-549b-4076-a150-783bbba8896c" (UID: "5496b78b-549b-4076-a150-783bbba8896c"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.731632 4971 generic.go:334] "Generic (PLEG): container finished" podID="4290efff-c110-4085-b0a7-3e402507b5a0" containerID="4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c" exitCode=0
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.731792 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5999d797d7-b2qlx"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.735647 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2836-account-create-update-zn4bs"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.739505 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-dff7bb765-jsscs"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.739672 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5496b78b-549b-4076-a150-783bbba8896c" (UID: "5496b78b-549b-4076-a150-783bbba8896c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.740084 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a3-account-create-update-ck92h"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.743337 4971 generic.go:334] "Generic (PLEG): container finished" podID="5496b78b-549b-4076-a150-783bbba8896c" containerID="a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3" exitCode=0
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.743419 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.760293 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb2907c-f83e-4870-bde2-cff480ce9c78" path="/var/lib/kubelet/pods/1bb2907c-f83e-4870-bde2-cff480ce9c78/volumes"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.760838 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be4310d-009b-4cca-8e17-2d392cad914b" path="/var/lib/kubelet/pods/1be4310d-009b-4cca-8e17-2d392cad914b/volumes"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.761924 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" path="/var/lib/kubelet/pods/2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67/volumes"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.763304 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" path="/var/lib/kubelet/pods/63654bb6-f2be-42d2-bb83-0b96bb2ca8cb/volumes"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.763883 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af1289c-26d4-4865-946f-891c126beb49" path="/var/lib/kubelet/pods/6af1289c-26d4-4865-946f-891c126beb49/volumes"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.766240 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852c9ecc-4ab1-4142-8a63-e73c265c3866" path="/var/lib/kubelet/pods/852c9ecc-4ab1-4142-8a63-e73c265c3866/volumes"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.770036 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" path="/var/lib/kubelet/pods/9e53ff8e-9df7-411c-b2ad-e4c32c0209c6/volumes"
Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.770801 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d7a078-cf37-45b3-9370-2df67437a53d"
path="/var/lib/kubelet/pods/a1d7a078-cf37-45b3-9370-2df67437a53d/volumes" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.771930 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7c8d202-039a-426e-b5e5-916bb257e441" path="/var/lib/kubelet/pods/b7c8d202-039a-426e-b5e5-916bb257e441/volumes" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.772522 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3dec926-79a5-4d96-917c-380ac85e7e38" path="/var/lib/kubelet/pods/e3dec926-79a5-4d96-917c-380ac85e7e38/volumes" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.772600 4971 generic.go:334] "Generic (PLEG): container finished" podID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerID="f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70" exitCode=0 Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.772709 4971 generic.go:334] "Generic (PLEG): container finished" podID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerID="5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8" exitCode=0 Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.772912 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-dff7bb765-jsscs" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.773295 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c80ffe-c152-4f0f-99f4-38f67d491021" path="/var/lib/kubelet/pods/f4c80ffe-c152-4f0f-99f4-38f67d491021/volumes" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.782014 4971 generic.go:334] "Generic (PLEG): container finished" podID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerID="65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791" exitCode=0 Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.782133 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784197 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5999d797d7-b2qlx" event={"ID":"4290efff-c110-4085-b0a7-3e402507b5a0","Type":"ContainerDied","Data":"4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784233 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5999d797d7-b2qlx" event={"ID":"4290efff-c110-4085-b0a7-3e402507b5a0","Type":"ContainerDied","Data":"dfabb6a3263186a5587477a3179ed3db85ef57db198ebbab24ffc2b8e3d1827d"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784245 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2836-account-create-update-zn4bs" event={"ID":"54576591-c271-46a6-8b8d-8973dbe4bb8e","Type":"ContainerDied","Data":"ec33de4ceddee512f0753287a721953507555f52f681f0c38ff9437080be4736"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784259 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-60a3-account-create-update-ck92h" event={"ID":"36683fcd-086d-4202-8e1d-5ae325ea95b1","Type":"ContainerDied","Data":"467334f5f668702bb1aea6848ac62e14624b6df0629e2e3d010440990f1dd70d"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784271 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5496b78b-549b-4076-a150-783bbba8896c","Type":"ContainerDied","Data":"a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784282 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5496b78b-549b-4076-a150-783bbba8896c","Type":"ContainerDied","Data":"d51fa962826579e9a55882cb0ee5e743b6bbd77c8460e5e932f99fce6197d99e"} Mar 20 07:15:36 
crc kubenswrapper[4971]: I0320 07:15:36.784291 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dff7bb765-jsscs" event={"ID":"57f3be84-ad73-4f92-9f75-b5e864f20d65","Type":"ContainerDied","Data":"f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784302 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-dff7bb765-jsscs" event={"ID":"57f3be84-ad73-4f92-9f75-b5e864f20d65","Type":"ContainerDied","Data":"5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784315 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" event={"ID":"6149d926-b8a4-4d76-998c-0f721a1a2ef1","Type":"ContainerDied","Data":"65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784328 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-594fd7bbdb-tf55d" event={"ID":"6149d926-b8a4-4d76-998c-0f721a1a2ef1","Type":"ContainerDied","Data":"058b77e26cfc29aa04870b5b56a750d441b3a92f8e1fc2c0662d7141df4a9ea7"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.784348 4971 scope.go:117] "RemoveContainer" containerID="4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.793482 4971 generic.go:334] "Generic (PLEG): container finished" podID="6b4a1357-9313-4ebb-bb14-2aaf2785e17d" containerID="a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250" exitCode=0 Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.793566 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"6b4a1357-9313-4ebb-bb14-2aaf2785e17d","Type":"ContainerDied","Data":"a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.793592 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b4a1357-9313-4ebb-bb14-2aaf2785e17d","Type":"ContainerDied","Data":"56e147ccdd24b108236de6f3992c7d0b4c6a7f524b5f17d126edce8490e0c2ce"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.794548 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-config-data\") pod \"57f3be84-ad73-4f92-9f75-b5e864f20d65\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.794632 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.794687 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-public-tls-certs\") pod \"57f3be84-ad73-4f92-9f75-b5e864f20d65\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.794755 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-etc-swift\") pod \"57f3be84-ad73-4f92-9f75-b5e864f20d65\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.794815 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-run-httpd\") pod \"57f3be84-ad73-4f92-9f75-b5e864f20d65\" (UID: 
\"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.795017 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-combined-ca-bundle\") pod \"57f3be84-ad73-4f92-9f75-b5e864f20d65\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.795098 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-internal-tls-certs\") pod \"57f3be84-ad73-4f92-9f75-b5e864f20d65\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.795171 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-log-httpd\") pod \"57f3be84-ad73-4f92-9f75-b5e864f20d65\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.795251 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd7bn\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-kube-api-access-hd7bn\") pod \"57f3be84-ad73-4f92-9f75-b5e864f20d65\" (UID: \"57f3be84-ad73-4f92-9f75-b5e864f20d65\") " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.795932 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.795953 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4kt9\" (UniqueName: \"kubernetes.io/projected/5496b78b-549b-4076-a150-783bbba8896c-kube-api-access-c4kt9\") on node \"crc\" 
DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.795978 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.795987 4971 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5496b78b-549b-4076-a150-783bbba8896c-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.803795 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "57f3be84-ad73-4f92-9f75-b5e864f20d65" (UID: "57f3be84-ad73-4f92-9f75-b5e864f20d65"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.803865 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "57f3be84-ad73-4f92-9f75-b5e864f20d65" (UID: "57f3be84-ad73-4f92-9f75-b5e864f20d65"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.804009 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30f2-account-create-update-p2jpv" event={"ID":"76584044-a0af-481c-961f-1c5351cb8f06","Type":"ContainerStarted","Data":"857a14976a0ce54d583fdf28890134b9224ef8a4d553f86c2d29ac9f93c53151"} Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.806062 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "5496b78b-549b-4076-a150-783bbba8896c" (UID: "5496b78b-549b-4076-a150-783bbba8896c"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.819807 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "57f3be84-ad73-4f92-9f75-b5e864f20d65" (UID: "57f3be84-ad73-4f92-9f75-b5e864f20d65"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.823740 4971 scope.go:117] "RemoveContainer" containerID="bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.833221 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.836759 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5999d797d7-b2qlx"] Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.847114 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5999d797d7-b2qlx"] Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.856128 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-kube-api-access-hd7bn" (OuterVolumeSpecName: "kube-api-access-hd7bn") pod "57f3be84-ad73-4f92-9f75-b5e864f20d65" (UID: "57f3be84-ad73-4f92-9f75-b5e864f20d65"). InnerVolumeSpecName "kube-api-access-hd7bn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.891390 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2836-account-create-update-zn4bs"] Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.898675 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.898739 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd7bn\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-kube-api-access-hd7bn\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.898756 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.898796 4971 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5496b78b-549b-4076-a150-783bbba8896c-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.898808 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57f3be84-ad73-4f92-9f75-b5e864f20d65-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.898815 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57f3be84-ad73-4f92-9f75-b5e864f20d65-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.919179 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2836-account-create-update-zn4bs"] Mar 20 07:15:36 crc 
kubenswrapper[4971]: I0320 07:15:36.934827 4971 scope.go:117] "RemoveContainer" containerID="4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.937548 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c\": container with ID starting with 4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c not found: ID does not exist" containerID="4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.937624 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c"} err="failed to get container status \"4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c\": rpc error: code = NotFound desc = could not find container \"4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c\": container with ID starting with 4179f3a4ba269961a73dda3345aa28072df0582733516d6dc42d2ab7a7ead18c not found: ID does not exist" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.937661 4971 scope.go:117] "RemoveContainer" containerID="bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.938162 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f\": container with ID starting with bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f not found: ID does not exist" containerID="bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.938192 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f"} err="failed to get container status \"bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f\": rpc error: code = NotFound desc = could not find container \"bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f\": container with ID starting with bb8445172815ef07a9dbeabf96a663864ac0f9efc3e35bfd9f481b62fdf29f8f not found: ID does not exist" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.938209 4971 scope.go:117] "RemoveContainer" containerID="a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.974803 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-config-data" (OuterVolumeSpecName: "config-data") pod "57f3be84-ad73-4f92-9f75-b5e864f20d65" (UID: "57f3be84-ad73-4f92-9f75-b5e864f20d65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.978694 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "57f3be84-ad73-4f92-9f75-b5e864f20d65" (UID: "57f3be84-ad73-4f92-9f75-b5e864f20d65"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.978767 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-4k7j6"] Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979150 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerName="proxy-httpd" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979167 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerName="proxy-httpd" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979178 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" containerName="barbican-worker-log" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979185 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" containerName="barbican-worker-log" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979198 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852c9ecc-4ab1-4142-8a63-e73c265c3866" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979203 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="852c9ecc-4ab1-4142-8a63-e73c265c3866" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979210 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5496b78b-549b-4076-a150-783bbba8896c" containerName="mysql-bootstrap" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979216 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5496b78b-549b-4076-a150-783bbba8896c" containerName="mysql-bootstrap" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979224 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5496b78b-549b-4076-a150-783bbba8896c" containerName="galera" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979229 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5496b78b-549b-4076-a150-783bbba8896c" containerName="galera" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979239 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerName="barbican-keystone-listener-log" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979245 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerName="barbican-keystone-listener-log" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979258 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4a1357-9313-4ebb-bb14-2aaf2785e17d" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979264 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4a1357-9313-4ebb-bb14-2aaf2785e17d" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979278 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979286 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979299 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" containerName="barbican-worker" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979306 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" containerName="barbican-worker" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979315 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerName="ovsdbserver-sb" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979321 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerName="ovsdbserver-sb" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979328 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerName="ovsdbserver-nb" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979334 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerName="ovsdbserver-nb" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979347 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerName="proxy-server" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979352 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerName="proxy-server" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979372 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979378 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979385 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" containerName="init" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979390 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" containerName="init" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979399 4971 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" containerName="dnsmasq-dns" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979404 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" containerName="dnsmasq-dns" Mar 20 07:15:36 crc kubenswrapper[4971]: E0320 07:15:36.979411 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerName="barbican-keystone-listener" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979416 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerName="barbican-keystone-listener" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979562 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="852c9ecc-4ab1-4142-8a63-e73c265c3866" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979571 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979583 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerName="barbican-keystone-listener" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979591 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" containerName="barbican-worker-log" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979603 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="63654bb6-f2be-42d2-bb83-0b96bb2ca8cb" containerName="ovsdbserver-nb" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979629 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerName="proxy-httpd" Mar 20 07:15:36 crc 
kubenswrapper[4971]: I0320 07:15:36.979639 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerName="ovsdbserver-sb" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979647 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5496b78b-549b-4076-a150-783bbba8896c" containerName="galera" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979660 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" containerName="barbican-keystone-listener-log" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979671 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3feaa7-1a6e-4fd7-bf31-b6a18d52ac67" containerName="dnsmasq-dns" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979680 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b4a1357-9313-4ebb-bb14-2aaf2785e17d" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979690 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e53ff8e-9df7-411c-b2ad-e4c32c0209c6" containerName="openstack-network-exporter" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979699 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" containerName="barbican-worker" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.979708 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" containerName="proxy-server" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.980287 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.984172 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.984950 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:15:36 crc kubenswrapper[4971]: I0320 07:15:36.989844 4971 scope.go:117] "RemoveContainer" containerID="2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.000425 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.000453 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.002137 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.012090 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4k7j6"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.025093 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-594fd7bbdb-tf55d"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.025274 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "57f3be84-ad73-4f92-9f75-b5e864f20d65" (UID: "57f3be84-ad73-4f92-9f75-b5e864f20d65"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.031502 4971 scope.go:117] "RemoveContainer" containerID="a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.032274 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3\": container with ID starting with a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3 not found: ID does not exist" containerID="a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.032308 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3"} err="failed to get container status \"a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3\": rpc error: code = NotFound desc = could not find container \"a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3\": container with ID starting with a7c2133041da20b5fced03a202d5fff42ac4b65bc46dd7533ac03796bf0ed5e3 not found: ID does not exist" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.032339 4971 scope.go:117] "RemoveContainer" containerID="2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.033123 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c\": container with ID starting with 2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c not found: ID does not exist" containerID="2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c" Mar 20 07:15:37 crc 
kubenswrapper[4971]: I0320 07:15:37.033168 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c"} err="failed to get container status \"2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c\": rpc error: code = NotFound desc = could not find container \"2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c\": container with ID starting with 2bd77aab0599e42f7a587075c023a48f183e1c5575676c74bbb696d29d33375c not found: ID does not exist" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.033194 4971 scope.go:117] "RemoveContainer" containerID="f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.036800 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57f3be84-ad73-4f92-9f75-b5e864f20d65" (UID: "57f3be84-ad73-4f92-9f75-b5e864f20d65"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.056859 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-594fd7bbdb-tf55d"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.095598 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-60a3-account-create-update-ck92h"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.106116 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dlm7\" (UniqueName: \"kubernetes.io/projected/b339d4de-8516-4dd0-9bc1-c7580a99c54a-kube-api-access-4dlm7\") pod \"root-account-create-update-4k7j6\" (UID: \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\") " pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.113061 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b339d4de-8516-4dd0-9bc1-c7580a99c54a-operator-scripts\") pod \"root-account-create-update-4k7j6\" (UID: \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\") " pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.113396 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.113458 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f3be84-ad73-4f92-9f75-b5e864f20d65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.113538 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 07:15:37 
crc kubenswrapper[4971]: E0320 07:15:37.113645 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data podName:a71f8d2d-729c-4b7c-89c7-a06bd2216978 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:41.113607348 +0000 UTC m=+1563.093496276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data") pod "rabbitmq-server-0" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978") : configmap "rabbitmq-config-data" not found Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.130501 4971 scope.go:117] "RemoveContainer" containerID="5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.143687 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-60a3-account-create-update-ck92h"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.170479 4971 scope.go:117] "RemoveContainer" containerID="f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.173727 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70\": container with ID starting with f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70 not found: ID does not exist" containerID="f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.173808 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70"} err="failed to get container status \"f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70\": rpc error: code = NotFound desc = could 
not find container \"f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70\": container with ID starting with f9a71029621987be9a7f2766db366b8b68d9770ba10954549da9931e1ec09d70 not found: ID does not exist" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.173843 4971 scope.go:117] "RemoveContainer" containerID="5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.175265 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8\": container with ID starting with 5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8 not found: ID does not exist" containerID="5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.175298 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8"} err="failed to get container status \"5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8\": rpc error: code = NotFound desc = could not find container \"5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8\": container with ID starting with 5087a1a61d2001d23a8c542b9b255700f604591da0b11b93ca51014d129c2be8 not found: ID does not exist" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.175319 4971 scope.go:117] "RemoveContainer" containerID="65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.214256 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.219206 4971 scope.go:117] "RemoveContainer" containerID="6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.226096 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dlm7\" (UniqueName: \"kubernetes.io/projected/b339d4de-8516-4dd0-9bc1-c7580a99c54a-kube-api-access-4dlm7\") pod \"root-account-create-update-4k7j6\" (UID: \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\") " pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.228253 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b339d4de-8516-4dd0-9bc1-c7580a99c54a-operator-scripts\") pod \"root-account-create-update-4k7j6\" (UID: \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\") " pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.230306 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b339d4de-8516-4dd0-9bc1-c7580a99c54a-operator-scripts\") pod \"root-account-create-update-4k7j6\" (UID: \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\") " pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.247421 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dlm7\" (UniqueName: \"kubernetes.io/projected/b339d4de-8516-4dd0-9bc1-c7580a99c54a-kube-api-access-4dlm7\") pod \"root-account-create-update-4k7j6\" (UID: \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\") " pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.258719 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/openstack-cell1-galera-0"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.270763 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.292862 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-dff7bb765-jsscs"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.306738 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.312232 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-dff7bb765-jsscs"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.330936 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb4m6\" (UniqueName: \"kubernetes.io/projected/76584044-a0af-481c-961f-1c5351cb8f06-kube-api-access-kb4m6\") pod \"76584044-a0af-481c-961f-1c5351cb8f06\" (UID: \"76584044-a0af-481c-961f-1c5351cb8f06\") " Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.331091 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76584044-a0af-481c-961f-1c5351cb8f06-operator-scripts\") pod \"76584044-a0af-481c-961f-1c5351cb8f06\" (UID: \"76584044-a0af-481c-961f-1c5351cb8f06\") " Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.332153 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76584044-a0af-481c-961f-1c5351cb8f06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76584044-a0af-481c-961f-1c5351cb8f06" (UID: "76584044-a0af-481c-961f-1c5351cb8f06"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.352307 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76584044-a0af-481c-961f-1c5351cb8f06-kube-api-access-kb4m6" (OuterVolumeSpecName: "kube-api-access-kb4m6") pod "76584044-a0af-481c-961f-1c5351cb8f06" (UID: "76584044-a0af-481c-961f-1c5351cb8f06"). InnerVolumeSpecName "kube-api-access-kb4m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.417719 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.418178 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="ceilometer-central-agent" containerID="cri-o://a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f" gracePeriod=30 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.419315 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="proxy-httpd" containerID="cri-o://ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74" gracePeriod=30 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.419384 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="sg-core" containerID="cri-o://267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4" gracePeriod=30 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.419435 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="ceilometer-notification-agent" 
containerID="cri-o://45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba" gracePeriod=30 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.437851 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb4m6\" (UniqueName: \"kubernetes.io/projected/76584044-a0af-481c-961f-1c5351cb8f06-kube-api-access-kb4m6\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.437870 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76584044-a0af-481c-961f-1c5351cb8f06-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.497136 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.497345 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b11d9f93-43db-45cf-9cba-75d0c9b50d55" containerName="kube-state-metrics" containerID="cri-o://3e232b29d2dc5299cf551b92f2d9493a62313586ff5a609644e971c3988a31be" gracePeriod=30 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.522423 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-t4xw9" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.582443 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d3ee-account-create-update-xpzjm"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.606695 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d3ee-account-create-update-xpzjm"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.643475 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts\") pod \"c3c07d77-e97d-4251-a86d-c0c67c52be23\" (UID: \"c3c07d77-e97d-4251-a86d-c0c67c52be23\") " Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.643691 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm9mw\" (UniqueName: \"kubernetes.io/projected/c3c07d77-e97d-4251-a86d-c0c67c52be23-kube-api-access-lm9mw\") pod \"c3c07d77-e97d-4251-a86d-c0c67c52be23\" (UID: \"c3c07d77-e97d-4251-a86d-c0c67c52be23\") " Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.644322 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3c07d77-e97d-4251-a86d-c0c67c52be23" (UID: "c3c07d77-e97d-4251-a86d-c0c67c52be23"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.644880 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3c07d77-e97d-4251-a86d-c0c67c52be23-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.664355 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c07d77-e97d-4251-a86d-c0c67c52be23-kube-api-access-lm9mw" (OuterVolumeSpecName: "kube-api-access-lm9mw") pod "c3c07d77-e97d-4251-a86d-c0c67c52be23" (UID: "c3c07d77-e97d-4251-a86d-c0c67c52be23"). InnerVolumeSpecName "kube-api-access-lm9mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.692668 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.692921 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" containerName="memcached" containerID="cri-o://881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd" gracePeriod=30 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.703832 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d3ee-account-create-update-8sbzg"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.704936 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.708390 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.718809 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d3ee-account-create-update-8sbzg"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.722834 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.173:8776/healthcheck\": read tcp 10.217.0.2:50882->10.217.0.173:8776: read: connection reset by peer" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.744802 4971 scope.go:117] "RemoveContainer" containerID="65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.745171 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791\": container with ID starting with 65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791 not found: ID does not exist" containerID="65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.745197 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791"} err="failed to get container status \"65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791\": rpc error: code = NotFound desc = could not find container \"65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791\": container with ID starting with 
65c1955e9ecffc618c51033c4ee3d4e91c853d79d00a2cf910425e2df03f5791 not found: ID does not exist" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.745217 4971 scope.go:117] "RemoveContainer" containerID="6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.746349 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts\") pod \"keystone-d3ee-account-create-update-8sbzg\" (UID: \"a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9\") " pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.746408 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9jjn\" (UniqueName: \"kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn\") pod \"keystone-d3ee-account-create-update-8sbzg\" (UID: \"a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9\") " pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.746493 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm9mw\" (UniqueName: \"kubernetes.io/projected/c3c07d77-e97d-4251-a86d-c0c67c52be23-kube-api-access-lm9mw\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.751536 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2wsw6"] Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.752986 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507\": container with ID starting with 6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507 not found: ID does not exist" 
containerID="6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.753122 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507"} err="failed to get container status \"6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507\": rpc error: code = NotFound desc = could not find container \"6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507\": container with ID starting with 6e2dd8821eb89876f7c7630305e669c0924a486cdb8b8e13430844295352c507 not found: ID does not exist" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.753215 4971 scope.go:117] "RemoveContainer" containerID="a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.760874 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.761294 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.761777 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.761860 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.763769 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.765662 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.768761 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rvpqx"] Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.770668 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.771281 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.783642 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2wsw6"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.794669 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rvpqx"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.800778 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5645d5b87f-2lrzm"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.800986 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5645d5b87f-2lrzm" podUID="cf071dc8-0146-4b1d-a644-02224870fcba" containerName="keystone-api" containerID="cri-o://41739891398c4430c4f83f4ce50b2d0a22faf0762c4dc2b95014e0a8cb9f4cb0" gracePeriod=30 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.814697 4971 scope.go:117] "RemoveContainer" containerID="a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.814769 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.818286 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.818395 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250\": container with ID starting with a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250 not found: ID does not exist" containerID="a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.818433 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250"} err="failed to get container status \"a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250\": rpc error: code = NotFound desc = could not find container \"a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250\": container with ID starting with a679a633dac3964e55615ddc5dd8c47b610179cc08b11c445d3bcebafc956250 not found: ID does not exist" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.832820 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8tvlb"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.838726 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pfhnr" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerName="ovn-controller" probeResult="failure" output=< Mar 20 07:15:37 crc kubenswrapper[4971]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Mar 20 07:15:37 crc kubenswrapper[4971]: > Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.840220 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8tvlb"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.848853 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts\") pod \"keystone-d3ee-account-create-update-8sbzg\" (UID: \"a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9\") " pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.848972 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9jjn\" (UniqueName: \"kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn\") pod \"keystone-d3ee-account-create-update-8sbzg\" (UID: \"a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9\") " pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.850645 4971 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.850690 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts podName:a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:38.350673889 +0000 UTC m=+1560.330548027 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts") pod "keystone-d3ee-account-create-update-8sbzg" (UID: "a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9") : configmap "openstack-scripts" not found Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.854446 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d3ee-account-create-update-8sbzg"] Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.855443 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-v9jjn operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-d3ee-account-create-update-8sbzg" podUID="a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9" Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.855856 4971 projected.go:194] Error preparing data for projected volume kube-api-access-v9jjn for pod openstack/keystone-d3ee-account-create-update-8sbzg: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 07:15:37 crc kubenswrapper[4971]: E0320 07:15:37.855888 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn podName:a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:38.355879464 +0000 UTC m=+1560.335753602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v9jjn" (UniqueName: "kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn") pod "keystone-d3ee-account-create-update-8sbzg" (UID: "a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.859529 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08bb-account-create-update-tqhvf" event={"ID":"19961c12-d700-4796-a618-4e625c12649b","Type":"ContainerDied","Data":"0521f2c683147e43a2de06c0ba71818a2ceb6b7adf55031ae58f06be236b0b66"} Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.859654 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08bb-account-create-update-tqhvf" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.869723 4971 generic.go:334] "Generic (PLEG): container finished" podID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerID="2d28427254b93eecd5e3f48d2294727ab000a8f924161e1e67d604b6de308a34" exitCode=0 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.869783 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08b125a7-da7e-4ab1-b594-2f45f0b5c529","Type":"ContainerDied","Data":"2d28427254b93eecd5e3f48d2294727ab000a8f924161e1e67d604b6de308a34"} Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.881701 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4k7j6"] Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.906375 4971 generic.go:334] "Generic (PLEG): container finished" podID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerID="8e933255e2bcc27abf971627e50a388be00c68eb771f36803f6707174021661b" exitCode=0 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.906450 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c7c37921-0f32-4a99-bf40-e960c400a9ac","Type":"ContainerDied","Data":"8e933255e2bcc27abf971627e50a388be00c68eb771f36803f6707174021661b"} Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.948352 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30f2-account-create-update-p2jpv" event={"ID":"76584044-a0af-481c-961f-1c5351cb8f06","Type":"ContainerDied","Data":"857a14976a0ce54d583fdf28890134b9224ef8a4d553f86c2d29ac9f93c53151"} Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.948437 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30f2-account-create-update-p2jpv" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.955670 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh5td\" (UniqueName: \"kubernetes.io/projected/19961c12-d700-4796-a618-4e625c12649b-kube-api-access-rh5td\") pod \"19961c12-d700-4796-a618-4e625c12649b\" (UID: \"19961c12-d700-4796-a618-4e625c12649b\") " Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.955712 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19961c12-d700-4796-a618-4e625c12649b-operator-scripts\") pod \"19961c12-d700-4796-a618-4e625c12649b\" (UID: \"19961c12-d700-4796-a618-4e625c12649b\") " Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.956545 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19961c12-d700-4796-a618-4e625c12649b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19961c12-d700-4796-a618-4e625c12649b" (UID: "19961c12-d700-4796-a618-4e625c12649b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.964121 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19961c12-d700-4796-a618-4e625c12649b-kube-api-access-rh5td" (OuterVolumeSpecName: "kube-api-access-rh5td") pod "19961c12-d700-4796-a618-4e625c12649b" (UID: "19961c12-d700-4796-a618-4e625c12649b"). InnerVolumeSpecName "kube-api-access-rh5td". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.968516 4971 generic.go:334] "Generic (PLEG): container finished" podID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerID="68ae6ce1a0d2429fc4431d1c48cbd5670a70a1125fc1820cb1e6abfe59df9ea9" exitCode=0 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.968584 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa332536-984c-4942-b1fb-1ffde7eb465d","Type":"ContainerDied","Data":"68ae6ce1a0d2429fc4431d1c48cbd5670a70a1125fc1820cb1e6abfe59df9ea9"} Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.971151 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-t4xw9" event={"ID":"c3c07d77-e97d-4251-a86d-c0c67c52be23","Type":"ContainerDied","Data":"1821c6ace33350bd6b530bdfa4038e0f769b05d128e199f9d4a4865651eb3c01"} Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.971241 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-t4xw9" Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.989789 4971 generic.go:334] "Generic (PLEG): container finished" podID="b11d9f93-43db-45cf-9cba-75d0c9b50d55" containerID="3e232b29d2dc5299cf551b92f2d9493a62313586ff5a609644e971c3988a31be" exitCode=2 Mar 20 07:15:37 crc kubenswrapper[4971]: I0320 07:15:37.989858 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b11d9f93-43db-45cf-9cba-75d0c9b50d55","Type":"ContainerDied","Data":"3e232b29d2dc5299cf551b92f2d9493a62313586ff5a609644e971c3988a31be"} Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.012565 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-30f2-account-create-update-p2jpv"] Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.016631 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-30f2-account-create-update-p2jpv"] Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.024899 4971 generic.go:334] "Generic (PLEG): container finished" podID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerID="267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4" exitCode=2 Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.024958 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerDied","Data":"267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4"} Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.059911 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh5td\" (UniqueName: \"kubernetes.io/projected/19961c12-d700-4796-a618-4e625c12649b-kube-api-access-rh5td\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.059941 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/19961c12-d700-4796-a618-4e625c12649b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.076804 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-t4xw9"] Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.103011 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-t4xw9"] Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.125495 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4k7j6"] Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.154081 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" containerName="galera" containerID="cri-o://1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a" gracePeriod=30 Mar 20 07:15:38 crc kubenswrapper[4971]: W0320 07:15:38.178675 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb339d4de_8516_4dd0_9bc1_c7580a99c54a.slice/crio-9632ec0a5d0a8185a3e649df407fe3cc41d698d178dd055b52f8edb327e2ac1b WatchSource:0}: Error finding container 9632ec0a5d0a8185a3e649df407fe3cc41d698d178dd055b52f8edb327e2ac1b: Status 404 returned error can't find the container with id 9632ec0a5d0a8185a3e649df407fe3cc41d698d178dd055b52f8edb327e2ac1b Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.193256 4971 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:15:38 crc kubenswrapper[4971]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 20 07:15:38 crc kubenswrapper[4971]: Mar 20 07:15:38 crc kubenswrapper[4971]: MYSQL_REMOTE_HOST="" source 
/var/lib/operator-scripts/mysql_root_auth.sh Mar 20 07:15:38 crc kubenswrapper[4971]: Mar 20 07:15:38 crc kubenswrapper[4971]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 07:15:38 crc kubenswrapper[4971]: Mar 20 07:15:38 crc kubenswrapper[4971]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 07:15:38 crc kubenswrapper[4971]: Mar 20 07:15:38 crc kubenswrapper[4971]: if [ -n "" ]; then Mar 20 07:15:38 crc kubenswrapper[4971]: GRANT_DATABASE="" Mar 20 07:15:38 crc kubenswrapper[4971]: else Mar 20 07:15:38 crc kubenswrapper[4971]: GRANT_DATABASE="*" Mar 20 07:15:38 crc kubenswrapper[4971]: fi Mar 20 07:15:38 crc kubenswrapper[4971]: Mar 20 07:15:38 crc kubenswrapper[4971]: # going for maximum compatibility here: Mar 20 07:15:38 crc kubenswrapper[4971]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 07:15:38 crc kubenswrapper[4971]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 07:15:38 crc kubenswrapper[4971]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 07:15:38 crc kubenswrapper[4971]: # support updates Mar 20 07:15:38 crc kubenswrapper[4971]: Mar 20 07:15:38 crc kubenswrapper[4971]: $MYSQL_CMD < logger="UnhandledError" Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.195753 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-4k7j6" podUID="b339d4de-8516-4dd0-9bc1-c7580a99c54a" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.237670 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-08bb-account-create-update-tqhvf"] Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.248805 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="615ced8f9e0feb5ebdac89eb7107d8990b6103b1111ede4cffc99ba5cd482835" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.256483 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="615ced8f9e0feb5ebdac89eb7107d8990b6103b1111ede4cffc99ba5cd482835" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.262712 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="615ced8f9e0feb5ebdac89eb7107d8990b6103b1111ede4cffc99ba5cd482835" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 
07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.262819 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="b9450f53-fcf0-4799-a8da-a9d0bc7016ac" containerName="nova-cell1-conductor-conductor" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.263968 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-08bb-account-create-update-tqhvf"] Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.366913 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9jjn\" (UniqueName: \"kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn\") pod \"keystone-d3ee-account-create-update-8sbzg\" (UID: \"a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9\") " pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.367075 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts\") pod \"keystone-d3ee-account-create-update-8sbzg\" (UID: \"a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9\") " pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.367173 4971 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.367220 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts podName:a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:39.367207075 +0000 UTC m=+1561.347081213 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts") pod "keystone-d3ee-account-create-update-8sbzg" (UID: "a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9") : configmap "openstack-scripts" not found Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.371556 4971 projected.go:194] Error preparing data for projected volume kube-api-access-v9jjn for pod openstack/keystone-d3ee-account-create-update-8sbzg: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 07:15:38 crc kubenswrapper[4971]: E0320 07:15:38.371690 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn podName:a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:39.371659011 +0000 UTC m=+1561.351533149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-v9jjn" (UniqueName: "kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn") pod "keystone-d3ee-account-create-update-8sbzg" (UID: "a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.623264 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.672994 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-combined-ca-bundle\") pod \"c7c37921-0f32-4a99-bf40-e960c400a9ac\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.673440 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-scripts\") pod \"c7c37921-0f32-4a99-bf40-e960c400a9ac\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.673530 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khpp7\" (UniqueName: \"kubernetes.io/projected/c7c37921-0f32-4a99-bf40-e960c400a9ac-kube-api-access-khpp7\") pod \"c7c37921-0f32-4a99-bf40-e960c400a9ac\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.682343 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.682367 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c37921-0f32-4a99-bf40-e960c400a9ac-kube-api-access-khpp7" (OuterVolumeSpecName: "kube-api-access-khpp7") pod "c7c37921-0f32-4a99-bf40-e960c400a9ac" (UID: "c7c37921-0f32-4a99-bf40-e960c400a9ac"). InnerVolumeSpecName "kube-api-access-khpp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.684584 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-httpd-run\") pod \"c7c37921-0f32-4a99-bf40-e960c400a9ac\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.684679 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-config-data\") pod \"c7c37921-0f32-4a99-bf40-e960c400a9ac\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.684713 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c7c37921-0f32-4a99-bf40-e960c400a9ac\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.684773 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-logs\") pod \"c7c37921-0f32-4a99-bf40-e960c400a9ac\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.684814 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-public-tls-certs\") pod \"c7c37921-0f32-4a99-bf40-e960c400a9ac\" (UID: \"c7c37921-0f32-4a99-bf40-e960c400a9ac\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.686309 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68ff6b454b-v87kx" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.689170 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khpp7\" (UniqueName: \"kubernetes.io/projected/c7c37921-0f32-4a99-bf40-e960c400a9ac-kube-api-access-khpp7\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.691730 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.694170 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-logs" (OuterVolumeSpecName: "logs") pod "c7c37921-0f32-4a99-bf40-e960c400a9ac" (UID: "c7c37921-0f32-4a99-bf40-e960c400a9ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.696218 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-scripts" (OuterVolumeSpecName: "scripts") pod "c7c37921-0f32-4a99-bf40-e960c400a9ac" (UID: "c7c37921-0f32-4a99-bf40-e960c400a9ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.696987 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.699819 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7c37921-0f32-4a99-bf40-e960c400a9ac" (UID: "c7c37921-0f32-4a99-bf40-e960c400a9ac"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.700211 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "c7c37921-0f32-4a99-bf40-e960c400a9ac" (UID: "c7c37921-0f32-4a99-bf40-e960c400a9ac"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.700774 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7c37921-0f32-4a99-bf40-e960c400a9ac" (UID: "c7c37921-0f32-4a99-bf40-e960c400a9ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.730671 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-config-data" (OuterVolumeSpecName: "config-data") pod "c7c37921-0f32-4a99-bf40-e960c400a9ac" (UID: "c7c37921-0f32-4a99-bf40-e960c400a9ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.765104 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b994b9-b9a4-49de-8faf-a3c7b64035dc" path="/var/lib/kubelet/pods/10b994b9-b9a4-49de-8faf-a3c7b64035dc/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.765830 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19961c12-d700-4796-a618-4e625c12649b" path="/var/lib/kubelet/pods/19961c12-d700-4796-a618-4e625c12649b/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.766212 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36683fcd-086d-4202-8e1d-5ae325ea95b1" path="/var/lib/kubelet/pods/36683fcd-086d-4202-8e1d-5ae325ea95b1/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.766588 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c644a9-8310-4775-bf77-55e4cf46a907" path="/var/lib/kubelet/pods/41c644a9-8310-4775-bf77-55e4cf46a907/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.767724 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4290efff-c110-4085-b0a7-3e402507b5a0" path="/var/lib/kubelet/pods/4290efff-c110-4085-b0a7-3e402507b5a0/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.768539 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.768674 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54576591-c271-46a6-8b8d-8973dbe4bb8e" path="/var/lib/kubelet/pods/54576591-c271-46a6-8b8d-8973dbe4bb8e/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.769375 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5496b78b-549b-4076-a150-783bbba8896c" path="/var/lib/kubelet/pods/5496b78b-549b-4076-a150-783bbba8896c/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.770704 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f3be84-ad73-4f92-9f75-b5e864f20d65" path="/var/lib/kubelet/pods/57f3be84-ad73-4f92-9f75-b5e864f20d65/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.771395 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6149d926-b8a4-4d76-998c-0f721a1a2ef1" path="/var/lib/kubelet/pods/6149d926-b8a4-4d76-998c-0f721a1a2ef1/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.771939 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4a1357-9313-4ebb-bb14-2aaf2785e17d" path="/var/lib/kubelet/pods/6b4a1357-9313-4ebb-bb14-2aaf2785e17d/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.773027 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76584044-a0af-481c-961f-1c5351cb8f06" path="/var/lib/kubelet/pods/76584044-a0af-481c-961f-1c5351cb8f06/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.773516 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2fa4dc2-d625-4071-b0bc-07099f6303ab" path="/var/lib/kubelet/pods/a2fa4dc2-d625-4071-b0bc-07099f6303ab/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.774987 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c07d77-e97d-4251-a86d-c0c67c52be23" 
path="/var/lib/kubelet/pods/c3c07d77-e97d-4251-a86d-c0c67c52be23/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.775359 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa1861f-a138-41b1-b5c4-3d2fa15611ea" path="/var/lib/kubelet/pods/daa1861f-a138-41b1-b5c4-3d2fa15611ea/volumes" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.778755 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.791337 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7c37921-0f32-4a99-bf40-e960c400a9ac" (UID: "c7c37921-0f32-4a99-bf40-e960c400a9ac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.793984 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-public-tls-certs\") pod \"e641b3d5-ee18-41b6-b38c-098564591a1b\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.794034 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-combined-ca-bundle\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.794062 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m7n7\" (UniqueName: \"kubernetes.io/projected/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-api-access-7m7n7\") pod \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\" (UID: 
\"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.794097 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e641b3d5-ee18-41b6-b38c-098564591a1b-logs\") pod \"e641b3d5-ee18-41b6-b38c-098564591a1b\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.794112 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-scripts\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.794132 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-combined-ca-bundle\") pod \"aa332536-984c-4942-b1fb-1ffde7eb465d\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.794154 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.794195 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-combined-ca-bundle\") pod \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.794222 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-scripts\") pod \"aa332536-984c-4942-b1fb-1ffde7eb465d\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795484 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-scripts\") pod \"e641b3d5-ee18-41b6-b38c-098564591a1b\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795535 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-httpd-run\") pod \"aa332536-984c-4942-b1fb-1ffde7eb465d\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795565 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-internal-tls-certs\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795587 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-certs\") pod \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795644 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-internal-tls-certs\") pod \"e641b3d5-ee18-41b6-b38c-098564591a1b\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 
07:15:38.795662 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b125a7-da7e-4ab1-b594-2f45f0b5c529-logs\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795702 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btsqw\" (UniqueName: \"kubernetes.io/projected/e641b3d5-ee18-41b6-b38c-098564591a1b-kube-api-access-btsqw\") pod \"e641b3d5-ee18-41b6-b38c-098564591a1b\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795718 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwzw5\" (UniqueName: \"kubernetes.io/projected/aa332536-984c-4942-b1fb-1ffde7eb465d-kube-api-access-xwzw5\") pod \"aa332536-984c-4942-b1fb-1ffde7eb465d\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795743 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-logs\") pod \"aa332536-984c-4942-b1fb-1ffde7eb465d\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795770 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-combined-ca-bundle\") pod \"e641b3d5-ee18-41b6-b38c-098564591a1b\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795789 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-config-data\") pod 
\"e641b3d5-ee18-41b6-b38c-098564591a1b\" (UID: \"e641b3d5-ee18-41b6-b38c-098564591a1b\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795812 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-public-tls-certs\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795829 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-config\") pod \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795850 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"aa332536-984c-4942-b1fb-1ffde7eb465d\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795865 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x6tw\" (UniqueName: \"kubernetes.io/projected/08b125a7-da7e-4ab1-b594-2f45f0b5c529-kube-api-access-9x6tw\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795900 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-internal-tls-certs\") pod \"aa332536-984c-4942-b1fb-1ffde7eb465d\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795928 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08b125a7-da7e-4ab1-b594-2f45f0b5c529-etc-machine-id\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795968 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data-custom\") pod \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\" (UID: \"08b125a7-da7e-4ab1-b594-2f45f0b5c529\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.795990 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-config-data\") pod \"aa332536-984c-4942-b1fb-1ffde7eb465d\" (UID: \"aa332536-984c-4942-b1fb-1ffde7eb465d\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.796457 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e641b3d5-ee18-41b6-b38c-098564591a1b-logs" (OuterVolumeSpecName: "logs") pod "e641b3d5-ee18-41b6-b38c-098564591a1b" (UID: "e641b3d5-ee18-41b6-b38c-098564591a1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.805879 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.809412 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-scripts" (OuterVolumeSpecName: "scripts") pod "08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.810941 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-api-access-7m7n7" (OuterVolumeSpecName: "kube-api-access-7m7n7") pod "b11d9f93-43db-45cf-9cba-75d0c9b50d55" (UID: "b11d9f93-43db-45cf-9cba-75d0c9b50d55"). InnerVolumeSpecName "kube-api-access-7m7n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.811798 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-logs" (OuterVolumeSpecName: "logs") pod "aa332536-984c-4942-b1fb-1ffde7eb465d" (UID: "aa332536-984c-4942-b1fb-1ffde7eb465d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.816525 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08b125a7-da7e-4ab1-b594-2f45f0b5c529-logs" (OuterVolumeSpecName: "logs") pod "08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.816558 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08b125a7-da7e-4ab1-b594-2f45f0b5c529-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.816681 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.817300 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.817418 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aa332536-984c-4942-b1fb-1ffde7eb465d" (UID: "aa332536-984c-4942-b1fb-1ffde7eb465d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818180 4971 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08b125a7-da7e-4ab1-b594-2f45f0b5c529-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818207 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818232 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818247 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818258 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7c37921-0f32-4a99-bf40-e960c400a9ac-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818270 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818283 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m7n7\" (UniqueName: \"kubernetes.io/projected/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-api-access-7m7n7\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818297 4971 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e641b3d5-ee18-41b6-b38c-098564591a1b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818308 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818320 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b125a7-da7e-4ab1-b594-2f45f0b5c529-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818333 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818345 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.818357 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7c37921-0f32-4a99-bf40-e960c400a9ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.819937 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa332536-984c-4942-b1fb-1ffde7eb465d-kube-api-access-xwzw5" (OuterVolumeSpecName: "kube-api-access-xwzw5") pod "aa332536-984c-4942-b1fb-1ffde7eb465d" (UID: "aa332536-984c-4942-b1fb-1ffde7eb465d"). InnerVolumeSpecName "kube-api-access-xwzw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.821058 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-scripts" (OuterVolumeSpecName: "scripts") pod "e641b3d5-ee18-41b6-b38c-098564591a1b" (UID: "e641b3d5-ee18-41b6-b38c-098564591a1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.823117 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-scripts" (OuterVolumeSpecName: "scripts") pod "aa332536-984c-4942-b1fb-1ffde7eb465d" (UID: "aa332536-984c-4942-b1fb-1ffde7eb465d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.824144 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b125a7-da7e-4ab1-b594-2f45f0b5c529-kube-api-access-9x6tw" (OuterVolumeSpecName: "kube-api-access-9x6tw") pod "08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "kube-api-access-9x6tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.833329 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "aa332536-984c-4942-b1fb-1ffde7eb465d" (UID: "aa332536-984c-4942-b1fb-1ffde7eb465d"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.866975 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e641b3d5-ee18-41b6-b38c-098564591a1b-kube-api-access-btsqw" (OuterVolumeSpecName: "kube-api-access-btsqw") pod "e641b3d5-ee18-41b6-b38c-098564591a1b" (UID: "e641b3d5-ee18-41b6-b38c-098564591a1b"). InnerVolumeSpecName "kube-api-access-btsqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.906574 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.910056 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa332536-984c-4942-b1fb-1ffde7eb465d" (UID: "aa332536-984c-4942-b1fb-1ffde7eb465d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.910346 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-config-data" (OuterVolumeSpecName: "config-data") pod "e641b3d5-ee18-41b6-b38c-098564591a1b" (UID: "e641b3d5-ee18-41b6-b38c-098564591a1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919594 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kolla-config\") pod \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919655 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-config-data\") pod \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919683 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-memcached-tls-certs\") pod \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919701 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-internal-tls-certs\") pod \"2bb32512-db70-4db8-a867-0bda48f318eb\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919818 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-public-tls-certs\") pod \"2bb32512-db70-4db8-a867-0bda48f318eb\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919835 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-combined-ca-bundle\") pod \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919885 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsbpq\" (UniqueName: \"kubernetes.io/projected/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kube-api-access-gsbpq\") pod \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919911 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-combined-ca-bundle\") pod \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.919981 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb32512-db70-4db8-a867-0bda48f318eb-logs\") pod \"2bb32512-db70-4db8-a867-0bda48f318eb\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.920002 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-logs\") pod \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.920019 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data-custom\") pod \"2bb32512-db70-4db8-a867-0bda48f318eb\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 
07:15:38.920265 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjlv8\" (UniqueName: \"kubernetes.io/projected/2bb32512-db70-4db8-a867-0bda48f318eb-kube-api-access-sjlv8\") pod \"2bb32512-db70-4db8-a867-0bda48f318eb\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.920394 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" (UID: "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.920776 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-config-data\") pod \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\" (UID: \"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.920813 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-combined-ca-bundle\") pod \"2bb32512-db70-4db8-a867-0bda48f318eb\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.920983 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data\") pod \"2bb32512-db70-4db8-a867-0bda48f318eb\" (UID: \"2bb32512-db70-4db8-a867-0bda48f318eb\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.921154 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-nova-metadata-tls-certs\") pod \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.921193 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfdt9\" (UniqueName: \"kubernetes.io/projected/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-kube-api-access-dfdt9\") pod \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\" (UID: \"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a\") " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.921433 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-config-data" (OuterVolumeSpecName: "config-data") pod "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" (UID: "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923228 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btsqw\" (UniqueName: \"kubernetes.io/projected/e641b3d5-ee18-41b6-b38c-098564591a1b-kube-api-access-btsqw\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923244 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwzw5\" (UniqueName: \"kubernetes.io/projected/aa332536-984c-4942-b1fb-1ffde7eb465d-kube-api-access-xwzw5\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923253 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923272 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923381 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x6tw\" (UniqueName: \"kubernetes.io/projected/08b125a7-da7e-4ab1-b594-2f45f0b5c529-kube-api-access-9x6tw\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923390 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923399 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923408 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923418 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923426 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.923628 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa332536-984c-4942-b1fb-1ffde7eb465d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.924569 4971 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb32512-db70-4db8-a867-0bda48f318eb-logs" (OuterVolumeSpecName: "logs") pod "2bb32512-db70-4db8-a867-0bda48f318eb" (UID: "2bb32512-db70-4db8-a867-0bda48f318eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.928151 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-logs" (OuterVolumeSpecName: "logs") pod "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" (UID: "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.941380 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "b11d9f93-43db-45cf-9cba-75d0c9b50d55" (UID: "b11d9f93-43db-45cf-9cba-75d0c9b50d55"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.946327 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-kube-api-access-dfdt9" (OuterVolumeSpecName: "kube-api-access-dfdt9") pod "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" (UID: "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a"). InnerVolumeSpecName "kube-api-access-dfdt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.946460 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.953006 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kube-api-access-gsbpq" (OuterVolumeSpecName: "kube-api-access-gsbpq") pod "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" (UID: "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89"). InnerVolumeSpecName "kube-api-access-gsbpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.953110 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb32512-db70-4db8-a867-0bda48f318eb-kube-api-access-sjlv8" (OuterVolumeSpecName: "kube-api-access-sjlv8") pod "2bb32512-db70-4db8-a867-0bda48f318eb" (UID: "2bb32512-db70-4db8-a867-0bda48f318eb"). InnerVolumeSpecName "kube-api-access-sjlv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.967623 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2bb32512-db70-4db8-a867-0bda48f318eb" (UID: "2bb32512-db70-4db8-a867-0bda48f318eb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:38 crc kubenswrapper[4971]: I0320 07:15:38.985985 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.008060 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b11d9f93-43db-45cf-9cba-75d0c9b50d55" (UID: "b11d9f93-43db-45cf-9cba-75d0c9b50d55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025807 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025841 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfdt9\" (UniqueName: \"kubernetes.io/projected/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-kube-api-access-dfdt9\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025853 4971 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025862 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025874 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsbpq\" (UniqueName: 
\"kubernetes.io/projected/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-kube-api-access-gsbpq\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025883 4971 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025892 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025902 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb32512-db70-4db8-a867-0bda48f318eb-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025914 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-logs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025922 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.025933 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjlv8\" (UniqueName: \"kubernetes.io/projected/2bb32512-db70-4db8-a867-0bda48f318eb-kube-api-access-sjlv8\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.027892 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod 
"08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.065299 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24 is running failed: container process not found" containerID="9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.065985 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24 is running failed: container process not found" containerID="9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.066410 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24 is running failed: container process not found" containerID="9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.066459 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" 
containerName="nova-scheduler-scheduler" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.070494 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.070486 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" (UID: "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.080097 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aa332536-984c-4942-b1fb-1ffde7eb465d" (UID: "aa332536-984c-4942-b1fb-1ffde7eb465d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.105206 4971 generic.go:334] "Generic (PLEG): container finished" podID="c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" containerID="9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.105300 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3","Type":"ContainerDied","Data":"9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.125081 4971 generic.go:334] "Generic (PLEG): container finished" podID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerID="ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.125111 4971 generic.go:334] "Generic (PLEG): container finished" podID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerID="a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.125156 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerDied","Data":"ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.125184 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerDied","Data":"a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.127496 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 
07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.127528 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.127543 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.128797 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" (UID: "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.136900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"08b125a7-da7e-4ab1-b594-2f45f0b5c529","Type":"ContainerDied","Data":"66d96c63b8a14b9071f1a161f51fb4a60e54efe0e3ffa98a0d3051d17d49b1dc"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.136963 4971 scope.go:117] "RemoveContainer" containerID="2d28427254b93eecd5e3f48d2294727ab000a8f924161e1e67d604b6de308a34" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.137175 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.137229 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-config-data" (OuterVolumeSpecName: "config-data") pod "aa332536-984c-4942-b1fb-1ffde7eb465d" (UID: "aa332536-984c-4942-b1fb-1ffde7eb465d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.145712 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bb32512-db70-4db8-a867-0bda48f318eb" (UID: "2bb32512-db70-4db8-a867-0bda48f318eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.146837 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7c37921-0f32-4a99-bf40-e960c400a9ac","Type":"ContainerDied","Data":"a96535dfac343fba419a6ffd44c54afad8b8d28906a76b17e4cbedcbc2a2703c"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.147276 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.152316 4971 generic.go:334] "Generic (PLEG): container finished" podID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerID="e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.152374 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.152403 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a","Type":"ContainerDied","Data":"e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.152425 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a","Type":"ContainerDied","Data":"4a9e9a5daa6a85c30810fb5af7b3c83cb19f48669d1b896a371a8e890b99d600"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.153395 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-config-data" (OuterVolumeSpecName: "config-data") pod "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" (UID: "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.155388 4971 generic.go:334] "Generic (PLEG): container finished" podID="2bb32512-db70-4db8-a867-0bda48f318eb" containerID="f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.155740 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c864cc4fb-bwl6r" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.155994 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c864cc4fb-bwl6r" event={"ID":"2bb32512-db70-4db8-a867-0bda48f318eb","Type":"ContainerDied","Data":"f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.156026 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c864cc4fb-bwl6r" event={"ID":"2bb32512-db70-4db8-a867-0bda48f318eb","Type":"ContainerDied","Data":"9e8d68592c9188c08cf1bce3707ea600cd0b97f9b8a2acac21623a93d67816c5"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.180807 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.180817 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e641b3d5-ee18-41b6-b38c-098564591a1b" (UID: "e641b3d5-ee18-41b6-b38c-098564591a1b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.181166 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.181236 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa332536-984c-4942-b1fb-1ffde7eb465d","Type":"ContainerDied","Data":"3ee96c9b9ec9603d9ab5cbaf08f8803b78fe8d28e83cc6b12fbf0ead0d5e9beb"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.186216 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4k7j6" event={"ID":"b339d4de-8516-4dd0-9bc1-c7580a99c54a","Type":"ContainerStarted","Data":"9632ec0a5d0a8185a3e649df407fe3cc41d698d178dd055b52f8edb327e2ac1b"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.192399 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.204030 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.221988 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2bb32512-db70-4db8-a867-0bda48f318eb" (UID: "2bb32512-db70-4db8-a867-0bda48f318eb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.223798 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data" (OuterVolumeSpecName: "config-data") pod "2bb32512-db70-4db8-a867-0bda48f318eb" (UID: "2bb32512-db70-4db8-a867-0bda48f318eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.229560 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "b11d9f93-43db-45cf-9cba-75d0c9b50d55" (UID: "b11d9f93-43db-45cf-9cba-75d0c9b50d55"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.230825 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-internal-tls-certs\") pod \"e98717ba-3768-4817-a29a-09114e106105\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.230902 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-config-data\") pod \"e98717ba-3768-4817-a29a-09114e106105\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.231562 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-public-tls-certs\") pod \"e98717ba-3768-4817-a29a-09114e106105\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.231753 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-combined-ca-bundle\") pod \"e98717ba-3768-4817-a29a-09114e106105\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 
07:15:39.231779 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9g8q\" (UniqueName: \"kubernetes.io/projected/e98717ba-3768-4817-a29a-09114e106105-kube-api-access-z9g8q\") pod \"e98717ba-3768-4817-a29a-09114e106105\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.231825 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-certs\") pod \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\" (UID: \"b11d9f93-43db-45cf-9cba-75d0c9b50d55\") " Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.231872 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e98717ba-3768-4817-a29a-09114e106105-logs\") pod \"e98717ba-3768-4817-a29a-09114e106105\" (UID: \"e98717ba-3768-4817-a29a-09114e106105\") " Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.232217 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa332536-984c-4942-b1fb-1ffde7eb465d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.232235 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.232263 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.232272 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.232281 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.232290 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.232301 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.232311 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:39 crc kubenswrapper[4971]: W0320 07:15:39.234330 4971 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b11d9f93-43db-45cf-9cba-75d0c9b50d55/volumes/kubernetes.io~secret/kube-state-metrics-tls-certs Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.234364 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "b11d9f93-43db-45cf-9cba-75d0c9b50d55" (UID: "b11d9f93-43db-45cf-9cba-75d0c9b50d55"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.235020 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e98717ba-3768-4817-a29a-09114e106105-logs" (OuterVolumeSpecName: "logs") pod "e98717ba-3768-4817-a29a-09114e106105" (UID: "e98717ba-3768-4817-a29a-09114e106105"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.236664 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.245415 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e641b3d5-ee18-41b6-b38c-098564591a1b" (UID: "e641b3d5-ee18-41b6-b38c-098564591a1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.245912 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.245760 4971 generic.go:334] "Generic (PLEG): container finished" podID="e98717ba-3768-4817-a29a-09114e106105" containerID="ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.247013 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e98717ba-3768-4817-a29a-09114e106105","Type":"ContainerDied","Data":"ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.247081 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e98717ba-3768-4817-a29a-09114e106105","Type":"ContainerDied","Data":"05fece7797660c922c649a9f7d64fec23f399b3319ea84ff06c20b0824cdc2c7"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.247455 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" (UID: "bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.247719 4971 scope.go:117] "RemoveContainer" containerID="8862358b81fede9115c50c47e167a72f0bd5aec1ea665447f7122a28b9e9ceea" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.265157 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e641b3d5-ee18-41b6-b38c-098564591a1b" (UID: "e641b3d5-ee18-41b6-b38c-098564591a1b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.267184 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b11d9f93-43db-45cf-9cba-75d0c9b50d55","Type":"ContainerDied","Data":"6a57a00927cb37ccd2926fd817b96e82e4a82b2db40952f235ffa676b7b360f0"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.267326 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.271000 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data" (OuterVolumeSpecName: "config-data") pod "08b125a7-da7e-4ab1-b594-2f45f0b5c529" (UID: "08b125a7-da7e-4ab1-b594-2f45f0b5c529"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.272485 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9450f53-fcf0-4799-a8da-a9d0bc7016ac" containerID="615ced8f9e0feb5ebdac89eb7107d8990b6103b1111ede4cffc99ba5cd482835" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.272553 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b9450f53-fcf0-4799-a8da-a9d0bc7016ac","Type":"ContainerDied","Data":"615ced8f9e0feb5ebdac89eb7107d8990b6103b1111ede4cffc99ba5cd482835"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.272669 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.274381 4971 generic.go:334] "Generic (PLEG): container finished" podID="ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" containerID="881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.274453 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89","Type":"ContainerDied","Data":"881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.274468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89","Type":"ContainerDied","Data":"a5812ba7bc663316b16eb74795bf71eeabba37dba7a064db87080e76a1f7566a"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.274507 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.276270 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" (UID: "ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.292261 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e98717ba-3768-4817-a29a-09114e106105-kube-api-access-z9g8q" (OuterVolumeSpecName: "kube-api-access-z9g8q") pod "e98717ba-3768-4817-a29a-09114e106105" (UID: "e98717ba-3768-4817-a29a-09114e106105"). InnerVolumeSpecName "kube-api-access-z9g8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.292673 4971 generic.go:334] "Generic (PLEG): container finished" podID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerID="a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb" exitCode=0 Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.292762 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.294518 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-config-data" (OuterVolumeSpecName: "config-data") pod "e98717ba-3768-4817-a29a-09114e106105" (UID: "e98717ba-3768-4817-a29a-09114e106105"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.294626 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff6b454b-v87kx" event={"ID":"e641b3d5-ee18-41b6-b38c-098564591a1b","Type":"ContainerDied","Data":"a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.294670 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68ff6b454b-v87kx" event={"ID":"e641b3d5-ee18-41b6-b38c-098564591a1b","Type":"ContainerDied","Data":"fdace0241f51e92ca49dfa7a8f4f6005c4bcc152805a4d6e595e7169ed3b81cf"} Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.296683 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2bb32512-db70-4db8-a867-0bda48f318eb" (UID: "2bb32512-db70-4db8-a867-0bda48f318eb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.299246 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68ff6b454b-v87kx"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.314788 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e98717ba-3768-4817-a29a-09114e106105" (UID: "e98717ba-3768-4817-a29a-09114e106105"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.336178 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crscl\" (UniqueName: \"kubernetes.io/projected/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-kube-api-access-crscl\") pod \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.336342 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-combined-ca-bundle\") pod \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.336518 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-config-data\") pod \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\" (UID: \"b9450f53-fcf0-4799-a8da-a9d0bc7016ac\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.336997 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337017 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337027 4971 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337038 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b125a7-da7e-4ab1-b594-2f45f0b5c529-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337050 4971 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89-memcached-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337059 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb32512-db70-4db8-a867-0bda48f318eb-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337067 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337077 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9g8q\" (UniqueName: \"kubernetes.io/projected/e98717ba-3768-4817-a29a-09114e106105-kube-api-access-z9g8q\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337092 4971 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b11d9f93-43db-45cf-9cba-75d0c9b50d55-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337102 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e98717ba-3768-4817-a29a-09114e106105-logs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.337113 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e641b3d5-ee18-41b6-b38c-098564591a1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.341202 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-kube-api-access-crscl" (OuterVolumeSpecName: "kube-api-access-crscl") pod "b9450f53-fcf0-4799-a8da-a9d0bc7016ac" (UID: "b9450f53-fcf0-4799-a8da-a9d0bc7016ac"). InnerVolumeSpecName "kube-api-access-crscl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.345358 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e98717ba-3768-4817-a29a-09114e106105" (UID: "e98717ba-3768-4817-a29a-09114e106105"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.345681 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e98717ba-3768-4817-a29a-09114e106105" (UID: "e98717ba-3768-4817-a29a-09114e106105"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.397832 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-config-data" (OuterVolumeSpecName: "config-data") pod "b9450f53-fcf0-4799-a8da-a9d0bc7016ac" (UID: "b9450f53-fcf0-4799-a8da-a9d0bc7016ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.398898 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9450f53-fcf0-4799-a8da-a9d0bc7016ac" (UID: "b9450f53-fcf0-4799-a8da-a9d0bc7016ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.439173 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts\") pod \"keystone-d3ee-account-create-update-8sbzg\" (UID: \"a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9\") " pod="openstack/keystone-d3ee-account-create-update-8sbzg"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.439281 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9jjn\" (UniqueName: \"kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn\") pod \"keystone-d3ee-account-create-update-8sbzg\" (UID: \"a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9\") " pod="openstack/keystone-d3ee-account-create-update-8sbzg"
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.439354 4971 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.439443 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.439474 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts podName:a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:41.439449287 +0000 UTC m=+1563.419323425 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts") pod "keystone-d3ee-account-create-update-8sbzg" (UID: "a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9") : configmap "openstack-scripts" not found
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.439516 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.439539 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crscl\" (UniqueName: \"kubernetes.io/projected/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-kube-api-access-crscl\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.439553 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9450f53-fcf0-4799-a8da-a9d0bc7016ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.439566 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98717ba-3768-4817-a29a-09114e106105-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.443320 4971 projected.go:194] Error preparing data for projected volume kube-api-access-v9jjn for pod openstack/keystone-d3ee-account-create-update-8sbzg: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.443379 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn podName:a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:41.443363778 +0000 UTC m=+1563.423237916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-v9jjn" (UniqueName: "kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn") pod "keystone-d3ee-account-create-update-8sbzg" (UID: "a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.507614 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3ee-account-create-update-8sbzg"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.511768 4971 scope.go:117] "RemoveContainer" containerID="8e933255e2bcc27abf971627e50a388be00c68eb771f36803f6707174021661b"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.556280 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.577187 4971 scope.go:117] "RemoveContainer" containerID="87f82d49936752bb6c84997b496c4f49bd0051001c3fe0ed7aff5c22606bc913"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.577439 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.612296 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.613812 4971 scope.go:117] "RemoveContainer" containerID="e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.628747 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.645439 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.645528 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw6vb\" (UniqueName: \"kubernetes.io/projected/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-kube-api-access-mw6vb\") pod \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.646044 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-combined-ca-bundle\") pod \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.646174 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-config-data\") pod \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.651505 4971 scope.go:117] "RemoveContainer" containerID="a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e"
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.651741 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa is running failed: container process not found" containerID="d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.653038 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa is running failed: container process not found" containerID="d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.667663 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa is running failed: container process not found" containerID="d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.667807 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="ebe7358b-84ed-4e08-8c12-234928beef49" containerName="nova-cell0-conductor-conductor"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.693816 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-kube-api-access-mw6vb" (OuterVolumeSpecName: "kube-api-access-mw6vb") pod "c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" (UID: "c7136000-46aa-4fe0-acaf-5ad1ed94e4b3"). InnerVolumeSpecName "kube-api-access-mw6vb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.729989 4971 scope.go:117] "RemoveContainer" containerID="e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.730322 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-config-data" (OuterVolumeSpecName: "config-data") pod "c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" (UID: "c7136000-46aa-4fe0-acaf-5ad1ed94e4b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.731424 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2\": container with ID starting with e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2 not found: ID does not exist" containerID="e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.735194 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2"} err="failed to get container status \"e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2\": rpc error: code = NotFound desc = could not find container \"e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2\": container with ID starting with e75361f0fc60a7904ebba0ae78b3a4ce5e8696d626e1bcbb7c9151163fc312c2 not found: ID does not exist"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.735459 4971 scope.go:117] "RemoveContainer" containerID="a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.735793 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.746865 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e\": container with ID starting with a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e not found: ID does not exist" containerID="a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.747141 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e"} err="failed to get container status \"a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e\": rpc error: code = NotFound desc = could not find container \"a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e\": container with ID starting with a887f3e5b2ee565c1425d7522818693ab833fcb378a94afa1401a728afada47e not found: ID does not exist"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.747274 4971 scope.go:117] "RemoveContainer" containerID="f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.750654 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" (UID: "c7136000-46aa-4fe0-acaf-5ad1ed94e4b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.751564 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-combined-ca-bundle\") pod \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\" (UID: \"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.752712 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw6vb\" (UniqueName: \"kubernetes.io/projected/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-kube-api-access-mw6vb\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.752734 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: W0320 07:15:39.752746 4971 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3/volumes/kubernetes.io~secret/combined-ca-bundle
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.752787 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" (UID: "c7136000-46aa-4fe0-acaf-5ad1ed94e4b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.756673 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.780197 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.783021 4971 scope.go:117] "RemoveContainer" containerID="f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.805361 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.823662 4971 scope.go:117] "RemoveContainer" containerID="f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.832253 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.832745 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12\": container with ID starting with f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12 not found: ID does not exist" containerID="f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.832783 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12"} err="failed to get container status \"f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12\": rpc error: code = NotFound desc = could not find container \"f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12\": container with ID starting with f4291f79acc040e64577f76b2a241e98138576981a746fc70ac90f0558fc3b12 not found: ID does not exist"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.832817 4971 scope.go:117] "RemoveContainer" containerID="f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.842395 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4k7j6"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.842644 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: E0320 07:15:39.842685 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5\": container with ID starting with f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5 not found: ID does not exist" containerID="f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.842989 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5"} err="failed to get container status \"f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5\": rpc error: code = NotFound desc = could not find container \"f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5\": container with ID starting with f44c356494df4fb7a453f34eb2f258fe912c8e4012fb92962ea899421d3f7ab5 not found: ID does not exist"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.843013 4971 scope.go:117] "RemoveContainer" containerID="68ae6ce1a0d2429fc4431d1c48cbd5670a70a1125fc1820cb1e6abfe59df9ea9"
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.849162 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68ff6b454b-v87kx"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.855259 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.855517 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-68ff6b454b-v87kx"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.881260 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c864cc4fb-bwl6r"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.888594 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c864cc4fb-bwl6r"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.905908 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.912481 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.922231 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.935727 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.948356 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.956576 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data-custom\") pod \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.956734 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b339d4de-8516-4dd0-9bc1-c7580a99c54a-operator-scripts\") pod \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\" (UID: \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.956770 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dlm7\" (UniqueName: \"kubernetes.io/projected/b339d4de-8516-4dd0-9bc1-c7580a99c54a-kube-api-access-4dlm7\") pod \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\" (UID: \"b339d4de-8516-4dd0-9bc1-c7580a99c54a\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.956806 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2656777c-a362-4e3e-92aa-98a8f9d2cf34-etc-machine-id\") pod \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.956834 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-combined-ca-bundle\") pod \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.956874 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-combined-ca-bundle\") pod \"ebe7358b-84ed-4e08-8c12-234928beef49\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.956905 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data\") pod \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.956981 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lvnt\" (UniqueName: \"kubernetes.io/projected/2656777c-a362-4e3e-92aa-98a8f9d2cf34-kube-api-access-8lvnt\") pod \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.957030 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnf6t\" (UniqueName: \"kubernetes.io/projected/ebe7358b-84ed-4e08-8c12-234928beef49-kube-api-access-hnf6t\") pod \"ebe7358b-84ed-4e08-8c12-234928beef49\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.957059 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-config-data\") pod \"ebe7358b-84ed-4e08-8c12-234928beef49\" (UID: \"ebe7358b-84ed-4e08-8c12-234928beef49\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.957140 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-scripts\") pod \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\" (UID: \"2656777c-a362-4e3e-92aa-98a8f9d2cf34\") "
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.962210 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2656777c-a362-4e3e-92aa-98a8f9d2cf34-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2656777c-a362-4e3e-92aa-98a8f9d2cf34" (UID: "2656777c-a362-4e3e-92aa-98a8f9d2cf34"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.962623 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b339d4de-8516-4dd0-9bc1-c7580a99c54a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b339d4de-8516-4dd0-9bc1-c7580a99c54a" (UID: "b339d4de-8516-4dd0-9bc1-c7580a99c54a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.964620 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-scripts" (OuterVolumeSpecName: "scripts") pod "2656777c-a362-4e3e-92aa-98a8f9d2cf34" (UID: "2656777c-a362-4e3e-92aa-98a8f9d2cf34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.969194 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2656777c-a362-4e3e-92aa-98a8f9d2cf34" (UID: "2656777c-a362-4e3e-92aa-98a8f9d2cf34"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.969337 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2656777c-a362-4e3e-92aa-98a8f9d2cf34-kube-api-access-8lvnt" (OuterVolumeSpecName: "kube-api-access-8lvnt") pod "2656777c-a362-4e3e-92aa-98a8f9d2cf34" (UID: "2656777c-a362-4e3e-92aa-98a8f9d2cf34"). InnerVolumeSpecName "kube-api-access-8lvnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.969502 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe7358b-84ed-4e08-8c12-234928beef49-kube-api-access-hnf6t" (OuterVolumeSpecName: "kube-api-access-hnf6t") pod "ebe7358b-84ed-4e08-8c12-234928beef49" (UID: "ebe7358b-84ed-4e08-8c12-234928beef49"). InnerVolumeSpecName "kube-api-access-hnf6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:39 crc kubenswrapper[4971]: I0320 07:15:39.969221 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b339d4de-8516-4dd0-9bc1-c7580a99c54a-kube-api-access-4dlm7" (OuterVolumeSpecName: "kube-api-access-4dlm7") pod "b339d4de-8516-4dd0-9bc1-c7580a99c54a" (UID: "b339d4de-8516-4dd0-9bc1-c7580a99c54a"). InnerVolumeSpecName "kube-api-access-4dlm7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:39.984847 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"]
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.001856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe7358b-84ed-4e08-8c12-234928beef49" (UID: "ebe7358b-84ed-4e08-8c12-234928beef49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.021669 4971 scope.go:117] "RemoveContainer" containerID="c5035eee8b5df80277aacaae4c435bb925d2c563411e00ccd53787c9f32033d9"
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.029302 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-config-data" (OuterVolumeSpecName: "config-data") pod "ebe7358b-84ed-4e08-8c12-234928beef49" (UID: "ebe7358b-84ed-4e08-8c12-234928beef49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058849 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b339d4de-8516-4dd0-9bc1-c7580a99c54a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058876 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dlm7\" (UniqueName: \"kubernetes.io/projected/b339d4de-8516-4dd0-9bc1-c7580a99c54a-kube-api-access-4dlm7\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058888 4971 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2656777c-a362-4e3e-92aa-98a8f9d2cf34-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058897 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058906 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lvnt\" (UniqueName: \"kubernetes.io/projected/2656777c-a362-4e3e-92aa-98a8f9d2cf34-kube-api-access-8lvnt\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058915 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnf6t\" (UniqueName: \"kubernetes.io/projected/ebe7358b-84ed-4e08-8c12-234928beef49-kube-api-access-hnf6t\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058923 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe7358b-84ed-4e08-8c12-234928beef49-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058931 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.058940 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.059755 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2656777c-a362-4e3e-92aa-98a8f9d2cf34" (UID: "2656777c-a362-4e3e-92aa-98a8f9d2cf34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.101911 4971 scope.go:117] "RemoveContainer" containerID="ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101"
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.130902 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data" (OuterVolumeSpecName: "config-data") pod "2656777c-a362-4e3e-92aa-98a8f9d2cf34" (UID: "2656777c-a362-4e3e-92aa-98a8f9d2cf34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.130977 4971 scope.go:117] "RemoveContainer" containerID="b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903"
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.161441 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.161491 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2656777c-a362-4e3e-92aa-98a8f9d2cf34-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.165215 4971 scope.go:117] "RemoveContainer" containerID="ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101"
Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.166337 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101\": container with ID starting with ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101 not found: ID does not exist" containerID="ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101"
Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.166387 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101"} err="failed to get container status \"ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101\": rpc error: code = NotFound desc = could not find container \"ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101\": container with ID starting with ce5645eab11812d12a5b692e7fd51d5472ba5bc0a4420f51166013a0ec0b6101 not found: ID does not 
exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.166423 4971 scope.go:117] "RemoveContainer" containerID="b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.167678 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903\": container with ID starting with b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903 not found: ID does not exist" containerID="b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.167718 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903"} err="failed to get container status \"b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903\": rpc error: code = NotFound desc = could not find container \"b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903\": container with ID starting with b1c3b7e6f0a0f2ca4105b11d548c344cb0174fb3f45a7a3cbc9fbe11f4d22903 not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.167732 4971 scope.go:117] "RemoveContainer" containerID="3e232b29d2dc5299cf551b92f2d9493a62313586ff5a609644e971c3988a31be" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.191674 4971 scope.go:117] "RemoveContainer" containerID="615ced8f9e0feb5ebdac89eb7107d8990b6103b1111ede4cffc99ba5cd482835" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.213901 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.217697 4971 scope.go:117] "RemoveContainer" containerID="881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.308971 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7136000-46aa-4fe0-acaf-5ad1ed94e4b3","Type":"ContainerDied","Data":"bf89cadf2cef0cae3f0b34593796e6584df0ae662b23a7e3e30070db3236da04"} Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.309013 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.329209 4971 scope.go:117] "RemoveContainer" containerID="881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.331713 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd\": container with ID starting with 881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd not found: ID does not exist" containerID="881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.331765 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd"} err="failed to get container status \"881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd\": rpc error: code = NotFound desc = could not find container \"881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd\": container with ID starting with 881a1fe073deeae5d10c2f3450e47a402c7aeac84069def650c7c0704ad554dd not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: 
I0320 07:15:40.331791 4971 scope.go:117] "RemoveContainer" containerID="a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.339877 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4k7j6" event={"ID":"b339d4de-8516-4dd0-9bc1-c7580a99c54a","Type":"ContainerDied","Data":"9632ec0a5d0a8185a3e649df407fe3cc41d698d178dd055b52f8edb327e2ac1b"} Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.339996 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4k7j6" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.345050 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.350542 4971 generic.go:334] "Generic (PLEG): container finished" podID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" containerID="1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a" exitCode=0 Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.351171 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.352262 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"69a5a04c-039c-4ef2-a020-1e5c16fca2b6","Type":"ContainerDied","Data":"1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a"} Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.352329 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"69a5a04c-039c-4ef2-a020-1e5c16fca2b6","Type":"ContainerDied","Data":"fe4f4dd984d01358392cc324d34364eda6d0d09b68a796511182aff4094a98e3"} Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.355430 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.366036 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7qvk\" (UniqueName: \"kubernetes.io/projected/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kube-api-access-n7qvk\") pod \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.366221 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-generated\") pod \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.366374 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.366588 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kolla-config\") pod \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.366700 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-default\") pod \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.367347 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "69a5a04c-039c-4ef2-a020-1e5c16fca2b6" (UID: "69a5a04c-039c-4ef2-a020-1e5c16fca2b6"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.367370 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-galera-tls-certs\") pod \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.367470 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-operator-scripts\") pod \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.367557 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kolla-config" 
(OuterVolumeSpecName: "kolla-config") pod "69a5a04c-039c-4ef2-a020-1e5c16fca2b6" (UID: "69a5a04c-039c-4ef2-a020-1e5c16fca2b6"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.367630 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-combined-ca-bundle\") pod \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\" (UID: \"69a5a04c-039c-4ef2-a020-1e5c16fca2b6\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.368695 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "69a5a04c-039c-4ef2-a020-1e5c16fca2b6" (UID: "69a5a04c-039c-4ef2-a020-1e5c16fca2b6"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.369441 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69a5a04c-039c-4ef2-a020-1e5c16fca2b6" (UID: "69a5a04c-039c-4ef2-a020-1e5c16fca2b6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.369789 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9064c65d-26e4-4d53-a1c1-b88f932b76be/ovn-northd/0.log" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.369834 4971 generic.go:334] "Generic (PLEG): container finished" podID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerID="3ca1eeff736f7dc8000796f67df497ff16c309981beedb9789a0e4a334c2f180" exitCode=139 Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.369922 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9064c65d-26e4-4d53-a1c1-b88f932b76be","Type":"ContainerDied","Data":"3ca1eeff736f7dc8000796f67df497ff16c309981beedb9789a0e4a334c2f180"} Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.370029 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.370076 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data podName:1023ed0c-3abe-4cff-987f-52544b885696 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:48.370060452 +0000 UTC m=+1570.349934590 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data") pod "rabbitmq-cell1-server-0" (UID: "1023ed0c-3abe-4cff-987f-52544b885696") : configmap "rabbitmq-cell1-config-data" not found Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.370313 4971 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.370344 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.370358 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.370394 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.383166 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kube-api-access-n7qvk" (OuterVolumeSpecName: "kube-api-access-n7qvk") pod "69a5a04c-039c-4ef2-a020-1e5c16fca2b6" (UID: "69a5a04c-039c-4ef2-a020-1e5c16fca2b6"). InnerVolumeSpecName "kube-api-access-n7qvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.386506 4971 generic.go:334] "Generic (PLEG): container finished" podID="ebe7358b-84ed-4e08-8c12-234928beef49" containerID="d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa" exitCode=0 Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.386668 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.386595 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ebe7358b-84ed-4e08-8c12-234928beef49","Type":"ContainerDied","Data":"d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa"} Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.387219 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ebe7358b-84ed-4e08-8c12-234928beef49","Type":"ContainerDied","Data":"57b59ca2298435e2062d2fe57884c71b9480b6088c3c5f43111cabbc879d79bd"} Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.389195 4971 generic.go:334] "Generic (PLEG): container finished" podID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerID="f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874" exitCode=0 Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.389267 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d3ee-account-create-update-8sbzg" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.390110 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2656777c-a362-4e3e-92aa-98a8f9d2cf34","Type":"ContainerDied","Data":"f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874"} Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.390136 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2656777c-a362-4e3e-92aa-98a8f9d2cf34","Type":"ContainerDied","Data":"c97d6516b3f500a3fa3a21c6620ddd9d9a2a09019ff5d26144c90340566236f1"} Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.390193 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.406062 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "69a5a04c-039c-4ef2-a020-1e5c16fca2b6" (UID: "69a5a04c-039c-4ef2-a020-1e5c16fca2b6"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.409916 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69a5a04c-039c-4ef2-a020-1e5c16fca2b6" (UID: "69a5a04c-039c-4ef2-a020-1e5c16fca2b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.430388 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "69a5a04c-039c-4ef2-a020-1e5c16fca2b6" (UID: "69a5a04c-039c-4ef2-a020-1e5c16fca2b6"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.447473 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9064c65d-26e4-4d53-a1c1-b88f932b76be/ovn-northd/0.log" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.447569 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.460497 4971 scope.go:117] "RemoveContainer" containerID="a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.472392 4971 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.472429 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.472443 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7qvk\" (UniqueName: \"kubernetes.io/projected/69a5a04c-039c-4ef2-a020-1e5c16fca2b6-kube-api-access-n7qvk\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.472477 4971 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.498957 4971 scope.go:117] "RemoveContainer" containerID="a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.501441 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb\": container with ID starting with a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb not found: ID does not exist" containerID="a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.501471 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb"} err="failed to get container status \"a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb\": rpc error: code = NotFound desc = could not find container \"a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb\": container with ID starting with a65462569356d9db057f1cd9fdd87d74a4a015508c9c542b7cb684d1148c04cb not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.501492 4971 scope.go:117] "RemoveContainer" containerID="a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.501816 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119\": container with ID starting with a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119 not found: ID does not exist" 
containerID="a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.501842 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119"} err="failed to get container status \"a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119\": rpc error: code = NotFound desc = could not find container \"a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119\": container with ID starting with a7921404e4b591fede304bc781cbac3d5009d94181d0de8b56cf88b7dad22119 not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.501857 4971 scope.go:117] "RemoveContainer" containerID="9cd9c69b1fe928bab1744a28f21f11504ec9f323cd9f2cc5d8584560664eff24" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.524245 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.533785 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.540037 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.549550 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.560389 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.561770 4971 scope.go:117] "RemoveContainer" containerID="1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.572034 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/root-account-create-update-4k7j6"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.574122 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-northd-tls-certs\") pod \"9064c65d-26e4-4d53-a1c1-b88f932b76be\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.574183 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-metrics-certs-tls-certs\") pod \"9064c65d-26e4-4d53-a1c1-b88f932b76be\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.574219 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-465wm\" (UniqueName: \"kubernetes.io/projected/9064c65d-26e4-4d53-a1c1-b88f932b76be-kube-api-access-465wm\") pod \"9064c65d-26e4-4d53-a1c1-b88f932b76be\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.574308 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-combined-ca-bundle\") pod \"9064c65d-26e4-4d53-a1c1-b88f932b76be\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.574394 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-scripts\") pod \"9064c65d-26e4-4d53-a1c1-b88f932b76be\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.574446 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-rundir\") pod \"9064c65d-26e4-4d53-a1c1-b88f932b76be\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.574464 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-config\") pod \"9064c65d-26e4-4d53-a1c1-b88f932b76be\" (UID: \"9064c65d-26e4-4d53-a1c1-b88f932b76be\") " Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.574779 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.575250 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-config" (OuterVolumeSpecName: "config") pod "9064c65d-26e4-4d53-a1c1-b88f932b76be" (UID: "9064c65d-26e4-4d53-a1c1-b88f932b76be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.575877 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-4k7j6"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.576185 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "9064c65d-26e4-4d53-a1c1-b88f932b76be" (UID: "9064c65d-26e4-4d53-a1c1-b88f932b76be"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.576283 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-scripts" (OuterVolumeSpecName: "scripts") pod "9064c65d-26e4-4d53-a1c1-b88f932b76be" (UID: "9064c65d-26e4-4d53-a1c1-b88f932b76be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.580446 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9064c65d-26e4-4d53-a1c1-b88f932b76be-kube-api-access-465wm" (OuterVolumeSpecName: "kube-api-access-465wm") pod "9064c65d-26e4-4d53-a1c1-b88f932b76be" (UID: "9064c65d-26e4-4d53-a1c1-b88f932b76be"). InnerVolumeSpecName "kube-api-access-465wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.585696 4971 scope.go:117] "RemoveContainer" containerID="01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.585846 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d3ee-account-create-update-8sbzg"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.590047 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d3ee-account-create-update-8sbzg"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.596431 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9064c65d-26e4-4d53-a1c1-b88f932b76be" (UID: "9064c65d-26e4-4d53-a1c1-b88f932b76be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.618107 4971 scope.go:117] "RemoveContainer" containerID="1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.618735 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a\": container with ID starting with 1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a not found: ID does not exist" containerID="1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.618779 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a"} err="failed to get container status \"1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a\": rpc error: code = NotFound desc = could not find container \"1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a\": container with ID starting with 1a7b3fb5fd235f55cc04b6d8e45ed95722d35221264653d7df9a70cd8b76f73a not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.618811 4971 scope.go:117] "RemoveContainer" containerID="01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.619205 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e\": container with ID starting with 01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e not found: ID does not exist" containerID="01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.619236 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e"} err="failed to get container status \"01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e\": rpc error: code = NotFound desc = could not find container \"01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e\": container with ID starting with 01f4c038659721a3b1a9de5996ca336bb07cf20969895b4224218ee5023f415e not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.619280 4971 scope.go:117] "RemoveContainer" containerID="d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.638405 4971 scope.go:117] "RemoveContainer" containerID="d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.638857 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa\": container with ID starting with d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa not found: ID does not exist" containerID="d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.638898 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa"} err="failed to get container status \"d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa\": rpc error: code = NotFound desc = could not find container \"d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa\": container with ID starting with d18efcf36c0990b4e068aae60d892df3f5ec3b90d05ef12fb22df71577b161aa not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 
07:15:40.638926 4971 scope.go:117] "RemoveContainer" containerID="31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.656401 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "9064c65d-26e4-4d53-a1c1-b88f932b76be" (UID: "9064c65d-26e4-4d53-a1c1-b88f932b76be"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.663519 4971 scope.go:117] "RemoveContainer" containerID="f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.667700 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9064c65d-26e4-4d53-a1c1-b88f932b76be" (UID: "9064c65d-26e4-4d53-a1c1-b88f932b76be"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675878 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-465wm\" (UniqueName: \"kubernetes.io/projected/9064c65d-26e4-4d53-a1c1-b88f932b76be-kube-api-access-465wm\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675910 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675923 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9jjn\" (UniqueName: \"kubernetes.io/projected/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-kube-api-access-v9jjn\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675933 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675942 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675950 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675958 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9064c65d-26e4-4d53-a1c1-b88f932b76be-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675965 4971 
reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.675975 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9064c65d-26e4-4d53-a1c1-b88f932b76be-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.676059 4971 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.676073 4971 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.676080 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.676092 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.676128 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:48.676115406 +0000 UTC m=+1570.655989544 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.687248 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.693829 4971 scope.go:117] "RemoveContainer" containerID="31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.693915 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.694279 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c\": container with ID starting with 31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c not found: ID does not exist" containerID="31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.694311 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c"} err="failed to get container status \"31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c\": rpc error: code = NotFound desc = could not find container \"31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c\": container with ID starting with 31e89339c7ef92bd573974edb8ba31d6d53be9b1e4d1416dc023c35caf6cee2c not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.694336 4971 scope.go:117] "RemoveContainer" 
containerID="f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874" Mar 20 07:15:40 crc kubenswrapper[4971]: E0320 07:15:40.694722 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874\": container with ID starting with f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874 not found: ID does not exist" containerID="f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.694809 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874"} err="failed to get container status \"f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874\": rpc error: code = NotFound desc = could not find container \"f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874\": container with ID starting with f4e86f9dea6983a20f14a79fb0e7338cd92e47228309e782ea16776aa1889874 not found: ID does not exist" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.743190 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" path="/var/lib/kubelet/pods/08b125a7-da7e-4ab1-b594-2f45f0b5c529/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.744143 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" path="/var/lib/kubelet/pods/2656777c-a362-4e3e-92aa-98a8f9d2cf34/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.744828 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" path="/var/lib/kubelet/pods/2bb32512-db70-4db8-a867-0bda48f318eb/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.745948 4971 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" path="/var/lib/kubelet/pods/69a5a04c-039c-4ef2-a020-1e5c16fca2b6/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.746428 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9" path="/var/lib/kubelet/pods/a79dc2c8-0cb1-4c06-8001-7b7d5c1d31e9/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.746862 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" path="/var/lib/kubelet/pods/aa332536-984c-4942-b1fb-1ffde7eb465d/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.747964 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11d9f93-43db-45cf-9cba-75d0c9b50d55" path="/var/lib/kubelet/pods/b11d9f93-43db-45cf-9cba-75d0c9b50d55/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.748470 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b339d4de-8516-4dd0-9bc1-c7580a99c54a" path="/var/lib/kubelet/pods/b339d4de-8516-4dd0-9bc1-c7580a99c54a/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.748866 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9450f53-fcf0-4799-a8da-a9d0bc7016ac" path="/var/lib/kubelet/pods/b9450f53-fcf0-4799-a8da-a9d0bc7016ac/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.749398 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" path="/var/lib/kubelet/pods/bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.750419 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" path="/var/lib/kubelet/pods/c7136000-46aa-4fe0-acaf-5ad1ed94e4b3/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.751030 4971 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" path="/var/lib/kubelet/pods/c7c37921-0f32-4a99-bf40-e960c400a9ac/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.752173 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" path="/var/lib/kubelet/pods/e641b3d5-ee18-41b6-b38c-098564591a1b/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.752838 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e98717ba-3768-4817-a29a-09114e106105" path="/var/lib/kubelet/pods/e98717ba-3768-4817-a29a-09114e106105/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.774795 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" path="/var/lib/kubelet/pods/ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89/volumes" Mar 20 07:15:40 crc kubenswrapper[4971]: I0320 07:15:40.775435 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe7358b-84ed-4e08-8c12-234928beef49" path="/var/lib/kubelet/pods/ebe7358b-84ed-4e08-8c12-234928beef49/volumes" Mar 20 07:15:41 crc kubenswrapper[4971]: E0320 07:15:41.215143 4971 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 07:15:41 crc kubenswrapper[4971]: E0320 07:15:41.215481 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data podName:a71f8d2d-729c-4b7c-89c7-a06bd2216978 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:49.215460915 +0000 UTC m=+1571.195335053 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data") pod "rabbitmq-server-0" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978") : configmap "rabbitmq-config-data" not found Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.399679 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9064c65d-26e4-4d53-a1c1-b88f932b76be/ovn-northd/0.log" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.399776 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9064c65d-26e4-4d53-a1c1-b88f932b76be","Type":"ContainerDied","Data":"8236e5e475f8e2bc30f0fba0ae0ebb7515e2b7b499601e1b123b49fde8584edc"} Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.399833 4971 scope.go:117] "RemoveContainer" containerID="78ce6f75fdf4b8134a001b00c9380d6a9b0e6b28e02c6c557c6a2c9097b03dab" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.400077 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.408426 4971 generic.go:334] "Generic (PLEG): container finished" podID="cf071dc8-0146-4b1d-a644-02224870fcba" containerID="41739891398c4430c4f83f4ce50b2d0a22faf0762c4dc2b95014e0a8cb9f4cb0" exitCode=0 Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.408507 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5645d5b87f-2lrzm" event={"ID":"cf071dc8-0146-4b1d-a644-02224870fcba","Type":"ContainerDied","Data":"41739891398c4430c4f83f4ce50b2d0a22faf0762c4dc2b95014e0a8cb9f4cb0"} Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.418137 4971 generic.go:334] "Generic (PLEG): container finished" podID="1023ed0c-3abe-4cff-987f-52544b885696" containerID="51b4cb77e4da22feec9a5960ca52b1a8182d5f4678ea7a93350ed77aaf3336c3" exitCode=0 Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.418180 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1023ed0c-3abe-4cff-987f-52544b885696","Type":"ContainerDied","Data":"51b4cb77e4da22feec9a5960ca52b1a8182d5f4678ea7a93350ed77aaf3336c3"} Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.604255 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.607263 4971 scope.go:117] "RemoveContainer" containerID="3ca1eeff736f7dc8000796f67df497ff16c309981beedb9789a0e4a334c2f180" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.607963 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.610173 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.611140 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740409 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-tls\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740460 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-config-data\") pod \"cf071dc8-0146-4b1d-a644-02224870fcba\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740487 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pksdb\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-kube-api-access-pksdb\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740511 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-fernet-keys\") pod \"cf071dc8-0146-4b1d-a644-02224870fcba\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740558 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1023ed0c-3abe-4cff-987f-52544b885696-erlang-cookie-secret\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740578 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/1023ed0c-3abe-4cff-987f-52544b885696-pod-info\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740620 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-public-tls-certs\") pod \"cf071dc8-0146-4b1d-a644-02224870fcba\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740641 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7924j\" (UniqueName: \"kubernetes.io/projected/cf071dc8-0146-4b1d-a644-02224870fcba-kube-api-access-7924j\") pod \"cf071dc8-0146-4b1d-a644-02224870fcba\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740659 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-credential-keys\") pod \"cf071dc8-0146-4b1d-a644-02224870fcba\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740679 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-scripts\") pod \"cf071dc8-0146-4b1d-a644-02224870fcba\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740708 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-server-conf\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 
07:15:41.740736 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740765 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-confd\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740800 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-erlang-cookie\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740818 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-internal-tls-certs\") pod \"cf071dc8-0146-4b1d-a644-02224870fcba\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740836 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-plugins\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740876 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-plugins-conf\") 
pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740891 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"1023ed0c-3abe-4cff-987f-52544b885696\" (UID: \"1023ed0c-3abe-4cff-987f-52544b885696\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.740906 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-combined-ca-bundle\") pod \"cf071dc8-0146-4b1d-a644-02224870fcba\" (UID: \"cf071dc8-0146-4b1d-a644-02224870fcba\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.741589 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.744387 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.744669 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.745758 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cf071dc8-0146-4b1d-a644-02224870fcba" (UID: "cf071dc8-0146-4b1d-a644-02224870fcba"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.746804 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-scripts" (OuterVolumeSpecName: "scripts") pod "cf071dc8-0146-4b1d-a644-02224870fcba" (UID: "cf071dc8-0146-4b1d-a644-02224870fcba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.751927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1023ed0c-3abe-4cff-987f-52544b885696-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.753183 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.753592 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cf071dc8-0146-4b1d-a644-02224870fcba" (UID: "cf071dc8-0146-4b1d-a644-02224870fcba"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.754072 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1023ed0c-3abe-4cff-987f-52544b885696-pod-info" (OuterVolumeSpecName: "pod-info") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.755056 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.755135 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-kube-api-access-pksdb" (OuterVolumeSpecName: "kube-api-access-pksdb") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "kube-api-access-pksdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.772029 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf071dc8-0146-4b1d-a644-02224870fcba-kube-api-access-7924j" (OuterVolumeSpecName: "kube-api-access-7924j") pod "cf071dc8-0146-4b1d-a644-02224870fcba" (UID: "cf071dc8-0146-4b1d-a644-02224870fcba"). InnerVolumeSpecName "kube-api-access-7924j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.794410 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data" (OuterVolumeSpecName: "config-data") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.812794 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-config-data" (OuterVolumeSpecName: "config-data") pod "cf071dc8-0146-4b1d-a644-02224870fcba" (UID: "cf071dc8-0146-4b1d-a644-02224870fcba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: E0320 07:15:41.821816 4971 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 20 07:15:41 crc kubenswrapper[4971]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-20T07:15:34Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 20 07:15:41 crc kubenswrapper[4971]: /etc/init.d/functions: line 589: 372 Alarm clock "$@" Mar 20 07:15:41 crc kubenswrapper[4971]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-pfhnr" message=< Mar 20 07:15:41 crc kubenswrapper[4971]: Exiting ovn-controller (1) [FAILED] Mar 20 07:15:41 crc kubenswrapper[4971]: Killing ovn-controller (1) [ OK ] Mar 20 07:15:41 crc kubenswrapper[4971]: 2026-03-20T07:15:34Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 20 07:15:41 crc kubenswrapper[4971]: /etc/init.d/functions: line 589: 372 Alarm clock "$@" Mar 20 07:15:41 crc kubenswrapper[4971]: > Mar 20 07:15:41 crc kubenswrapper[4971]: E0320 07:15:41.821872 4971 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 20 07:15:41 crc kubenswrapper[4971]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-20T07:15:34Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 20 07:15:41 crc kubenswrapper[4971]: /etc/init.d/functions: line 589: 372 Alarm clock "$@" Mar 20 07:15:41 crc kubenswrapper[4971]: > pod="openstack/ovn-controller-pfhnr" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerName="ovn-controller" containerID="cri-o://b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.821924 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-pfhnr" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" 
containerName="ovn-controller" containerID="cri-o://b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a" gracePeriod=22 Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.822351 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf071dc8-0146-4b1d-a644-02224870fcba" (UID: "cf071dc8-0146-4b1d-a644-02224870fcba"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.825225 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf071dc8-0146-4b1d-a644-02224870fcba" (UID: "cf071dc8-0146-4b1d-a644-02224870fcba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844822 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844854 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844864 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pksdb\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-kube-api-access-pksdb\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844875 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844883 4971 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1023ed0c-3abe-4cff-987f-52544b885696-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844891 4971 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1023ed0c-3abe-4cff-987f-52544b885696-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844902 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7924j\" (UniqueName: \"kubernetes.io/projected/cf071dc8-0146-4b1d-a644-02224870fcba-kube-api-access-7924j\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844910 4971 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844917 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844927 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844936 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 
07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844944 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844956 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844964 4971 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844982 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.844991 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.847851 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-server-conf" (OuterVolumeSpecName: "server-conf") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.852108 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf071dc8-0146-4b1d-a644-02224870fcba" (UID: "cf071dc8-0146-4b1d-a644-02224870fcba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.862342 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.866500 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1023ed0c-3abe-4cff-987f-52544b885696" (UID: "1023ed0c-3abe-4cff-987f-52544b885696"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.909696 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.946871 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-server-conf\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.946937 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.946998 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-erlang-cookie\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947027 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a71f8d2d-729c-4b7c-89c7-a06bd2216978-erlang-cookie-secret\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947057 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-tls\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947084 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-confd\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947110 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-plugins-conf\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947136 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a71f8d2d-729c-4b7c-89c7-a06bd2216978-pod-info\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947164 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947199 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-plugins\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947250 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6tcn\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-kube-api-access-b6tcn\") pod \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\" (UID: \"a71f8d2d-729c-4b7c-89c7-a06bd2216978\") " Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 
07:15:41.947495 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf071dc8-0146-4b1d-a644-02224870fcba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947511 4971 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1023ed0c-3abe-4cff-987f-52544b885696-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947524 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1023ed0c-3abe-4cff-987f-52544b885696-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.947535 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.950978 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.951252 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-kube-api-access-b6tcn" (OuterVolumeSpecName: "kube-api-access-b6tcn") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "kube-api-access-b6tcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.951675 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.954346 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71f8d2d-729c-4b7c-89c7-a06bd2216978-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.955162 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a71f8d2d-729c-4b7c-89c7-a06bd2216978-pod-info" (OuterVolumeSpecName: "pod-info") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.955472 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.955535 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.957018 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.967742 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data" (OuterVolumeSpecName: "config-data") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:41 crc kubenswrapper[4971]: I0320 07:15:41.988630 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-server-conf" (OuterVolumeSpecName: "server-conf") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.033451 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a71f8d2d-729c-4b7c-89c7-a06bd2216978" (UID: "a71f8d2d-729c-4b7c-89c7-a06bd2216978"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078357 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6tcn\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-kube-api-access-b6tcn\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078393 4971 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078401 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078411 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078421 4971 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a71f8d2d-729c-4b7c-89c7-a06bd2216978-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078429 4971 reconciler_common.go:293] "Volume detached for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078437 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078446 4971 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a71f8d2d-729c-4b7c-89c7-a06bd2216978-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078453 4971 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a71f8d2d-729c-4b7c-89c7-a06bd2216978-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078478 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.078489 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a71f8d2d-729c-4b7c-89c7-a06bd2216978-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.092553 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.109200 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-pfhnr_10811481-4b40-49ae-9d75-03f0c0e02fe7/ovn-controller/0.log" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.109258 4971 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pfhnr" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.179636 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-ovn-controller-tls-certs\") pod \"10811481-4b40-49ae-9d75-03f0c0e02fe7\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.179683 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9jf8\" (UniqueName: \"kubernetes.io/projected/10811481-4b40-49ae-9d75-03f0c0e02fe7-kube-api-access-z9jf8\") pod \"10811481-4b40-49ae-9d75-03f0c0e02fe7\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.179714 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10811481-4b40-49ae-9d75-03f0c0e02fe7-scripts\") pod \"10811481-4b40-49ae-9d75-03f0c0e02fe7\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.179735 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run\") pod \"10811481-4b40-49ae-9d75-03f0c0e02fe7\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.179767 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run-ovn\") pod \"10811481-4b40-49ae-9d75-03f0c0e02fe7\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.179799 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-log-ovn\") pod \"10811481-4b40-49ae-9d75-03f0c0e02fe7\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.179844 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-combined-ca-bundle\") pod \"10811481-4b40-49ae-9d75-03f0c0e02fe7\" (UID: \"10811481-4b40-49ae-9d75-03f0c0e02fe7\") " Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.180040 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.180073 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run" (OuterVolumeSpecName: "var-run") pod "10811481-4b40-49ae-9d75-03f0c0e02fe7" (UID: "10811481-4b40-49ae-9d75-03f0c0e02fe7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.180203 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "10811481-4b40-49ae-9d75-03f0c0e02fe7" (UID: "10811481-4b40-49ae-9d75-03f0c0e02fe7"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.180253 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "10811481-4b40-49ae-9d75-03f0c0e02fe7" (UID: "10811481-4b40-49ae-9d75-03f0c0e02fe7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.181180 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10811481-4b40-49ae-9d75-03f0c0e02fe7-scripts" (OuterVolumeSpecName: "scripts") pod "10811481-4b40-49ae-9d75-03f0c0e02fe7" (UID: "10811481-4b40-49ae-9d75-03f0c0e02fe7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.205371 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10811481-4b40-49ae-9d75-03f0c0e02fe7-kube-api-access-z9jf8" (OuterVolumeSpecName: "kube-api-access-z9jf8") pod "10811481-4b40-49ae-9d75-03f0c0e02fe7" (UID: "10811481-4b40-49ae-9d75-03f0c0e02fe7"). InnerVolumeSpecName "kube-api-access-z9jf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.245079 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10811481-4b40-49ae-9d75-03f0c0e02fe7" (UID: "10811481-4b40-49ae-9d75-03f0c0e02fe7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.274273 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "10811481-4b40-49ae-9d75-03f0c0e02fe7" (UID: "10811481-4b40-49ae-9d75-03f0c0e02fe7"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.280697 4971 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.280722 4971 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.280731 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.280740 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/10811481-4b40-49ae-9d75-03f0c0e02fe7-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.280748 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9jf8\" (UniqueName: \"kubernetes.io/projected/10811481-4b40-49ae-9d75-03f0c0e02fe7-kube-api-access-z9jf8\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.280757 4971 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10811481-4b40-49ae-9d75-03f0c0e02fe7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.280764 4971 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/10811481-4b40-49ae-9d75-03f0c0e02fe7-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.430759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1023ed0c-3abe-4cff-987f-52544b885696","Type":"ContainerDied","Data":"d8878d860f20305c6f32ab33f897cfdab0eeeb47ee5127464f46989bbaccb1d4"} Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.430814 4971 scope.go:117] "RemoveContainer" containerID="51b4cb77e4da22feec9a5960ca52b1a8182d5f4678ea7a93350ed77aaf3336c3" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.430787 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.434306 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5645d5b87f-2lrzm" event={"ID":"cf071dc8-0146-4b1d-a644-02224870fcba","Type":"ContainerDied","Data":"7026d070462602394c954aff94390865eadeb5ccb6fd7fb20c68ed2ee95ce3e6"} Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.434414 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5645d5b87f-2lrzm" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.436261 4971 generic.go:334] "Generic (PLEG): container finished" podID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" containerID="7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac" exitCode=0 Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.436303 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a71f8d2d-729c-4b7c-89c7-a06bd2216978","Type":"ContainerDied","Data":"7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac"} Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.436321 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a71f8d2d-729c-4b7c-89c7-a06bd2216978","Type":"ContainerDied","Data":"432938dfa551e87c30cd59b963f5ea472cc946f2bf3968f87989b23b752d112a"} Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.436360 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.443046 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-pfhnr_10811481-4b40-49ae-9d75-03f0c0e02fe7/ovn-controller/0.log" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.443157 4971 generic.go:334] "Generic (PLEG): container finished" podID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerID="b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a" exitCode=139 Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.443209 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pfhnr" event={"ID":"10811481-4b40-49ae-9d75-03f0c0e02fe7","Type":"ContainerDied","Data":"b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a"} Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.443234 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pfhnr" event={"ID":"10811481-4b40-49ae-9d75-03f0c0e02fe7","Type":"ContainerDied","Data":"7fa10eb638b51a171e2bf9700bb577201953548cbbba1b5705a44c4239502d08"} Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.443281 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pfhnr" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.460890 4971 scope.go:117] "RemoveContainer" containerID="8ea2d6fbd32a2d37d300ac896cec5808faa6aa48fc9bb628562475b560b88284" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.485847 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.493326 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.495780 4971 scope.go:117] "RemoveContainer" containerID="41739891398c4430c4f83f4ce50b2d0a22faf0762c4dc2b95014e0a8cb9f4cb0" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.504815 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.512104 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.524349 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5645d5b87f-2lrzm"] Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.525723 4971 scope.go:117] "RemoveContainer" containerID="7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.542905 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5645d5b87f-2lrzm"] Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.550349 4971 scope.go:117] "RemoveContainer" containerID="6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.550378 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pfhnr"] Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.558426 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-controller-pfhnr"] Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.583735 4971 scope.go:117] "RemoveContainer" containerID="7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac" Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.584225 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac\": container with ID starting with 7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac not found: ID does not exist" containerID="7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.584264 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac"} err="failed to get container status \"7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac\": rpc error: code = NotFound desc = could not find container \"7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac\": container with ID starting with 7461019c6212376a36119aa5818e0716d8509d127a418e899d1c9160f4c383ac not found: ID does not exist" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.584293 4971 scope.go:117] "RemoveContainer" containerID="6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1" Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.584638 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1\": container with ID starting with 6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1 not found: ID does not exist" containerID="6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.584701 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1"} err="failed to get container status \"6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1\": rpc error: code = NotFound desc = could not find container \"6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1\": container with ID starting with 6133975f1c796980d588347c53a88fbaff173562350d27203ea7112376127cb1 not found: ID does not exist" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.584741 4971 scope.go:117] "RemoveContainer" containerID="b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.608325 4971 scope.go:117] "RemoveContainer" containerID="b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a" Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.609085 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a\": container with ID starting with b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a not found: ID does not exist" containerID="b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.609163 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a"} err="failed to get container status \"b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a\": rpc error: code = NotFound desc = could not find container \"b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a\": container with ID starting with b31c413c5d26b07382c7a1437a8a9e34e970b11922b67a53dd84ae365d6be81a not found: ID does not exist" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 
07:15:42.742396 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1023ed0c-3abe-4cff-987f-52544b885696" path="/var/lib/kubelet/pods/1023ed0c-3abe-4cff-987f-52544b885696/volumes" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.742986 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" path="/var/lib/kubelet/pods/10811481-4b40-49ae-9d75-03f0c0e02fe7/volumes" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.743654 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" path="/var/lib/kubelet/pods/9064c65d-26e4-4d53-a1c1-b88f932b76be/volumes" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.744813 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" path="/var/lib/kubelet/pods/a71f8d2d-729c-4b7c-89c7-a06bd2216978/volumes" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.745334 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf071dc8-0146-4b1d-a644-02224870fcba" path="/var/lib/kubelet/pods/cf071dc8-0146-4b1d-a644-02224870fcba/volumes" Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.758021 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.758366 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process 
not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.758750 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.758785 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.759210 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.769160 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.770972 4971 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:42 crc kubenswrapper[4971]: E0320 07:15:42.771086 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:15:42 crc kubenswrapper[4971]: I0320 07:15:42.951443 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.099151 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-scripts\") pod \"1f2c826d-a350-4f4e-8204-ece96439fd0f\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.099325 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-ceilometer-tls-certs\") pod \"1f2c826d-a350-4f4e-8204-ece96439fd0f\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.099409 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-combined-ca-bundle\") pod \"1f2c826d-a350-4f4e-8204-ece96439fd0f\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.099496 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-sg-core-conf-yaml\") pod \"1f2c826d-a350-4f4e-8204-ece96439fd0f\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.099551 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-run-httpd\") pod \"1f2c826d-a350-4f4e-8204-ece96439fd0f\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.099688 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-log-httpd\") pod \"1f2c826d-a350-4f4e-8204-ece96439fd0f\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.099749 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-config-data\") pod \"1f2c826d-a350-4f4e-8204-ece96439fd0f\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.099816 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggpjs\" (UniqueName: \"kubernetes.io/projected/1f2c826d-a350-4f4e-8204-ece96439fd0f-kube-api-access-ggpjs\") pod \"1f2c826d-a350-4f4e-8204-ece96439fd0f\" (UID: \"1f2c826d-a350-4f4e-8204-ece96439fd0f\") " Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.100318 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1f2c826d-a350-4f4e-8204-ece96439fd0f" (UID: 
"1f2c826d-a350-4f4e-8204-ece96439fd0f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.100356 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1f2c826d-a350-4f4e-8204-ece96439fd0f" (UID: "1f2c826d-a350-4f4e-8204-ece96439fd0f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.113099 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-scripts" (OuterVolumeSpecName: "scripts") pod "1f2c826d-a350-4f4e-8204-ece96439fd0f" (UID: "1f2c826d-a350-4f4e-8204-ece96439fd0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.113971 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2c826d-a350-4f4e-8204-ece96439fd0f-kube-api-access-ggpjs" (OuterVolumeSpecName: "kube-api-access-ggpjs") pod "1f2c826d-a350-4f4e-8204-ece96439fd0f" (UID: "1f2c826d-a350-4f4e-8204-ece96439fd0f"). InnerVolumeSpecName "kube-api-access-ggpjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.131713 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1f2c826d-a350-4f4e-8204-ece96439fd0f" (UID: "1f2c826d-a350-4f4e-8204-ece96439fd0f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.153084 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1f2c826d-a350-4f4e-8204-ece96439fd0f" (UID: "1f2c826d-a350-4f4e-8204-ece96439fd0f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.171742 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f2c826d-a350-4f4e-8204-ece96439fd0f" (UID: "1f2c826d-a350-4f4e-8204-ece96439fd0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.188454 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-config-data" (OuterVolumeSpecName: "config-data") pod "1f2c826d-a350-4f4e-8204-ece96439fd0f" (UID: "1f2c826d-a350-4f4e-8204-ece96439fd0f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.207068 4971 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.207113 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.207128 4971 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.207141 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.207158 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f2c826d-a350-4f4e-8204-ece96439fd0f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.207172 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.207184 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggpjs\" (UniqueName: \"kubernetes.io/projected/1f2c826d-a350-4f4e-8204-ece96439fd0f-kube-api-access-ggpjs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.207197 4971 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f2c826d-a350-4f4e-8204-ece96439fd0f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.459011 4971 generic.go:334] "Generic (PLEG): container finished" podID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerID="45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba" exitCode=0 Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.459102 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerDied","Data":"45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba"} Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.459132 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f2c826d-a350-4f4e-8204-ece96439fd0f","Type":"ContainerDied","Data":"992034b7645ab84c244a298116bc0d2391736c7b7dc3923d33394dd1c8a6bd68"} Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.459130 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.459171 4971 scope.go:117] "RemoveContainer" containerID="ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.495971 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.503446 4971 scope.go:117] "RemoveContainer" containerID="267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.509266 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.527468 4971 scope.go:117] "RemoveContainer" containerID="45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.548688 4971 scope.go:117] "RemoveContainer" containerID="a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.565155 4971 scope.go:117] "RemoveContainer" containerID="ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74" Mar 20 07:15:43 crc kubenswrapper[4971]: E0320 07:15:43.565655 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74\": container with ID starting with ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74 not found: ID does not exist" containerID="ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.565688 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74"} err="failed to get container status 
\"ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74\": rpc error: code = NotFound desc = could not find container \"ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74\": container with ID starting with ced131f1672d11ac0e7aa5eeb11d73ff62b3d9eb8a37be9454542ba813df1b74 not found: ID does not exist" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.565711 4971 scope.go:117] "RemoveContainer" containerID="267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4" Mar 20 07:15:43 crc kubenswrapper[4971]: E0320 07:15:43.566098 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4\": container with ID starting with 267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4 not found: ID does not exist" containerID="267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.566141 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4"} err="failed to get container status \"267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4\": rpc error: code = NotFound desc = could not find container \"267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4\": container with ID starting with 267bb15721c4418b84c312cbb7548cbdbafaea164a0e6b80c787f7e652083ca4 not found: ID does not exist" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.566155 4971 scope.go:117] "RemoveContainer" containerID="45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba" Mar 20 07:15:43 crc kubenswrapper[4971]: E0320 07:15:43.566383 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba\": container with ID starting with 45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba not found: ID does not exist" containerID="45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.566399 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba"} err="failed to get container status \"45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba\": rpc error: code = NotFound desc = could not find container \"45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba\": container with ID starting with 45622b13907e1c82b36424cb8a72a5956e99d20fc8608d9925eb324547ffecba not found: ID does not exist" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.566411 4971 scope.go:117] "RemoveContainer" containerID="a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f" Mar 20 07:15:43 crc kubenswrapper[4971]: E0320 07:15:43.566802 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f\": container with ID starting with a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f not found: ID does not exist" containerID="a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f" Mar 20 07:15:43 crc kubenswrapper[4971]: I0320 07:15:43.566830 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f"} err="failed to get container status \"a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f\": rpc error: code = NotFound desc = could not find container \"a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f\": container with ID 
starting with a90100541651d8dcfa27f96fcebde4dd6642efc039317503cc423cb412dc235f not found: ID does not exist" Mar 20 07:15:44 crc kubenswrapper[4971]: I0320 07:15:44.742967 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" path="/var/lib/kubelet/pods/1f2c826d-a350-4f4e-8204-ece96439fd0f/volumes" Mar 20 07:15:47 crc kubenswrapper[4971]: I0320 07:15:47.167727 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5d45446b77-6x8tp" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.174:9696/\": dial tcp 10.217.0.174:9696: connect: connection refused" Mar 20 07:15:47 crc kubenswrapper[4971]: E0320 07:15:47.757964 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:47 crc kubenswrapper[4971]: E0320 07:15:47.758382 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:47 crc kubenswrapper[4971]: E0320 07:15:47.758692 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not 
found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:47 crc kubenswrapper[4971]: E0320 07:15:47.758742 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:15:47 crc kubenswrapper[4971]: E0320 07:15:47.759972 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:47 crc kubenswrapper[4971]: E0320 07:15:47.761289 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:47 crc kubenswrapper[4971]: E0320 07:15:47.762787 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:47 crc kubenswrapper[4971]: E0320 07:15:47.762858 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container 
is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:15:48 crc kubenswrapper[4971]: E0320 07:15:48.712561 4971 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 07:15:48 crc kubenswrapper[4971]: E0320 07:15:48.712959 4971 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 07:15:48 crc kubenswrapper[4971]: E0320 07:15:48.712974 4971 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:15:48 crc kubenswrapper[4971]: E0320 07:15:48.712989 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:48 crc kubenswrapper[4971]: E0320 07:15:48.713065 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift podName:9f0608b4-f344-45db-a952-d6bc328083a2 nodeName:}" failed. No retries permitted until 2026-03-20 07:16:04.713042231 +0000 UTC m=+1586.692916379 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift") pod "swift-storage-0" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 07:15:50 crc kubenswrapper[4971]: I0320 07:15:50.162844 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:15:50 crc kubenswrapper[4971]: I0320 07:15:50.162925 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:15:52 crc kubenswrapper[4971]: E0320 07:15:52.758398 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:52 crc kubenswrapper[4971]: E0320 07:15:52.758750 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:52 crc kubenswrapper[4971]: 
E0320 07:15:52.760451 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:52 crc kubenswrapper[4971]: E0320 07:15:52.761001 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:52 crc kubenswrapper[4971]: E0320 07:15:52.761194 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:15:52 crc kubenswrapper[4971]: E0320 07:15:52.761556 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:52 crc kubenswrapper[4971]: E0320 07:15:52.763364 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:52 crc kubenswrapper[4971]: E0320 07:15:52.763434 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:15:56 crc kubenswrapper[4971]: I0320 07:15:56.642352 4971 generic.go:334] "Generic (PLEG): container finished" podID="a83b09bb-e1ad-4b10-9b14-538998197621" containerID="35447db00a76cd29e230251c5953d04cbe80e78a51340cb69f31c155161f5b6a" exitCode=0 Mar 20 07:15:56 crc kubenswrapper[4971]: I0320 07:15:56.642410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d45446b77-6x8tp" event={"ID":"a83b09bb-e1ad-4b10-9b14-538998197621","Type":"ContainerDied","Data":"35447db00a76cd29e230251c5953d04cbe80e78a51340cb69f31c155161f5b6a"} Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.186862 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.301761 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-combined-ca-bundle\") pod \"a83b09bb-e1ad-4b10-9b14-538998197621\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.301891 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-internal-tls-certs\") pod \"a83b09bb-e1ad-4b10-9b14-538998197621\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.301991 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-public-tls-certs\") pod \"a83b09bb-e1ad-4b10-9b14-538998197621\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.302024 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-ovndb-tls-certs\") pod \"a83b09bb-e1ad-4b10-9b14-538998197621\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.302109 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-httpd-config\") pod \"a83b09bb-e1ad-4b10-9b14-538998197621\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.302165 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x2mcb\" (UniqueName: \"kubernetes.io/projected/a83b09bb-e1ad-4b10-9b14-538998197621-kube-api-access-x2mcb\") pod \"a83b09bb-e1ad-4b10-9b14-538998197621\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.302262 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-config\") pod \"a83b09bb-e1ad-4b10-9b14-538998197621\" (UID: \"a83b09bb-e1ad-4b10-9b14-538998197621\") " Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.308138 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a83b09bb-e1ad-4b10-9b14-538998197621" (UID: "a83b09bb-e1ad-4b10-9b14-538998197621"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.308551 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83b09bb-e1ad-4b10-9b14-538998197621-kube-api-access-x2mcb" (OuterVolumeSpecName: "kube-api-access-x2mcb") pod "a83b09bb-e1ad-4b10-9b14-538998197621" (UID: "a83b09bb-e1ad-4b10-9b14-538998197621"). InnerVolumeSpecName "kube-api-access-x2mcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.346197 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a83b09bb-e1ad-4b10-9b14-538998197621" (UID: "a83b09bb-e1ad-4b10-9b14-538998197621"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.356137 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-config" (OuterVolumeSpecName: "config") pod "a83b09bb-e1ad-4b10-9b14-538998197621" (UID: "a83b09bb-e1ad-4b10-9b14-538998197621"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.366283 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a83b09bb-e1ad-4b10-9b14-538998197621" (UID: "a83b09bb-e1ad-4b10-9b14-538998197621"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.367000 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a83b09bb-e1ad-4b10-9b14-538998197621" (UID: "a83b09bb-e1ad-4b10-9b14-538998197621"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.370438 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a83b09bb-e1ad-4b10-9b14-538998197621" (UID: "a83b09bb-e1ad-4b10-9b14-538998197621"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.403678 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.403710 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2mcb\" (UniqueName: \"kubernetes.io/projected/a83b09bb-e1ad-4b10-9b14-538998197621-kube-api-access-x2mcb\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.403721 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.403731 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.403740 4971 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.403748 4971 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.403755 4971 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a83b09bb-e1ad-4b10-9b14-538998197621-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.654972 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d45446b77-6x8tp" event={"ID":"a83b09bb-e1ad-4b10-9b14-538998197621","Type":"ContainerDied","Data":"808062fa5fea3183d861db5a10e3106b35f589e9d19e52fd4f9a2dd708fa842c"} Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.655099 4971 scope.go:117] "RemoveContainer" containerID="de8268f31ad44f7855e9d376812885c5bc010017a21fab351f990ce733d64a51" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.655101 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d45446b77-6x8tp" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.701660 4971 scope.go:117] "RemoveContainer" containerID="35447db00a76cd29e230251c5953d04cbe80e78a51340cb69f31c155161f5b6a" Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.710236 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d45446b77-6x8tp"] Mar 20 07:15:57 crc kubenswrapper[4971]: I0320 07:15:57.717414 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d45446b77-6x8tp"] Mar 20 07:15:57 crc kubenswrapper[4971]: E0320 07:15:57.758302 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:57 crc kubenswrapper[4971]: E0320 07:15:57.759386 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:57 crc kubenswrapper[4971]: E0320 07:15:57.760193 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:57 crc kubenswrapper[4971]: E0320 07:15:57.760222 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:15:57 crc kubenswrapper[4971]: E0320 07:15:57.760469 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:15:57 crc kubenswrapper[4971]: E0320 07:15:57.761671 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:57 crc kubenswrapper[4971]: E0320 07:15:57.763325 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:15:57 crc kubenswrapper[4971]: E0320 07:15:57.763452 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:15:58 crc kubenswrapper[4971]: I0320 07:15:58.745156 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" path="/var/lib/kubelet/pods/a83b09bb-e1ad-4b10-9b14-538998197621/volumes" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147021 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566516-8vx7v"] Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147475 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="openstack-network-exporter" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147499 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="openstack-network-exporter" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147522 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" containerName="memcached" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147535 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" containerName="memcached" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147555 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" 
containerName="glance-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147568 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerName="glance-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147592 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="sg-core" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147636 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="sg-core" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147650 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf071dc8-0146-4b1d-a644-02224870fcba" containerName="keystone-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147662 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf071dc8-0146-4b1d-a644-02224870fcba" containerName="keystone-api" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147681 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147696 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147719 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerName="glance-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147732 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerName="glance-log" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147754 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-api" Mar 20 07:16:00 crc 
kubenswrapper[4971]: I0320 07:16:00.147766 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-api" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147792 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="proxy-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147803 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="proxy-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147827 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" containerName="mysql-bootstrap" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147838 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" containerName="mysql-bootstrap" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147852 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1023ed0c-3abe-4cff-987f-52544b885696" containerName="rabbitmq" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147863 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1023ed0c-3abe-4cff-987f-52544b885696" containerName="rabbitmq" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147877 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" containerName="barbican-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147888 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" containerName="barbican-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147908 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerName="glance-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.147919 
4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerName="glance-log" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.147938 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11d9f93-43db-45cf-9cba-75d0c9b50d55" containerName="kube-state-metrics" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148309 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11d9f93-43db-45cf-9cba-75d0c9b50d55" containerName="kube-state-metrics" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148336 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe7358b-84ed-4e08-8c12-234928beef49" containerName="nova-cell0-conductor-conductor" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148348 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe7358b-84ed-4e08-8c12-234928beef49" containerName="nova-cell0-conductor-conductor" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148367 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9450f53-fcf0-4799-a8da-a9d0bc7016ac" containerName="nova-cell1-conductor-conductor" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148379 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9450f53-fcf0-4799-a8da-a9d0bc7016ac" containerName="nova-cell1-conductor-conductor" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148399 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerName="glance-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148411 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerName="glance-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148436 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-log" Mar 20 07:16:00 crc 
kubenswrapper[4971]: I0320 07:16:00.148452 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-log" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148479 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-metadata" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148495 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-metadata" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148523 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerName="cinder-scheduler" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148549 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerName="cinder-scheduler" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148564 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148575 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148596 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148636 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148659 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="ceilometer-central-agent" Mar 20 07:16:00 crc 
kubenswrapper[4971]: I0320 07:16:00.148673 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="ceilometer-central-agent" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148705 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="ceilometer-notification-agent" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148717 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="ceilometer-notification-agent" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.148740 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" containerName="nova-scheduler-scheduler" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.148756 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" containerName="nova-scheduler-scheduler" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.149731 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerName="ovn-controller" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.149764 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerName="ovn-controller" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.149796 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="ovn-northd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.149809 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="ovn-northd" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.149839 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" 
containerName="setup-container" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.149851 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" containerName="setup-container" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.149871 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerName="placement-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.149884 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerName="placement-log" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.149902 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" containerName="barbican-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.149915 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" containerName="barbican-api" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.149934 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" containerName="rabbitmq" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.149948 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" containerName="rabbitmq" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.149969 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.149984 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.150016 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" containerName="galera" Mar 20 07:16:00 
crc kubenswrapper[4971]: I0320 07:16:00.150031 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" containerName="galera" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.150058 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerName="placement-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150080 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerName="placement-api" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.150104 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1023ed0c-3abe-4cff-987f-52544b885696" containerName="setup-container" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150117 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1023ed0c-3abe-4cff-987f-52544b885696" containerName="setup-container" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.150135 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerName="probe" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150148 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerName="probe" Mar 20 07:16:00 crc kubenswrapper[4971]: E0320 07:16:00.150163 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150175 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150480 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe7358b-84ed-4e08-8c12-234928beef49" containerName="nova-cell0-conductor-conductor" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 
07:16:00.150515 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerName="probe" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150539 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150558 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerName="placement-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150587 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="ovn-northd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150645 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerName="glance-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150675 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c37921-0f32-4a99-bf40-e960c400a9ac" containerName="glance-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150700 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150721 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150739 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="proxy-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150752 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-metadata" Mar 20 07:16:00 crc kubenswrapper[4971]: 
I0320 07:16:00.150776 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerName="glance-httpd" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150793 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83b09bb-e1ad-4b10-9b14-538998197621" containerName="neutron-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150806 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="sg-core" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150827 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf071dc8-0146-4b1d-a644-02224870fcba" containerName="keystone-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150842 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" containerName="barbican-api" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150856 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="10811481-4b40-49ae-9d75-03f0c0e02fe7" containerName="ovn-controller" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150880 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1023ed0c-3abe-4cff-987f-52544b885696" containerName="rabbitmq" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150896 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9450f53-fcf0-4799-a8da-a9d0bc7016ac" containerName="nova-cell1-conductor-conductor" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150911 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7136000-46aa-4fe0-acaf-5ad1ed94e4b3" containerName="nova-scheduler-scheduler" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150926 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b125a7-da7e-4ab1-b594-2f45f0b5c529" containerName="cinder-api" Mar 20 07:16:00 crc 
kubenswrapper[4971]: I0320 07:16:00.150950 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2656777c-a362-4e3e-92aa-98a8f9d2cf34" containerName="cinder-scheduler" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150968 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb32512-db70-4db8-a867-0bda48f318eb" containerName="barbican-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.150992 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" containerName="ceilometer-central-agent" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151006 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea363773-c4ed-4a4c-b0d6-e6d9eb5e1d89" containerName="memcached" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151023 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a71f8d2d-729c-4b7c-89c7-a06bd2216978" containerName="rabbitmq" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151038 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e98717ba-3768-4817-a29a-09114e106105" containerName="nova-api-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151055 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5e07d9-fdd3-4c2d-b462-5e5bcd618e5a" containerName="nova-metadata-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151071 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e641b3d5-ee18-41b6-b38c-098564591a1b" containerName="placement-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151086 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9064c65d-26e4-4d53-a1c1-b88f932b76be" containerName="openstack-network-exporter" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151102 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2c826d-a350-4f4e-8204-ece96439fd0f" 
containerName="ceilometer-notification-agent" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151123 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a5a04c-039c-4ef2-a020-1e5c16fca2b6" containerName="galera" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151139 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11d9f93-43db-45cf-9cba-75d0c9b50d55" containerName="kube-state-metrics" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151160 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa332536-984c-4942-b1fb-1ffde7eb465d" containerName="glance-log" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.151892 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.154773 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.155109 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.156247 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.162330 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566516-8vx7v"] Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.252245 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwnzc\" (UniqueName: \"kubernetes.io/projected/a0e1b3f7-6850-4b2e-b692-ab96d597b3ed-kube-api-access-kwnzc\") pod \"auto-csr-approver-29566516-8vx7v\" (UID: \"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed\") " pod="openshift-infra/auto-csr-approver-29566516-8vx7v" Mar 20 07:16:00 crc 
kubenswrapper[4971]: I0320 07:16:00.354698 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwnzc\" (UniqueName: \"kubernetes.io/projected/a0e1b3f7-6850-4b2e-b692-ab96d597b3ed-kube-api-access-kwnzc\") pod \"auto-csr-approver-29566516-8vx7v\" (UID: \"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed\") " pod="openshift-infra/auto-csr-approver-29566516-8vx7v" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.385407 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwnzc\" (UniqueName: \"kubernetes.io/projected/a0e1b3f7-6850-4b2e-b692-ab96d597b3ed-kube-api-access-kwnzc\") pod \"auto-csr-approver-29566516-8vx7v\" (UID: \"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed\") " pod="openshift-infra/auto-csr-approver-29566516-8vx7v" Mar 20 07:16:00 crc kubenswrapper[4971]: I0320 07:16:00.484790 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" Mar 20 07:16:01 crc kubenswrapper[4971]: I0320 07:16:01.024577 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566516-8vx7v"] Mar 20 07:16:01 crc kubenswrapper[4971]: I0320 07:16:01.036435 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:16:01 crc kubenswrapper[4971]: I0320 07:16:01.698526 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" event={"ID":"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed","Type":"ContainerStarted","Data":"d8d5e3cf5eadbbcfd5c9fc654422cc643d442a6f4ea6f293ebc4b0d7dfe9613b"} Mar 20 07:16:02 crc kubenswrapper[4971]: I0320 07:16:02.724250 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" event={"ID":"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed","Type":"ContainerStarted","Data":"7831d892345d390eff12e03d1886c71737d9b5e1b0545506bc2d92be4cbe6494"} Mar 20 
07:16:02 crc kubenswrapper[4971]: I0320 07:16:02.749401 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" podStartSLOduration=1.590684 podStartE2EDuration="2.749372884s" podCreationTimestamp="2026-03-20 07:16:00 +0000 UTC" firstStartedPulling="2026-03-20 07:16:01.036236668 +0000 UTC m=+1583.016110806" lastFinishedPulling="2026-03-20 07:16:02.194925552 +0000 UTC m=+1584.174799690" observedRunningTime="2026-03-20 07:16:02.74227718 +0000 UTC m=+1584.722151358" watchObservedRunningTime="2026-03-20 07:16:02.749372884 +0000 UTC m=+1584.729247072" Mar 20 07:16:02 crc kubenswrapper[4971]: E0320 07:16:02.760561 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:16:02 crc kubenswrapper[4971]: E0320 07:16:02.761191 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:16:02 crc kubenswrapper[4971]: E0320 07:16:02.761533 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 07:16:02 crc kubenswrapper[4971]: E0320 07:16:02.761557 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:16:02 crc kubenswrapper[4971]: E0320 07:16:02.762466 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:16:02 crc kubenswrapper[4971]: E0320 07:16:02.764319 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:16:02 crc kubenswrapper[4971]: E0320 07:16:02.766200 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 07:16:02 crc kubenswrapper[4971]: E0320 07:16:02.766258 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-njjzt" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:16:03 crc kubenswrapper[4971]: I0320 07:16:03.737061 4971 generic.go:334] "Generic (PLEG): container finished" podID="a0e1b3f7-6850-4b2e-b692-ab96d597b3ed" containerID="7831d892345d390eff12e03d1886c71737d9b5e1b0545506bc2d92be4cbe6494" exitCode=0 Mar 20 07:16:03 crc kubenswrapper[4971]: I0320 07:16:03.737126 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" event={"ID":"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed","Type":"ContainerDied","Data":"7831d892345d390eff12e03d1886c71737d9b5e1b0545506bc2d92be4cbe6494"} Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.448185 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-njjzt_cabe7ab2-f272-4434-8295-641c334a0ae2/ovs-vswitchd/0.log" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.449433 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.542110 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-lib\") pod \"cabe7ab2-f272-4434-8295-641c334a0ae2\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.542183 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xvnq\" (UniqueName: \"kubernetes.io/projected/cabe7ab2-f272-4434-8295-641c334a0ae2-kube-api-access-4xvnq\") pod \"cabe7ab2-f272-4434-8295-641c334a0ae2\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.542220 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-log\") pod \"cabe7ab2-f272-4434-8295-641c334a0ae2\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.542341 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cabe7ab2-f272-4434-8295-641c334a0ae2-scripts\") pod \"cabe7ab2-f272-4434-8295-641c334a0ae2\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.544545 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-etc-ovs\") pod \"cabe7ab2-f272-4434-8295-641c334a0ae2\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.542250 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-lib" (OuterVolumeSpecName: "var-lib") pod "cabe7ab2-f272-4434-8295-641c334a0ae2" (UID: "cabe7ab2-f272-4434-8295-641c334a0ae2"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.542272 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-log" (OuterVolumeSpecName: "var-log") pod "cabe7ab2-f272-4434-8295-641c334a0ae2" (UID: "cabe7ab2-f272-4434-8295-641c334a0ae2"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.544477 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabe7ab2-f272-4434-8295-641c334a0ae2-scripts" (OuterVolumeSpecName: "scripts") pod "cabe7ab2-f272-4434-8295-641c334a0ae2" (UID: "cabe7ab2-f272-4434-8295-641c334a0ae2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.544646 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-run\") pod \"cabe7ab2-f272-4434-8295-641c334a0ae2\" (UID: \"cabe7ab2-f272-4434-8295-641c334a0ae2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.544771 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-run" (OuterVolumeSpecName: "var-run") pod "cabe7ab2-f272-4434-8295-641c334a0ae2" (UID: "cabe7ab2-f272-4434-8295-641c334a0ae2"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.544764 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "cabe7ab2-f272-4434-8295-641c334a0ae2" (UID: "cabe7ab2-f272-4434-8295-641c334a0ae2"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.545285 4971 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-lib\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.545298 4971 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.545308 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cabe7ab2-f272-4434-8295-641c334a0ae2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.545318 4971 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.545326 4971 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cabe7ab2-f272-4434-8295-641c334a0ae2-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.548194 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabe7ab2-f272-4434-8295-641c334a0ae2-kube-api-access-4xvnq" (OuterVolumeSpecName: 
"kube-api-access-4xvnq") pod "cabe7ab2-f272-4434-8295-641c334a0ae2" (UID: "cabe7ab2-f272-4434-8295-641c334a0ae2"). InnerVolumeSpecName "kube-api-access-4xvnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.583795 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.670834 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xvnq\" (UniqueName: \"kubernetes.io/projected/cabe7ab2-f272-4434-8295-641c334a0ae2-kube-api-access-4xvnq\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.751168 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-njjzt_cabe7ab2-f272-4434-8295-641c334a0ae2/ovs-vswitchd/0.log" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.754811 4971 generic.go:334] "Generic (PLEG): container finished" podID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" exitCode=137 Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.754873 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-njjzt" event={"ID":"cabe7ab2-f272-4434-8295-641c334a0ae2","Type":"ContainerDied","Data":"68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276"} Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.754898 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-njjzt" event={"ID":"cabe7ab2-f272-4434-8295-641c334a0ae2","Type":"ContainerDied","Data":"fbee2fbecd5c87d302299b54299393cd44b0f6afc1e354555b36b6cdae864349"} Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.754916 4971 scope.go:117] "RemoveContainer" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 
07:16:04.754929 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-njjzt" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.771289 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-lock\") pod \"9f0608b4-f344-45db-a952-d6bc328083a2\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.771328 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d77b\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-kube-api-access-5d77b\") pod \"9f0608b4-f344-45db-a952-d6bc328083a2\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.771357 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0608b4-f344-45db-a952-d6bc328083a2-combined-ca-bundle\") pod \"9f0608b4-f344-45db-a952-d6bc328083a2\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.771396 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") pod \"9f0608b4-f344-45db-a952-d6bc328083a2\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.771488 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-cache\") pod \"9f0608b4-f344-45db-a952-d6bc328083a2\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.771519 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9f0608b4-f344-45db-a952-d6bc328083a2\" (UID: \"9f0608b4-f344-45db-a952-d6bc328083a2\") " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.771741 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-lock" (OuterVolumeSpecName: "lock") pod "9f0608b4-f344-45db-a952-d6bc328083a2" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.772154 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-cache" (OuterVolumeSpecName: "cache") pod "9f0608b4-f344-45db-a952-d6bc328083a2" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.772438 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f0608b4-f344-45db-a952-d6bc328083a2" containerID="991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb" exitCode=137 Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.772571 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb"} Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.772590 4971 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-lock\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.772632 4971 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/9f0608b4-f344-45db-a952-d6bc328083a2-cache\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.772667 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f0608b4-f344-45db-a952-d6bc328083a2","Type":"ContainerDied","Data":"30b6b4cf18c37a2dc05229ec190ae64ac584dd215f06e0f13bdf0928091f2fcc"} Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.772694 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.774964 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "9f0608b4-f344-45db-a952-d6bc328083a2" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.776495 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-kube-api-access-5d77b" (OuterVolumeSpecName: "kube-api-access-5d77b") pod "9f0608b4-f344-45db-a952-d6bc328083a2" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2"). InnerVolumeSpecName "kube-api-access-5d77b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.776860 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9f0608b4-f344-45db-a952-d6bc328083a2" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.807568 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-njjzt"] Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.821121 4971 scope.go:117] "RemoveContainer" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.821652 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-njjzt"] Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.875897 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d77b\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-kube-api-access-5d77b\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.878075 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f0608b4-f344-45db-a952-d6bc328083a2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.878159 4971 scope.go:117] "RemoveContainer" containerID="d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.878772 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.919147 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.938065 4971 scope.go:117] "RemoveContainer" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" Mar 20 07:16:04 crc kubenswrapper[4971]: E0320 07:16:04.948818 
4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276\": container with ID starting with 68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276 not found: ID does not exist" containerID="68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.948860 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276"} err="failed to get container status \"68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276\": rpc error: code = NotFound desc = could not find container \"68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276\": container with ID starting with 68e3c1f10a1b965deae97b10a94045279a9a39e1de7f0b30a1acdde8f91cf276 not found: ID does not exist" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.948916 4971 scope.go:117] "RemoveContainer" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" Mar 20 07:16:04 crc kubenswrapper[4971]: E0320 07:16:04.949349 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6\": container with ID starting with 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 not found: ID does not exist" containerID="0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.949371 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6"} err="failed to get container status \"0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6\": rpc error: code = 
NotFound desc = could not find container \"0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6\": container with ID starting with 0e05bef3a26f5139b1217b0d668bd7559845501941c593fdd4f111a356558ab6 not found: ID does not exist" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.949383 4971 scope.go:117] "RemoveContainer" containerID="d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52" Mar 20 07:16:04 crc kubenswrapper[4971]: E0320 07:16:04.949673 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52\": container with ID starting with d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52 not found: ID does not exist" containerID="d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.949690 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52"} err="failed to get container status \"d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52\": rpc error: code = NotFound desc = could not find container \"d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52\": container with ID starting with d4a843ddc450022e968540cf332a8f3bf76fb39cae689bcaa48300de5559bd52 not found: ID does not exist" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.949703 4971 scope.go:117] "RemoveContainer" containerID="991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.980438 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:04 crc kubenswrapper[4971]: I0320 07:16:04.992441 4971 scope.go:117] "RemoveContainer" 
containerID="208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.056165 4971 scope.go:117] "RemoveContainer" containerID="1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.058103 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.080842 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwnzc\" (UniqueName: \"kubernetes.io/projected/a0e1b3f7-6850-4b2e-b692-ab96d597b3ed-kube-api-access-kwnzc\") pod \"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed\" (UID: \"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed\") " Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.083082 4971 scope.go:117] "RemoveContainer" containerID="956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.084173 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e1b3f7-6850-4b2e-b692-ab96d597b3ed-kube-api-access-kwnzc" (OuterVolumeSpecName: "kube-api-access-kwnzc") pod "a0e1b3f7-6850-4b2e-b692-ab96d597b3ed" (UID: "a0e1b3f7-6850-4b2e-b692-ab96d597b3ed"). InnerVolumeSpecName "kube-api-access-kwnzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.107429 4971 scope.go:117] "RemoveContainer" containerID="86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.107508 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0608b4-f344-45db-a952-d6bc328083a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f0608b4-f344-45db-a952-d6bc328083a2" (UID: "9f0608b4-f344-45db-a952-d6bc328083a2"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.127829 4971 scope.go:117] "RemoveContainer" containerID="5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.147061 4971 scope.go:117] "RemoveContainer" containerID="c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.170107 4971 scope.go:117] "RemoveContainer" containerID="6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.182424 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0608b4-f344-45db-a952-d6bc328083a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.182452 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwnzc\" (UniqueName: \"kubernetes.io/projected/a0e1b3f7-6850-4b2e-b692-ab96d597b3ed-kube-api-access-kwnzc\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.191271 4971 scope.go:117] "RemoveContainer" containerID="3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.213508 4971 scope.go:117] "RemoveContainer" containerID="606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.235745 4971 scope.go:117] "RemoveContainer" containerID="f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.258129 4971 scope.go:117] "RemoveContainer" containerID="d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.283378 4971 scope.go:117] 
"RemoveContainer" containerID="0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.310347 4971 scope.go:117] "RemoveContainer" containerID="7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.348015 4971 scope.go:117] "RemoveContainer" containerID="1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.367089 4971 scope.go:117] "RemoveContainer" containerID="991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.367728 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb\": container with ID starting with 991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb not found: ID does not exist" containerID="991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.367764 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb"} err="failed to get container status \"991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb\": rpc error: code = NotFound desc = could not find container \"991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb\": container with ID starting with 991e136e3d1419ae52ab4b98312ecd62fb645e90cc4433d51eaf06d9da2b9acb not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.367788 4971 scope.go:117] "RemoveContainer" containerID="208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.368226 4971 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a\": container with ID starting with 208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a not found: ID does not exist" containerID="208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.368280 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a"} err="failed to get container status \"208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a\": rpc error: code = NotFound desc = could not find container \"208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a\": container with ID starting with 208cc47279ede4fc675dd95db356e62187d88bcc7f60e2e2862a2771c8f1ac3a not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.368309 4971 scope.go:117] "RemoveContainer" containerID="1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.368731 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986\": container with ID starting with 1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986 not found: ID does not exist" containerID="1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.368766 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986"} err="failed to get container status \"1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986\": rpc error: code = NotFound desc = could not find container 
\"1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986\": container with ID starting with 1cca389d9342f12475322096ca826c0aec4ad5debfe45d027be7baa037a77986 not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.368784 4971 scope.go:117] "RemoveContainer" containerID="956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.369060 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b\": container with ID starting with 956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b not found: ID does not exist" containerID="956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.369085 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b"} err="failed to get container status \"956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b\": rpc error: code = NotFound desc = could not find container \"956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b\": container with ID starting with 956cfacd43945d37dd879d68eb8604413a9b059e3c4840c7381849fcbde53b7b not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.369128 4971 scope.go:117] "RemoveContainer" containerID="86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.369432 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487\": container with ID starting with 86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487 not found: ID does not exist" 
containerID="86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.369459 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487"} err="failed to get container status \"86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487\": rpc error: code = NotFound desc = could not find container \"86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487\": container with ID starting with 86e323d45ff8a18ef238cc1dc929c3203fdb02bcad78b9221d6f2ec70e572487 not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.369475 4971 scope.go:117] "RemoveContainer" containerID="5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.369902 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84\": container with ID starting with 5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84 not found: ID does not exist" containerID="5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.369959 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84"} err="failed to get container status \"5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84\": rpc error: code = NotFound desc = could not find container \"5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84\": container with ID starting with 5e8654e5ce222132f07315464b39e25a8cab844b1929418e596056aec0200d84 not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.369993 4971 scope.go:117] 
"RemoveContainer" containerID="c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.370396 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516\": container with ID starting with c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516 not found: ID does not exist" containerID="c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.370426 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516"} err="failed to get container status \"c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516\": rpc error: code = NotFound desc = could not find container \"c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516\": container with ID starting with c9c23b9558cdbf263d8320145f4a662b276ea98292c7170bbe1609469c85d516 not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.370444 4971 scope.go:117] "RemoveContainer" containerID="6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.370770 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0\": container with ID starting with 6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0 not found: ID does not exist" containerID="6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.370801 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0"} err="failed to get container status \"6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0\": rpc error: code = NotFound desc = could not find container \"6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0\": container with ID starting with 6ff097de6ee924556cdb6109e47ceb1245409ef90ff32e007c6f3da327384af0 not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.370820 4971 scope.go:117] "RemoveContainer" containerID="3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.371164 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca\": container with ID starting with 3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca not found: ID does not exist" containerID="3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.371195 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca"} err="failed to get container status \"3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca\": rpc error: code = NotFound desc = could not find container \"3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca\": container with ID starting with 3bc926e8abdcab959b8ce2e3d7c3be7c01ec42c3cc5ee00077229e1618e99dca not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.371217 4971 scope.go:117] "RemoveContainer" containerID="606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.371480 4971 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808\": container with ID starting with 606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808 not found: ID does not exist" containerID="606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.371506 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808"} err="failed to get container status \"606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808\": rpc error: code = NotFound desc = could not find container \"606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808\": container with ID starting with 606899b0d5647f6bba62377dc5689cc5a5012b312b0d21b384c0392805a1f808 not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.371519 4971 scope.go:117] "RemoveContainer" containerID="f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.371879 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c\": container with ID starting with f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c not found: ID does not exist" containerID="f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.371909 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c"} err="failed to get container status \"f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c\": rpc error: code = NotFound desc = could not find container 
\"f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c\": container with ID starting with f933aca417afe71a53e9fc981ae7ba972eb5df323cf2c751b7c7592ca963d97c not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.371931 4971 scope.go:117] "RemoveContainer" containerID="d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.372288 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d\": container with ID starting with d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d not found: ID does not exist" containerID="d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.372315 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d"} err="failed to get container status \"d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d\": rpc error: code = NotFound desc = could not find container \"d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d\": container with ID starting with d1197f3d4771bcf093680ea5c7e317e0d1839ab9a28044e6c0af4c8d3d15a45d not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.372330 4971 scope.go:117] "RemoveContainer" containerID="0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.372579 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028\": container with ID starting with 0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028 not found: ID does not exist" 
containerID="0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.372596 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028"} err="failed to get container status \"0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028\": rpc error: code = NotFound desc = could not find container \"0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028\": container with ID starting with 0974124c6c77137e176d1974dea07e511d0b153152c5fc899503abbc3b959028 not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.372686 4971 scope.go:117] "RemoveContainer" containerID="7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.373334 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e\": container with ID starting with 7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e not found: ID does not exist" containerID="7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.373383 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e"} err="failed to get container status \"7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e\": rpc error: code = NotFound desc = could not find container \"7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e\": container with ID starting with 7c9c527925a998de06f623f87a6e63a193cc7cb04eaaee5ab4f3102ab86f462e not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.373400 4971 scope.go:117] 
"RemoveContainer" containerID="1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495" Mar 20 07:16:05 crc kubenswrapper[4971]: E0320 07:16:05.374264 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495\": container with ID starting with 1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495 not found: ID does not exist" containerID="1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.374293 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495"} err="failed to get container status \"1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495\": rpc error: code = NotFound desc = could not find container \"1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495\": container with ID starting with 1a911d36f7d7aa0fb9edc9388eb7caf6d4875f4f78e071a88a18bb2b44086495 not found: ID does not exist" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.425739 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.449837 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.795507 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" event={"ID":"a0e1b3f7-6850-4b2e-b692-ab96d597b3ed","Type":"ContainerDied","Data":"d8d5e3cf5eadbbcfd5c9fc654422cc643d442a6f4ea6f293ebc4b0d7dfe9613b"} Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.795590 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d5e3cf5eadbbcfd5c9fc654422cc643d442a6f4ea6f293ebc4b0d7dfe9613b" 
Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.795653 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566516-8vx7v" Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.835059 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566510-xzmwb"] Mar 20 07:16:05 crc kubenswrapper[4971]: I0320 07:16:05.846169 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566510-xzmwb"] Mar 20 07:16:06 crc kubenswrapper[4971]: I0320 07:16:06.748089 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" path="/var/lib/kubelet/pods/9f0608b4-f344-45db-a952-d6bc328083a2/volumes" Mar 20 07:16:06 crc kubenswrapper[4971]: I0320 07:16:06.753191 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" path="/var/lib/kubelet/pods/cabe7ab2-f272-4434-8295-641c334a0ae2/volumes" Mar 20 07:16:06 crc kubenswrapper[4971]: I0320 07:16:06.754490 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f989670d-5d1a-4fee-a7eb-d6142c0776d7" path="/var/lib/kubelet/pods/f989670d-5d1a-4fee-a7eb-d6142c0776d7/volumes" Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.162623 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.163272 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.163324 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.163998 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.164055 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" gracePeriod=600 Mar 20 07:16:20 crc kubenswrapper[4971]: E0320 07:16:20.290215 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.955760 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" exitCode=0 Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.955782 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d"} Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.955890 4971 scope.go:117] "RemoveContainer" containerID="ae2e826d251bbaee40981cb044e2215f19a7929bd69583c144888da9a8754c15" Mar 20 07:16:20 crc kubenswrapper[4971]: I0320 07:16:20.956361 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:16:20 crc kubenswrapper[4971]: E0320 07:16:20.956555 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:16:32 crc kubenswrapper[4971]: I0320 07:16:32.732029 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:16:32 crc kubenswrapper[4971]: E0320 07:16:32.732922 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:16:47 crc kubenswrapper[4971]: I0320 07:16:47.732145 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:16:47 crc kubenswrapper[4971]: E0320 07:16:47.733117 4971 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:16:51 crc kubenswrapper[4971]: I0320 07:16:51.610688 4971 scope.go:117] "RemoveContainer" containerID="7a8fb84cd3c710f21209e4374bda991cfc693cf132e4175c2497375433af1b4f" Mar 20 07:16:51 crc kubenswrapper[4971]: I0320 07:16:51.651255 4971 scope.go:117] "RemoveContainer" containerID="bad86612b55b70b2e623e5e65fc1e1cf3aba7fdf58c279008475d32b3bc49a53" Mar 20 07:16:58 crc kubenswrapper[4971]: I0320 07:16:58.745267 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:16:58 crc kubenswrapper[4971]: E0320 07:16:58.746543 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:17:10 crc kubenswrapper[4971]: I0320 07:17:10.733773 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:17:10 crc kubenswrapper[4971]: E0320 07:17:10.734935 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:17:23 crc kubenswrapper[4971]: I0320 07:17:23.732244 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:17:23 crc kubenswrapper[4971]: E0320 07:17:23.733276 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.470947 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k99qd"] Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.471815 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.471832 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.471848 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.471856 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.471876 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.471888 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.471901 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-updater" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.471909 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-updater" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.471925 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.471934 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.471944 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-reaper" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.471952 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-reaper" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.471963 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-server" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.471971 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-server" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.471985 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="rsync" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.471993 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="rsync" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472003 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472014 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472080 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-updater" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472090 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-updater" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472103 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server-init" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472111 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server-init" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472139 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="swift-recon-cron" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472148 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="swift-recon-cron" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472163 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-server" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472171 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-server" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472183 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472192 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472208 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472216 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472231 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472239 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472257 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-expirer" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472270 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-expirer" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472283 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-server" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472291 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-server" Mar 20 07:17:34 crc kubenswrapper[4971]: E0320 07:17:34.472299 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e1b3f7-6850-4b2e-b692-ab96d597b3ed" containerName="oc" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472308 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e1b3f7-6850-4b2e-b692-ab96d597b3ed" containerName="oc" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472467 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-server" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472484 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472496 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-server" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472506 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-updater" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472520 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="rsync" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472530 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472538 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-updater" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472548 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="swift-recon-cron" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472560 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472575 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovs-vswitchd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472588 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cabe7ab2-f272-4434-8295-641c334a0ae2" containerName="ovsdb-server" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472597 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-auditor" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472633 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472646 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="object-expirer" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472657 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-reaper" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472668 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="container-server" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472684 4971 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9f0608b4-f344-45db-a952-d6bc328083a2" containerName="account-replicator" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.472698 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e1b3f7-6850-4b2e-b692-ab96d597b3ed" containerName="oc" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.473895 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.521424 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k99qd"] Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.565816 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-catalog-content\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.566114 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrqnw\" (UniqueName: \"kubernetes.io/projected/a55e05d9-0802-4478-bb95-db70284b9dd2-kube-api-access-nrqnw\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.566193 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-utilities\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.667563 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-catalog-content\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.667622 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrqnw\" (UniqueName: \"kubernetes.io/projected/a55e05d9-0802-4478-bb95-db70284b9dd2-kube-api-access-nrqnw\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.667657 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-utilities\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.669010 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-utilities\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.669173 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-catalog-content\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.694254 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nrqnw\" (UniqueName: \"kubernetes.io/projected/a55e05d9-0802-4478-bb95-db70284b9dd2-kube-api-access-nrqnw\") pod \"redhat-marketplace-k99qd\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:34 crc kubenswrapper[4971]: I0320 07:17:34.808367 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:35 crc kubenswrapper[4971]: I0320 07:17:35.281710 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k99qd"] Mar 20 07:17:35 crc kubenswrapper[4971]: I0320 07:17:35.751437 4971 generic.go:334] "Generic (PLEG): container finished" podID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerID="5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b" exitCode=0 Mar 20 07:17:35 crc kubenswrapper[4971]: I0320 07:17:35.751545 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k99qd" event={"ID":"a55e05d9-0802-4478-bb95-db70284b9dd2","Type":"ContainerDied","Data":"5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b"} Mar 20 07:17:35 crc kubenswrapper[4971]: I0320 07:17:35.751983 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k99qd" event={"ID":"a55e05d9-0802-4478-bb95-db70284b9dd2","Type":"ContainerStarted","Data":"036ee244b117adbc80f4f9b62c4b0b64c88977399726018e93ad7b13d1f5ccd5"} Mar 20 07:17:36 crc kubenswrapper[4971]: I0320 07:17:36.769257 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k99qd" event={"ID":"a55e05d9-0802-4478-bb95-db70284b9dd2","Type":"ContainerStarted","Data":"d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e"} Mar 20 07:17:37 crc kubenswrapper[4971]: I0320 07:17:37.791466 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerID="d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e" exitCode=0 Mar 20 07:17:37 crc kubenswrapper[4971]: I0320 07:17:37.791551 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k99qd" event={"ID":"a55e05d9-0802-4478-bb95-db70284b9dd2","Type":"ContainerDied","Data":"d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e"} Mar 20 07:17:38 crc kubenswrapper[4971]: I0320 07:17:38.732290 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:17:38 crc kubenswrapper[4971]: E0320 07:17:38.736926 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:17:38 crc kubenswrapper[4971]: I0320 07:17:38.806343 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k99qd" event={"ID":"a55e05d9-0802-4478-bb95-db70284b9dd2","Type":"ContainerStarted","Data":"105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c"} Mar 20 07:17:38 crc kubenswrapper[4971]: I0320 07:17:38.852182 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k99qd" podStartSLOduration=2.034448511 podStartE2EDuration="4.852159387s" podCreationTimestamp="2026-03-20 07:17:34 +0000 UTC" firstStartedPulling="2026-03-20 07:17:35.754265529 +0000 UTC m=+1677.734139707" lastFinishedPulling="2026-03-20 07:17:38.571976435 +0000 UTC m=+1680.551850583" observedRunningTime="2026-03-20 07:17:38.851038738 +0000 UTC m=+1680.830912906" 
watchObservedRunningTime="2026-03-20 07:17:38.852159387 +0000 UTC m=+1680.832033535" Mar 20 07:17:44 crc kubenswrapper[4971]: I0320 07:17:44.809654 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:44 crc kubenswrapper[4971]: I0320 07:17:44.810256 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:44 crc kubenswrapper[4971]: I0320 07:17:44.887940 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:44 crc kubenswrapper[4971]: I0320 07:17:44.958363 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:45 crc kubenswrapper[4971]: I0320 07:17:45.135109 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k99qd"] Mar 20 07:17:46 crc kubenswrapper[4971]: I0320 07:17:46.887135 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k99qd" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerName="registry-server" containerID="cri-o://105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c" gracePeriod=2 Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.437177 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.583590 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrqnw\" (UniqueName: \"kubernetes.io/projected/a55e05d9-0802-4478-bb95-db70284b9dd2-kube-api-access-nrqnw\") pod \"a55e05d9-0802-4478-bb95-db70284b9dd2\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.583717 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-utilities\") pod \"a55e05d9-0802-4478-bb95-db70284b9dd2\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.583762 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-catalog-content\") pod \"a55e05d9-0802-4478-bb95-db70284b9dd2\" (UID: \"a55e05d9-0802-4478-bb95-db70284b9dd2\") " Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.585693 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-utilities" (OuterVolumeSpecName: "utilities") pod "a55e05d9-0802-4478-bb95-db70284b9dd2" (UID: "a55e05d9-0802-4478-bb95-db70284b9dd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.592978 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55e05d9-0802-4478-bb95-db70284b9dd2-kube-api-access-nrqnw" (OuterVolumeSpecName: "kube-api-access-nrqnw") pod "a55e05d9-0802-4478-bb95-db70284b9dd2" (UID: "a55e05d9-0802-4478-bb95-db70284b9dd2"). InnerVolumeSpecName "kube-api-access-nrqnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.613396 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a55e05d9-0802-4478-bb95-db70284b9dd2" (UID: "a55e05d9-0802-4478-bb95-db70284b9dd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.685500 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.685543 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55e05d9-0802-4478-bb95-db70284b9dd2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.685557 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrqnw\" (UniqueName: \"kubernetes.io/projected/a55e05d9-0802-4478-bb95-db70284b9dd2-kube-api-access-nrqnw\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.903548 4971 generic.go:334] "Generic (PLEG): container finished" podID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerID="105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c" exitCode=0 Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.903665 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k99qd" event={"ID":"a55e05d9-0802-4478-bb95-db70284b9dd2","Type":"ContainerDied","Data":"105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c"} Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.903677 4971 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k99qd" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.903717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k99qd" event={"ID":"a55e05d9-0802-4478-bb95-db70284b9dd2","Type":"ContainerDied","Data":"036ee244b117adbc80f4f9b62c4b0b64c88977399726018e93ad7b13d1f5ccd5"} Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.903746 4971 scope.go:117] "RemoveContainer" containerID="105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.941074 4971 scope.go:117] "RemoveContainer" containerID="d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e" Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.959651 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k99qd"] Mar 20 07:17:47 crc kubenswrapper[4971]: I0320 07:17:47.970072 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k99qd"] Mar 20 07:17:48 crc kubenswrapper[4971]: I0320 07:17:48.021155 4971 scope.go:117] "RemoveContainer" containerID="5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b" Mar 20 07:17:48 crc kubenswrapper[4971]: I0320 07:17:48.044229 4971 scope.go:117] "RemoveContainer" containerID="105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c" Mar 20 07:17:48 crc kubenswrapper[4971]: E0320 07:17:48.045093 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c\": container with ID starting with 105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c not found: ID does not exist" containerID="105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c" Mar 20 07:17:48 crc kubenswrapper[4971]: I0320 07:17:48.045172 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c"} err="failed to get container status \"105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c\": rpc error: code = NotFound desc = could not find container \"105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c\": container with ID starting with 105d2c62d78effac2b0d3fd2e4663db000568f8656f460f70d324578bcf7e59c not found: ID does not exist" Mar 20 07:17:48 crc kubenswrapper[4971]: I0320 07:17:48.045214 4971 scope.go:117] "RemoveContainer" containerID="d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e" Mar 20 07:17:48 crc kubenswrapper[4971]: E0320 07:17:48.045774 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e\": container with ID starting with d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e not found: ID does not exist" containerID="d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e" Mar 20 07:17:48 crc kubenswrapper[4971]: I0320 07:17:48.045819 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e"} err="failed to get container status \"d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e\": rpc error: code = NotFound desc = could not find container \"d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e\": container with ID starting with d7655ee33e6d2cb63819a068e27489d1d519d906d3207a4cc9d48e07bcffcd0e not found: ID does not exist" Mar 20 07:17:48 crc kubenswrapper[4971]: I0320 07:17:48.045847 4971 scope.go:117] "RemoveContainer" containerID="5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b" Mar 20 07:17:48 crc kubenswrapper[4971]: E0320 
07:17:48.046315 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b\": container with ID starting with 5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b not found: ID does not exist" containerID="5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b" Mar 20 07:17:48 crc kubenswrapper[4971]: I0320 07:17:48.046387 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b"} err="failed to get container status \"5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b\": rpc error: code = NotFound desc = could not find container \"5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b\": container with ID starting with 5088229d788926183bf157b298852b79a65fe5dc61f28d5b14f319ec405ba54b not found: ID does not exist" Mar 20 07:17:48 crc kubenswrapper[4971]: I0320 07:17:48.746472 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" path="/var/lib/kubelet/pods/a55e05d9-0802-4478-bb95-db70284b9dd2/volumes" Mar 20 07:17:50 crc kubenswrapper[4971]: I0320 07:17:50.732861 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:17:50 crc kubenswrapper[4971]: E0320 07:17:50.733910 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:17:51 crc kubenswrapper[4971]: I0320 07:17:51.822218 
4971 scope.go:117] "RemoveContainer" containerID="374e54ab4a048309941118221a07e96b275e295109371c7b2474327f103803a5" Mar 20 07:17:51 crc kubenswrapper[4971]: I0320 07:17:51.885903 4971 scope.go:117] "RemoveContainer" containerID="33e0dc596986e9a8f65bb0952b3f04fc1ffca37a4742dfd20efd67c5686312ae" Mar 20 07:17:51 crc kubenswrapper[4971]: I0320 07:17:51.922111 4971 scope.go:117] "RemoveContainer" containerID="8894eea462d02c27135c00ee973dff94683ebc6860c55336204627ceb1560fa1" Mar 20 07:17:51 crc kubenswrapper[4971]: I0320 07:17:51.952764 4971 scope.go:117] "RemoveContainer" containerID="b20124ae1f6d2b7ddda5e25f0cc517f56ab0056bdc786ff56907512f902df9f9" Mar 20 07:17:51 crc kubenswrapper[4971]: I0320 07:17:51.980191 4971 scope.go:117] "RemoveContainer" containerID="f88854cdba235abc6a679d085e2ddae83d1dcc94556a4f55f7597dc55a163760" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.012859 4971 scope.go:117] "RemoveContainer" containerID="f38c82bcc1900d8bc2655f6085c5695f555d943d88c400862d379df92953d557" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.042574 4971 scope.go:117] "RemoveContainer" containerID="55283060505b357dc023a960a805e270100d975ab8edf53910bac6a09d66c831" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.070165 4971 scope.go:117] "RemoveContainer" containerID="9dc2fbe9979ece731bdad20be4ff77fb6c042b3ae61d4a3a925fb081c9c02905" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.123772 4971 scope.go:117] "RemoveContainer" containerID="dd296b2cff52994971d74ccd25b29d247876a06340250cd1ca3f4c3979f95a89" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.159865 4971 scope.go:117] "RemoveContainer" containerID="1b849baaa85bafbb8df6c2a64ec733a62d77ddeafe57154a63e33f841cdc919e" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.214944 4971 scope.go:117] "RemoveContainer" containerID="3aa555eddf63e48f674d7432f8d4139f9f04b81f51a084e8ad63dcacfa5ce437" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.248341 4971 scope.go:117] 
"RemoveContainer" containerID="0d9196d5fa1e06eb300a897a2efc8f4155967e4bbf140160338ab7fea0655449" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.277657 4971 scope.go:117] "RemoveContainer" containerID="bd7d583000b590bc82de39df0965fc04bd2bcbd76dd569dc0bcc7c4f381a4bf7" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.310166 4971 scope.go:117] "RemoveContainer" containerID="f00a93dff8f53e5f0a13ded30cfb6e5907ce153f6b5a93a64c0c8fc026b840cc" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.339639 4971 scope.go:117] "RemoveContainer" containerID="1fa0728e16fb926d9758ef863ed07ffb1e047c3342be2cb6f8294d64076e6fba" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.369750 4971 scope.go:117] "RemoveContainer" containerID="7bd65cb3fc2edb798de3a2ade999c6f405c2215e4ac027cd03552885c229b825" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.391139 4971 scope.go:117] "RemoveContainer" containerID="1f0d05e9c6b42bd0f6ea81b947a19f9ceb2c1ffb8db5799376283fa131c7af29" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.415772 4971 scope.go:117] "RemoveContainer" containerID="37616ec9a22928fa47fed1046ff25a2bedd0b51b9300cfcaadbe24306840515b" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.445283 4971 scope.go:117] "RemoveContainer" containerID="af1598521084a1003841bd322d7ad538ec00eeef1caab95601854478c13ae85f" Mar 20 07:17:52 crc kubenswrapper[4971]: I0320 07:17:52.474713 4971 scope.go:117] "RemoveContainer" containerID="42b6d9f32625085d1006654e4e4f12708f6719d8a4afc56ed61979703f1462f1" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.164484 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566518-ckb5h"] Mar 20 07:18:00 crc kubenswrapper[4971]: E0320 07:18:00.165318 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerName="extract-utilities" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.165356 4971 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerName="extract-utilities" Mar 20 07:18:00 crc kubenswrapper[4971]: E0320 07:18:00.165371 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerName="registry-server" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.165378 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerName="registry-server" Mar 20 07:18:00 crc kubenswrapper[4971]: E0320 07:18:00.165400 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerName="extract-content" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.165407 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerName="extract-content" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.165520 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55e05d9-0802-4478-bb95-db70284b9dd2" containerName="registry-server" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.165962 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566518-ckb5h" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.169053 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.170167 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.172590 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.176311 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566518-ckb5h"] Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.200485 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd5vg\" (UniqueName: \"kubernetes.io/projected/5c7c341a-b390-40e0-899e-6cf925e5280d-kube-api-access-dd5vg\") pod \"auto-csr-approver-29566518-ckb5h\" (UID: \"5c7c341a-b390-40e0-899e-6cf925e5280d\") " pod="openshift-infra/auto-csr-approver-29566518-ckb5h" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.302476 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5vg\" (UniqueName: \"kubernetes.io/projected/5c7c341a-b390-40e0-899e-6cf925e5280d-kube-api-access-dd5vg\") pod \"auto-csr-approver-29566518-ckb5h\" (UID: \"5c7c341a-b390-40e0-899e-6cf925e5280d\") " pod="openshift-infra/auto-csr-approver-29566518-ckb5h" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.328985 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5vg\" (UniqueName: \"kubernetes.io/projected/5c7c341a-b390-40e0-899e-6cf925e5280d-kube-api-access-dd5vg\") pod \"auto-csr-approver-29566518-ckb5h\" (UID: \"5c7c341a-b390-40e0-899e-6cf925e5280d\") " 
pod="openshift-infra/auto-csr-approver-29566518-ckb5h" Mar 20 07:18:00 crc kubenswrapper[4971]: I0320 07:18:00.496446 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566518-ckb5h" Mar 20 07:18:01 crc kubenswrapper[4971]: I0320 07:18:01.012464 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566518-ckb5h"] Mar 20 07:18:01 crc kubenswrapper[4971]: I0320 07:18:01.045461 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566518-ckb5h" event={"ID":"5c7c341a-b390-40e0-899e-6cf925e5280d","Type":"ContainerStarted","Data":"ae89401b7e7fc598bf2584bb90fbf897187f34150dfeeff3ec9bc80b6a3943ff"} Mar 20 07:18:03 crc kubenswrapper[4971]: I0320 07:18:03.072619 4971 generic.go:334] "Generic (PLEG): container finished" podID="5c7c341a-b390-40e0-899e-6cf925e5280d" containerID="fa37d66caca9e3742d097cc40b4fc7d60ff365982e423295b793998c7a381052" exitCode=0 Mar 20 07:18:03 crc kubenswrapper[4971]: I0320 07:18:03.072703 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566518-ckb5h" event={"ID":"5c7c341a-b390-40e0-899e-6cf925e5280d","Type":"ContainerDied","Data":"fa37d66caca9e3742d097cc40b4fc7d60ff365982e423295b793998c7a381052"} Mar 20 07:18:04 crc kubenswrapper[4971]: I0320 07:18:04.513436 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566518-ckb5h" Mar 20 07:18:04 crc kubenswrapper[4971]: I0320 07:18:04.680984 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd5vg\" (UniqueName: \"kubernetes.io/projected/5c7c341a-b390-40e0-899e-6cf925e5280d-kube-api-access-dd5vg\") pod \"5c7c341a-b390-40e0-899e-6cf925e5280d\" (UID: \"5c7c341a-b390-40e0-899e-6cf925e5280d\") " Mar 20 07:18:04 crc kubenswrapper[4971]: I0320 07:18:04.687591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7c341a-b390-40e0-899e-6cf925e5280d-kube-api-access-dd5vg" (OuterVolumeSpecName: "kube-api-access-dd5vg") pod "5c7c341a-b390-40e0-899e-6cf925e5280d" (UID: "5c7c341a-b390-40e0-899e-6cf925e5280d"). InnerVolumeSpecName "kube-api-access-dd5vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:18:04 crc kubenswrapper[4971]: I0320 07:18:04.782964 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd5vg\" (UniqueName: \"kubernetes.io/projected/5c7c341a-b390-40e0-899e-6cf925e5280d-kube-api-access-dd5vg\") on node \"crc\" DevicePath \"\"" Mar 20 07:18:05 crc kubenswrapper[4971]: I0320 07:18:05.094635 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566518-ckb5h" event={"ID":"5c7c341a-b390-40e0-899e-6cf925e5280d","Type":"ContainerDied","Data":"ae89401b7e7fc598bf2584bb90fbf897187f34150dfeeff3ec9bc80b6a3943ff"} Mar 20 07:18:05 crc kubenswrapper[4971]: I0320 07:18:05.094678 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae89401b7e7fc598bf2584bb90fbf897187f34150dfeeff3ec9bc80b6a3943ff" Mar 20 07:18:05 crc kubenswrapper[4971]: I0320 07:18:05.094721 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566518-ckb5h" Mar 20 07:18:05 crc kubenswrapper[4971]: I0320 07:18:05.596410 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566512-7bqgd"] Mar 20 07:18:05 crc kubenswrapper[4971]: I0320 07:18:05.603570 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566512-7bqgd"] Mar 20 07:18:05 crc kubenswrapper[4971]: I0320 07:18:05.732874 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:18:05 crc kubenswrapper[4971]: E0320 07:18:05.733394 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:18:06 crc kubenswrapper[4971]: I0320 07:18:06.750218 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e" path="/var/lib/kubelet/pods/3f70b1cc-68a3-4e49-8fc0-c90be8b0d30e/volumes" Mar 20 07:18:20 crc kubenswrapper[4971]: I0320 07:18:20.732251 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:18:20 crc kubenswrapper[4971]: E0320 07:18:20.733164 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:18:33 crc kubenswrapper[4971]: I0320 07:18:33.732550 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:18:33 crc kubenswrapper[4971]: E0320 07:18:33.733599 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:18:46 crc kubenswrapper[4971]: I0320 07:18:46.732063 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:18:46 crc kubenswrapper[4971]: E0320 07:18:46.732752 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:18:52 crc kubenswrapper[4971]: I0320 07:18:52.798160 4971 scope.go:117] "RemoveContainer" containerID="f81aecf5a1a3f9063ecfd9a721f11e941db08bb5f827ff314bb3c535b94bb02d" Mar 20 07:18:52 crc kubenswrapper[4971]: I0320 07:18:52.870287 4971 scope.go:117] "RemoveContainer" containerID="821ce76436237ce7e9434e6e2e31435cd0f21237c27264d9054c57be93e6161a" Mar 20 07:18:52 crc kubenswrapper[4971]: I0320 07:18:52.932617 4971 scope.go:117] "RemoveContainer" containerID="b3b8a1b34d0168c170da49d2d6cb8b4c128acb2d7875417dfcd0a1ccb510034b" Mar 20 07:18:52 crc kubenswrapper[4971]: I0320 07:18:52.958444 4971 
scope.go:117] "RemoveContainer" containerID="5ac947ff7b65ad4e4ef7fa4b0e909f3f4005fc6b016e4dd5e60be9354d69b94e" Mar 20 07:18:53 crc kubenswrapper[4971]: I0320 07:18:53.009458 4971 scope.go:117] "RemoveContainer" containerID="f6ed6c2bd98279cdf6b3f41b7c35c157a45f901ea5da3911570bab6705696124" Mar 20 07:18:53 crc kubenswrapper[4971]: I0320 07:18:53.039325 4971 scope.go:117] "RemoveContainer" containerID="f94f5c131900e9142db384e685b8b6e3db64b43435aff6a1f804077e0c0eeafb" Mar 20 07:18:53 crc kubenswrapper[4971]: I0320 07:18:53.064282 4971 scope.go:117] "RemoveContainer" containerID="f2c6e66d0b0a91e73a33dda62aeb723853c5bbc9009e00a71519161cd37e2c4a" Mar 20 07:18:53 crc kubenswrapper[4971]: I0320 07:18:53.087474 4971 scope.go:117] "RemoveContainer" containerID="e16f3a7524c0effa050fe10707eb8e5fb4083331573d897b85372c20bcb66dcc" Mar 20 07:18:57 crc kubenswrapper[4971]: I0320 07:18:57.732583 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:18:57 crc kubenswrapper[4971]: E0320 07:18:57.733374 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:19:12 crc kubenswrapper[4971]: I0320 07:19:12.733252 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:19:12 crc kubenswrapper[4971]: E0320 07:19:12.734653 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:19:24 crc kubenswrapper[4971]: I0320 07:19:24.744720 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:19:24 crc kubenswrapper[4971]: E0320 07:19:24.746750 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:19:35 crc kubenswrapper[4971]: I0320 07:19:35.732645 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:19:35 crc kubenswrapper[4971]: E0320 07:19:35.733409 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:19:50 crc kubenswrapper[4971]: I0320 07:19:50.732225 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:19:50 crc kubenswrapper[4971]: E0320 07:19:50.733125 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.216915 4971 scope.go:117] "RemoveContainer" containerID="c8dce6132c0db7b826887ea68485e6926d6993786a3e7ddf0bd242030d28557d" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.249707 4971 scope.go:117] "RemoveContainer" containerID="10a37a5b0caa91deed8a9e36e1111c76d38c04830b8df96ba48ece28266de196" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.308493 4971 scope.go:117] "RemoveContainer" containerID="b8a7e074bded868b90861fe1e5fdddd63c2b37acc4111e94697089382de56703" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.354210 4971 scope.go:117] "RemoveContainer" containerID="68284f1f3498c3544a13dd866e1ad49696c1cf23cedffe7cf47b35b59e699421" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.380928 4971 scope.go:117] "RemoveContainer" containerID="a476691f9b3abf540342b199004e9fe8a0cb58643f58fc4b0ace659b56cc66dc" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.405861 4971 scope.go:117] "RemoveContainer" containerID="f62e0cf03afdd39a3ebc3d89fcbc2de75a784eae11e65894a8324b66b714dca4" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.434530 4971 scope.go:117] "RemoveContainer" containerID="4410efaf3f9d8a87e04404e86f12371268c9d97f8691fe91ffd43229eced6dd3" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.474829 4971 scope.go:117] "RemoveContainer" containerID="2d165ae39e404c6b7bad9797db6ef0ab755622644236c7d233aa70de83d5923b" Mar 20 07:19:53 crc kubenswrapper[4971]: I0320 07:19:53.502395 4971 scope.go:117] "RemoveContainer" containerID="16b1faff31785032898871d7f78f8190dd2a353c88127055a1284e643fc00ee1" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.148241 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29566520-sbc7k"] Mar 20 07:20:00 crc kubenswrapper[4971]: E0320 07:20:00.152716 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7c341a-b390-40e0-899e-6cf925e5280d" containerName="oc" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.152754 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7c341a-b390-40e0-899e-6cf925e5280d" containerName="oc" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.152981 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7c341a-b390-40e0-899e-6cf925e5280d" containerName="oc" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.153869 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566520-sbc7k" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.155973 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.156170 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.156719 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.163672 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566520-sbc7k"] Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.203483 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8z9k\" (UniqueName: \"kubernetes.io/projected/d9c3f735-4efd-4d7e-a019-6cf5be7862f1-kube-api-access-q8z9k\") pod \"auto-csr-approver-29566520-sbc7k\" (UID: \"d9c3f735-4efd-4d7e-a019-6cf5be7862f1\") " pod="openshift-infra/auto-csr-approver-29566520-sbc7k" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 
07:20:00.305531 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8z9k\" (UniqueName: \"kubernetes.io/projected/d9c3f735-4efd-4d7e-a019-6cf5be7862f1-kube-api-access-q8z9k\") pod \"auto-csr-approver-29566520-sbc7k\" (UID: \"d9c3f735-4efd-4d7e-a019-6cf5be7862f1\") " pod="openshift-infra/auto-csr-approver-29566520-sbc7k" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.341411 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8z9k\" (UniqueName: \"kubernetes.io/projected/d9c3f735-4efd-4d7e-a019-6cf5be7862f1-kube-api-access-q8z9k\") pod \"auto-csr-approver-29566520-sbc7k\" (UID: \"d9c3f735-4efd-4d7e-a019-6cf5be7862f1\") " pod="openshift-infra/auto-csr-approver-29566520-sbc7k" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.477885 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566520-sbc7k" Mar 20 07:20:00 crc kubenswrapper[4971]: I0320 07:20:00.789977 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566520-sbc7k"] Mar 20 07:20:00 crc kubenswrapper[4971]: W0320 07:20:00.797165 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9c3f735_4efd_4d7e_a019_6cf5be7862f1.slice/crio-a8f2b579be1866d016b630e92054280f61f92cec40db55a4ab7f14cc45cc3690 WatchSource:0}: Error finding container a8f2b579be1866d016b630e92054280f61f92cec40db55a4ab7f14cc45cc3690: Status 404 returned error can't find the container with id a8f2b579be1866d016b630e92054280f61f92cec40db55a4ab7f14cc45cc3690 Mar 20 07:20:01 crc kubenswrapper[4971]: I0320 07:20:01.232332 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566520-sbc7k" 
event={"ID":"d9c3f735-4efd-4d7e-a019-6cf5be7862f1","Type":"ContainerStarted","Data":"a8f2b579be1866d016b630e92054280f61f92cec40db55a4ab7f14cc45cc3690"} Mar 20 07:20:01 crc kubenswrapper[4971]: I0320 07:20:01.732048 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:20:01 crc kubenswrapper[4971]: E0320 07:20:01.732427 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:20:04 crc kubenswrapper[4971]: I0320 07:20:04.272165 4971 generic.go:334] "Generic (PLEG): container finished" podID="d9c3f735-4efd-4d7e-a019-6cf5be7862f1" containerID="007d81326095205ca7232b97e22a62bf3c678b744d564ebee485b549c04fd283" exitCode=0 Mar 20 07:20:04 crc kubenswrapper[4971]: I0320 07:20:04.272537 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566520-sbc7k" event={"ID":"d9c3f735-4efd-4d7e-a019-6cf5be7862f1","Type":"ContainerDied","Data":"007d81326095205ca7232b97e22a62bf3c678b744d564ebee485b549c04fd283"} Mar 20 07:20:05 crc kubenswrapper[4971]: I0320 07:20:05.661945 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566520-sbc7k" Mar 20 07:20:05 crc kubenswrapper[4971]: I0320 07:20:05.694277 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8z9k\" (UniqueName: \"kubernetes.io/projected/d9c3f735-4efd-4d7e-a019-6cf5be7862f1-kube-api-access-q8z9k\") pod \"d9c3f735-4efd-4d7e-a019-6cf5be7862f1\" (UID: \"d9c3f735-4efd-4d7e-a019-6cf5be7862f1\") " Mar 20 07:20:05 crc kubenswrapper[4971]: I0320 07:20:05.699041 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c3f735-4efd-4d7e-a019-6cf5be7862f1-kube-api-access-q8z9k" (OuterVolumeSpecName: "kube-api-access-q8z9k") pod "d9c3f735-4efd-4d7e-a019-6cf5be7862f1" (UID: "d9c3f735-4efd-4d7e-a019-6cf5be7862f1"). InnerVolumeSpecName "kube-api-access-q8z9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:20:05 crc kubenswrapper[4971]: I0320 07:20:05.795847 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8z9k\" (UniqueName: \"kubernetes.io/projected/d9c3f735-4efd-4d7e-a019-6cf5be7862f1-kube-api-access-q8z9k\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:06 crc kubenswrapper[4971]: I0320 07:20:06.300599 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566520-sbc7k" event={"ID":"d9c3f735-4efd-4d7e-a019-6cf5be7862f1","Type":"ContainerDied","Data":"a8f2b579be1866d016b630e92054280f61f92cec40db55a4ab7f14cc45cc3690"} Mar 20 07:20:06 crc kubenswrapper[4971]: I0320 07:20:06.300718 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8f2b579be1866d016b630e92054280f61f92cec40db55a4ab7f14cc45cc3690" Mar 20 07:20:06 crc kubenswrapper[4971]: I0320 07:20:06.301056 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566520-sbc7k" Mar 20 07:20:06 crc kubenswrapper[4971]: I0320 07:20:06.755550 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566514-7qv5f"] Mar 20 07:20:06 crc kubenswrapper[4971]: I0320 07:20:06.766396 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566514-7qv5f"] Mar 20 07:20:08 crc kubenswrapper[4971]: I0320 07:20:08.742171 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad43e1b-a41b-446b-8690-8fede9e1fc7f" path="/var/lib/kubelet/pods/4ad43e1b-a41b-446b-8690-8fede9e1fc7f/volumes" Mar 20 07:20:13 crc kubenswrapper[4971]: I0320 07:20:13.732448 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:20:13 crc kubenswrapper[4971]: E0320 07:20:13.733711 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:20:21 crc kubenswrapper[4971]: I0320 07:20:21.869025 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbjdb"] Mar 20 07:20:21 crc kubenswrapper[4971]: E0320 07:20:21.869941 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c3f735-4efd-4d7e-a019-6cf5be7862f1" containerName="oc" Mar 20 07:20:21 crc kubenswrapper[4971]: I0320 07:20:21.869957 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c3f735-4efd-4d7e-a019-6cf5be7862f1" containerName="oc" Mar 20 07:20:21 crc kubenswrapper[4971]: I0320 07:20:21.870123 4971 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d9c3f735-4efd-4d7e-a019-6cf5be7862f1" containerName="oc" Mar 20 07:20:21 crc kubenswrapper[4971]: I0320 07:20:21.871338 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:21 crc kubenswrapper[4971]: I0320 07:20:21.886706 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdb"] Mar 20 07:20:21 crc kubenswrapper[4971]: I0320 07:20:21.935931 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ktv5\" (UniqueName: \"kubernetes.io/projected/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-kube-api-access-8ktv5\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:21 crc kubenswrapper[4971]: I0320 07:20:21.936012 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-utilities\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:21 crc kubenswrapper[4971]: I0320 07:20:21.936049 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-catalog-content\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:22 crc kubenswrapper[4971]: I0320 07:20:22.037366 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-utilities\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " 
pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:22 crc kubenswrapper[4971]: I0320 07:20:22.037431 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-catalog-content\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:22 crc kubenswrapper[4971]: I0320 07:20:22.037507 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ktv5\" (UniqueName: \"kubernetes.io/projected/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-kube-api-access-8ktv5\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:22 crc kubenswrapper[4971]: I0320 07:20:22.037882 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-utilities\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:22 crc kubenswrapper[4971]: I0320 07:20:22.038289 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-catalog-content\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:22 crc kubenswrapper[4971]: I0320 07:20:22.057520 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ktv5\" (UniqueName: \"kubernetes.io/projected/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-kube-api-access-8ktv5\") pod \"redhat-operators-fbjdb\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " pod="openshift-marketplace/redhat-operators-fbjdb" Mar 
20 07:20:22 crc kubenswrapper[4971]: I0320 07:20:22.195158 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:22 crc kubenswrapper[4971]: I0320 07:20:22.667671 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdb"] Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.452409 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dvpvs"] Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.455557 4971 generic.go:334] "Generic (PLEG): container finished" podID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerID="f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b" exitCode=0 Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.456458 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdb" event={"ID":"7a80b6c9-55ce-48a2-a70e-c29a6c61770c","Type":"ContainerDied","Data":"f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b"} Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.456502 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdb" event={"ID":"7a80b6c9-55ce-48a2-a70e-c29a6c61770c","Type":"ContainerStarted","Data":"aa80a865dc8503cda5f750d987ff0ded9efe439f8dd02c6f1651e327aa3e3367"} Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.456731 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.460327 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvpvs"] Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.558462 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-catalog-content\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.558520 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-utilities\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.558801 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tbn9\" (UniqueName: \"kubernetes.io/projected/0976523b-bde5-4229-ae81-a19f9e89380f-kube-api-access-8tbn9\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.660210 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-catalog-content\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.660257 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-utilities\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.660352 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tbn9\" (UniqueName: \"kubernetes.io/projected/0976523b-bde5-4229-ae81-a19f9e89380f-kube-api-access-8tbn9\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.660844 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-catalog-content\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.660878 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-utilities\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.686483 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tbn9\" (UniqueName: \"kubernetes.io/projected/0976523b-bde5-4229-ae81-a19f9e89380f-kube-api-access-8tbn9\") pod \"certified-operators-dvpvs\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:23 crc kubenswrapper[4971]: I0320 07:20:23.773598 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:24 crc kubenswrapper[4971]: I0320 07:20:24.046230 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvpvs"] Mar 20 07:20:24 crc kubenswrapper[4971]: I0320 07:20:24.464883 4971 generic.go:334] "Generic (PLEG): container finished" podID="0976523b-bde5-4229-ae81-a19f9e89380f" containerID="34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271" exitCode=0 Mar 20 07:20:24 crc kubenswrapper[4971]: I0320 07:20:24.464921 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvpvs" event={"ID":"0976523b-bde5-4229-ae81-a19f9e89380f","Type":"ContainerDied","Data":"34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271"} Mar 20 07:20:24 crc kubenswrapper[4971]: I0320 07:20:24.464944 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvpvs" event={"ID":"0976523b-bde5-4229-ae81-a19f9e89380f","Type":"ContainerStarted","Data":"6c8611426779146be25342948a985504884808b9327af0f1aa85f619987c2abd"} Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.255386 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6gsj5"] Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.257633 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.269349 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6gsj5"] Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.287072 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-catalog-content\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.287162 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-utilities\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.287221 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7knw7\" (UniqueName: \"kubernetes.io/projected/95091f09-6a13-4afa-a29b-5350f226342b-kube-api-access-7knw7\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.388461 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-catalog-content\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.388543 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-utilities\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.388605 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7knw7\" (UniqueName: \"kubernetes.io/projected/95091f09-6a13-4afa-a29b-5350f226342b-kube-api-access-7knw7\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.389333 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-catalog-content\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.389360 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-utilities\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.411670 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7knw7\" (UniqueName: \"kubernetes.io/projected/95091f09-6a13-4afa-a29b-5350f226342b-kube-api-access-7knw7\") pod \"community-operators-6gsj5\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.474891 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="0976523b-bde5-4229-ae81-a19f9e89380f" containerID="9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99" exitCode=0 Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.474924 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvpvs" event={"ID":"0976523b-bde5-4229-ae81-a19f9e89380f","Type":"ContainerDied","Data":"9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99"} Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.477070 4971 generic.go:334] "Generic (PLEG): container finished" podID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerID="ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a" exitCode=0 Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.477089 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdb" event={"ID":"7a80b6c9-55ce-48a2-a70e-c29a6c61770c","Type":"ContainerDied","Data":"ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a"} Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.584507 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:25 crc kubenswrapper[4971]: I0320 07:20:25.907986 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6gsj5"] Mar 20 07:20:26 crc kubenswrapper[4971]: I0320 07:20:26.487944 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdb" event={"ID":"7a80b6c9-55ce-48a2-a70e-c29a6c61770c","Type":"ContainerStarted","Data":"8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0"} Mar 20 07:20:26 crc kubenswrapper[4971]: I0320 07:20:26.489474 4971 generic.go:334] "Generic (PLEG): container finished" podID="95091f09-6a13-4afa-a29b-5350f226342b" containerID="7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150" exitCode=0 Mar 20 07:20:26 crc kubenswrapper[4971]: I0320 07:20:26.489543 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gsj5" event={"ID":"95091f09-6a13-4afa-a29b-5350f226342b","Type":"ContainerDied","Data":"7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150"} Mar 20 07:20:26 crc kubenswrapper[4971]: I0320 07:20:26.489572 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gsj5" event={"ID":"95091f09-6a13-4afa-a29b-5350f226342b","Type":"ContainerStarted","Data":"fd9340679ddc1d36dcf097436a69633b16cc99a9763073a8a878dc81958725e9"} Mar 20 07:20:26 crc kubenswrapper[4971]: I0320 07:20:26.493799 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvpvs" event={"ID":"0976523b-bde5-4229-ae81-a19f9e89380f","Type":"ContainerStarted","Data":"f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f"} Mar 20 07:20:26 crc kubenswrapper[4971]: I0320 07:20:26.514248 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbjdb" 
podStartSLOduration=2.858988776 podStartE2EDuration="5.514219349s" podCreationTimestamp="2026-03-20 07:20:21 +0000 UTC" firstStartedPulling="2026-03-20 07:20:23.459724432 +0000 UTC m=+1845.439598560" lastFinishedPulling="2026-03-20 07:20:26.114954965 +0000 UTC m=+1848.094829133" observedRunningTime="2026-03-20 07:20:26.504556956 +0000 UTC m=+1848.484431114" watchObservedRunningTime="2026-03-20 07:20:26.514219349 +0000 UTC m=+1848.494093497" Mar 20 07:20:26 crc kubenswrapper[4971]: I0320 07:20:26.536556 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dvpvs" podStartSLOduration=2.137611481 podStartE2EDuration="3.536523845s" podCreationTimestamp="2026-03-20 07:20:23 +0000 UTC" firstStartedPulling="2026-03-20 07:20:24.466250262 +0000 UTC m=+1846.446124390" lastFinishedPulling="2026-03-20 07:20:25.865162616 +0000 UTC m=+1847.845036754" observedRunningTime="2026-03-20 07:20:26.528762041 +0000 UTC m=+1848.508636219" watchObservedRunningTime="2026-03-20 07:20:26.536523845 +0000 UTC m=+1848.516398023" Mar 20 07:20:27 crc kubenswrapper[4971]: I0320 07:20:27.732954 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:20:27 crc kubenswrapper[4971]: E0320 07:20:27.733498 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:20:28 crc kubenswrapper[4971]: I0320 07:20:28.510380 4971 generic.go:334] "Generic (PLEG): container finished" podID="95091f09-6a13-4afa-a29b-5350f226342b" containerID="cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e" 
exitCode=0 Mar 20 07:20:28 crc kubenswrapper[4971]: I0320 07:20:28.510430 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gsj5" event={"ID":"95091f09-6a13-4afa-a29b-5350f226342b","Type":"ContainerDied","Data":"cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e"} Mar 20 07:20:29 crc kubenswrapper[4971]: I0320 07:20:29.520247 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gsj5" event={"ID":"95091f09-6a13-4afa-a29b-5350f226342b","Type":"ContainerStarted","Data":"e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20"} Mar 20 07:20:29 crc kubenswrapper[4971]: I0320 07:20:29.542848 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6gsj5" podStartSLOduration=2.10755948 podStartE2EDuration="4.542826887s" podCreationTimestamp="2026-03-20 07:20:25 +0000 UTC" firstStartedPulling="2026-03-20 07:20:26.491954065 +0000 UTC m=+1848.471828213" lastFinishedPulling="2026-03-20 07:20:28.927221492 +0000 UTC m=+1850.907095620" observedRunningTime="2026-03-20 07:20:29.536341297 +0000 UTC m=+1851.516215455" watchObservedRunningTime="2026-03-20 07:20:29.542826887 +0000 UTC m=+1851.522701025" Mar 20 07:20:32 crc kubenswrapper[4971]: I0320 07:20:32.196075 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:32 crc kubenswrapper[4971]: I0320 07:20:32.196396 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:33 crc kubenswrapper[4971]: I0320 07:20:33.268904 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fbjdb" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="registry-server" probeResult="failure" output=< Mar 20 07:20:33 crc kubenswrapper[4971]: timeout: failed 
to connect service ":50051" within 1s Mar 20 07:20:33 crc kubenswrapper[4971]: > Mar 20 07:20:33 crc kubenswrapper[4971]: I0320 07:20:33.773726 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:33 crc kubenswrapper[4971]: I0320 07:20:33.773791 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:33 crc kubenswrapper[4971]: I0320 07:20:33.850339 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:34 crc kubenswrapper[4971]: I0320 07:20:34.626676 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:34 crc kubenswrapper[4971]: I0320 07:20:34.690432 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvpvs"] Mar 20 07:20:35 crc kubenswrapper[4971]: I0320 07:20:35.585109 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:35 crc kubenswrapper[4971]: I0320 07:20:35.585202 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:35 crc kubenswrapper[4971]: I0320 07:20:35.639564 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:36 crc kubenswrapper[4971]: I0320 07:20:36.576054 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dvpvs" podUID="0976523b-bde5-4229-ae81-a19f9e89380f" containerName="registry-server" containerID="cri-o://f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f" gracePeriod=2 Mar 20 07:20:36 crc kubenswrapper[4971]: 
I0320 07:20:36.652729 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.065542 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.158600 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-utilities\") pod \"0976523b-bde5-4229-ae81-a19f9e89380f\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.158877 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tbn9\" (UniqueName: \"kubernetes.io/projected/0976523b-bde5-4229-ae81-a19f9e89380f-kube-api-access-8tbn9\") pod \"0976523b-bde5-4229-ae81-a19f9e89380f\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.158932 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-catalog-content\") pod \"0976523b-bde5-4229-ae81-a19f9e89380f\" (UID: \"0976523b-bde5-4229-ae81-a19f9e89380f\") " Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.159879 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-utilities" (OuterVolumeSpecName: "utilities") pod "0976523b-bde5-4229-ae81-a19f9e89380f" (UID: "0976523b-bde5-4229-ae81-a19f9e89380f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.165259 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0976523b-bde5-4229-ae81-a19f9e89380f-kube-api-access-8tbn9" (OuterVolumeSpecName: "kube-api-access-8tbn9") pod "0976523b-bde5-4229-ae81-a19f9e89380f" (UID: "0976523b-bde5-4229-ae81-a19f9e89380f"). InnerVolumeSpecName "kube-api-access-8tbn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.210173 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0976523b-bde5-4229-ae81-a19f9e89380f" (UID: "0976523b-bde5-4229-ae81-a19f9e89380f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.260542 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tbn9\" (UniqueName: \"kubernetes.io/projected/0976523b-bde5-4229-ae81-a19f9e89380f-kube-api-access-8tbn9\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.260585 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.260623 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0976523b-bde5-4229-ae81-a19f9e89380f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.584943 4971 generic.go:334] "Generic (PLEG): container finished" podID="0976523b-bde5-4229-ae81-a19f9e89380f" 
containerID="f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f" exitCode=0 Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.585028 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvpvs" event={"ID":"0976523b-bde5-4229-ae81-a19f9e89380f","Type":"ContainerDied","Data":"f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f"} Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.585404 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvpvs" event={"ID":"0976523b-bde5-4229-ae81-a19f9e89380f","Type":"ContainerDied","Data":"6c8611426779146be25342948a985504884808b9327af0f1aa85f619987c2abd"} Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.585444 4971 scope.go:117] "RemoveContainer" containerID="f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.585055 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvpvs" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.602846 4971 scope.go:117] "RemoveContainer" containerID="9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.631950 4971 scope.go:117] "RemoveContainer" containerID="34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.684226 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvpvs"] Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.699639 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dvpvs"] Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.707310 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6gsj5"] Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.713866 4971 scope.go:117] "RemoveContainer" containerID="f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f" Mar 20 07:20:37 crc kubenswrapper[4971]: E0320 07:20:37.714308 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f\": container with ID starting with f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f not found: ID does not exist" containerID="f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.714346 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f"} err="failed to get container status \"f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f\": rpc error: code = NotFound desc = could not find container 
\"f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f\": container with ID starting with f951ca06cfc18800eb61e6e23099f56624c3d6d7b73b75108983f3bcb0f6616f not found: ID does not exist" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.714371 4971 scope.go:117] "RemoveContainer" containerID="9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99" Mar 20 07:20:37 crc kubenswrapper[4971]: E0320 07:20:37.714598 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99\": container with ID starting with 9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99 not found: ID does not exist" containerID="9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.714645 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99"} err="failed to get container status \"9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99\": rpc error: code = NotFound desc = could not find container \"9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99\": container with ID starting with 9c9751438113ed2516d63b6b9f5ab37991e4c8542fabc76a254c54e61b59cb99 not found: ID does not exist" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.714663 4971 scope.go:117] "RemoveContainer" containerID="34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271" Mar 20 07:20:37 crc kubenswrapper[4971]: E0320 07:20:37.714886 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271\": container with ID starting with 34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271 not found: ID does not exist" 
containerID="34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271" Mar 20 07:20:37 crc kubenswrapper[4971]: I0320 07:20:37.714914 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271"} err="failed to get container status \"34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271\": rpc error: code = NotFound desc = could not find container \"34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271\": container with ID starting with 34ed4b31797a0decd42fe7a3fefac4ee327733145e133ad6a26da1e51281e271 not found: ID does not exist" Mar 20 07:20:38 crc kubenswrapper[4971]: I0320 07:20:38.596511 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6gsj5" podUID="95091f09-6a13-4afa-a29b-5350f226342b" containerName="registry-server" containerID="cri-o://e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20" gracePeriod=2 Mar 20 07:20:38 crc kubenswrapper[4971]: I0320 07:20:38.742783 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0976523b-bde5-4229-ae81-a19f9e89380f" path="/var/lib/kubelet/pods/0976523b-bde5-4229-ae81-a19f9e89380f/volumes" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.035238 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.195102 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7knw7\" (UniqueName: \"kubernetes.io/projected/95091f09-6a13-4afa-a29b-5350f226342b-kube-api-access-7knw7\") pod \"95091f09-6a13-4afa-a29b-5350f226342b\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.195161 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-catalog-content\") pod \"95091f09-6a13-4afa-a29b-5350f226342b\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.195188 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-utilities\") pod \"95091f09-6a13-4afa-a29b-5350f226342b\" (UID: \"95091f09-6a13-4afa-a29b-5350f226342b\") " Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.196083 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-utilities" (OuterVolumeSpecName: "utilities") pod "95091f09-6a13-4afa-a29b-5350f226342b" (UID: "95091f09-6a13-4afa-a29b-5350f226342b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.201412 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95091f09-6a13-4afa-a29b-5350f226342b-kube-api-access-7knw7" (OuterVolumeSpecName: "kube-api-access-7knw7") pod "95091f09-6a13-4afa-a29b-5350f226342b" (UID: "95091f09-6a13-4afa-a29b-5350f226342b"). InnerVolumeSpecName "kube-api-access-7knw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.257528 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95091f09-6a13-4afa-a29b-5350f226342b" (UID: "95091f09-6a13-4afa-a29b-5350f226342b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.296854 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7knw7\" (UniqueName: \"kubernetes.io/projected/95091f09-6a13-4afa-a29b-5350f226342b-kube-api-access-7knw7\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.297096 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.297189 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95091f09-6a13-4afa-a29b-5350f226342b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.606911 4971 generic.go:334] "Generic (PLEG): container finished" podID="95091f09-6a13-4afa-a29b-5350f226342b" containerID="e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20" exitCode=0 Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.606967 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gsj5" event={"ID":"95091f09-6a13-4afa-a29b-5350f226342b","Type":"ContainerDied","Data":"e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20"} Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.606974 4971 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-6gsj5" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.607004 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gsj5" event={"ID":"95091f09-6a13-4afa-a29b-5350f226342b","Type":"ContainerDied","Data":"fd9340679ddc1d36dcf097436a69633b16cc99a9763073a8a878dc81958725e9"} Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.607033 4971 scope.go:117] "RemoveContainer" containerID="e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.627860 4971 scope.go:117] "RemoveContainer" containerID="cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.654869 4971 scope.go:117] "RemoveContainer" containerID="7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.664629 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6gsj5"] Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.673724 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6gsj5"] Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.701481 4971 scope.go:117] "RemoveContainer" containerID="e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20" Mar 20 07:20:39 crc kubenswrapper[4971]: E0320 07:20:39.701862 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20\": container with ID starting with e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20 not found: ID does not exist" containerID="e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.701896 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20"} err="failed to get container status \"e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20\": rpc error: code = NotFound desc = could not find container \"e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20\": container with ID starting with e68e0e88e1e2f86045ed375c0082fa503cf28792882d8fbd9dcdb9a5c0863b20 not found: ID does not exist" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.701919 4971 scope.go:117] "RemoveContainer" containerID="cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e" Mar 20 07:20:39 crc kubenswrapper[4971]: E0320 07:20:39.702360 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e\": container with ID starting with cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e not found: ID does not exist" containerID="cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.702388 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e"} err="failed to get container status \"cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e\": rpc error: code = NotFound desc = could not find container \"cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e\": container with ID starting with cde99eb762d0cf5b02ad10096e1381075108e9c879a8f205720fa4ee14fad39e not found: ID does not exist" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.702405 4971 scope.go:117] "RemoveContainer" containerID="7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150" Mar 20 07:20:39 crc kubenswrapper[4971]: E0320 
07:20:39.702860 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150\": container with ID starting with 7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150 not found: ID does not exist" containerID="7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150" Mar 20 07:20:39 crc kubenswrapper[4971]: I0320 07:20:39.702889 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150"} err="failed to get container status \"7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150\": rpc error: code = NotFound desc = could not find container \"7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150\": container with ID starting with 7f57391f7f6928aa4e481ff2cb676b12aaccfff7fd3cc1e6d758fa6007f2d150 not found: ID does not exist" Mar 20 07:20:40 crc kubenswrapper[4971]: I0320 07:20:40.742975 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95091f09-6a13-4afa-a29b-5350f226342b" path="/var/lib/kubelet/pods/95091f09-6a13-4afa-a29b-5350f226342b/volumes" Mar 20 07:20:42 crc kubenswrapper[4971]: I0320 07:20:42.253113 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:42 crc kubenswrapper[4971]: I0320 07:20:42.333281 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:42 crc kubenswrapper[4971]: I0320 07:20:42.692258 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdb"] Mar 20 07:20:42 crc kubenswrapper[4971]: I0320 07:20:42.732347 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 
07:20:42 crc kubenswrapper[4971]: E0320 07:20:42.732646 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:20:43 crc kubenswrapper[4971]: I0320 07:20:43.652661 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbjdb" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="registry-server" containerID="cri-o://8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0" gracePeriod=2 Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.023198 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.167273 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-utilities\") pod \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.167391 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ktv5\" (UniqueName: \"kubernetes.io/projected/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-kube-api-access-8ktv5\") pod \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.167442 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-catalog-content\") pod \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\" (UID: \"7a80b6c9-55ce-48a2-a70e-c29a6c61770c\") " Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.168021 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-utilities" (OuterVolumeSpecName: "utilities") pod "7a80b6c9-55ce-48a2-a70e-c29a6c61770c" (UID: "7a80b6c9-55ce-48a2-a70e-c29a6c61770c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.172798 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-kube-api-access-8ktv5" (OuterVolumeSpecName: "kube-api-access-8ktv5") pod "7a80b6c9-55ce-48a2-a70e-c29a6c61770c" (UID: "7a80b6c9-55ce-48a2-a70e-c29a6c61770c"). InnerVolumeSpecName "kube-api-access-8ktv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.269581 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.269845 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ktv5\" (UniqueName: \"kubernetes.io/projected/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-kube-api-access-8ktv5\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.352202 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a80b6c9-55ce-48a2-a70e-c29a6c61770c" (UID: "7a80b6c9-55ce-48a2-a70e-c29a6c61770c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.371234 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a80b6c9-55ce-48a2-a70e-c29a6c61770c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.662093 4971 generic.go:334] "Generic (PLEG): container finished" podID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerID="8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0" exitCode=0 Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.662153 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdb" event={"ID":"7a80b6c9-55ce-48a2-a70e-c29a6c61770c","Type":"ContainerDied","Data":"8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0"} Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.662191 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdb" event={"ID":"7a80b6c9-55ce-48a2-a70e-c29a6c61770c","Type":"ContainerDied","Data":"aa80a865dc8503cda5f750d987ff0ded9efe439f8dd02c6f1651e327aa3e3367"} Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.662221 4971 scope.go:117] "RemoveContainer" containerID="8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.662415 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbjdb" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.693152 4971 scope.go:117] "RemoveContainer" containerID="ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.711752 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdb"] Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.718653 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdb"] Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.736175 4971 scope.go:117] "RemoveContainer" containerID="f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.748602 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" path="/var/lib/kubelet/pods/7a80b6c9-55ce-48a2-a70e-c29a6c61770c/volumes" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.755841 4971 scope.go:117] "RemoveContainer" containerID="8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0" Mar 20 07:20:44 crc kubenswrapper[4971]: E0320 07:20:44.757016 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0\": container with ID starting with 8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0 not found: ID does not exist" containerID="8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.757057 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0"} err="failed to get container status 
\"8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0\": rpc error: code = NotFound desc = could not find container \"8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0\": container with ID starting with 8b80e992c16892020f23d003719282ade44d52e9ec7908d9338a77cd050961c0 not found: ID does not exist" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.757083 4971 scope.go:117] "RemoveContainer" containerID="ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a" Mar 20 07:20:44 crc kubenswrapper[4971]: E0320 07:20:44.757493 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a\": container with ID starting with ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a not found: ID does not exist" containerID="ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.757541 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a"} err="failed to get container status \"ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a\": rpc error: code = NotFound desc = could not find container \"ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a\": container with ID starting with ecd98156c8c44ede4d584e6e9a65a659d10c39dd289ce237ca5c2f597f46811a not found: ID does not exist" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.757571 4971 scope.go:117] "RemoveContainer" containerID="f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b" Mar 20 07:20:44 crc kubenswrapper[4971]: E0320 07:20:44.757899 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b\": container with ID starting with f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b not found: ID does not exist" containerID="f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b" Mar 20 07:20:44 crc kubenswrapper[4971]: I0320 07:20:44.757933 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b"} err="failed to get container status \"f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b\": rpc error: code = NotFound desc = could not find container \"f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b\": container with ID starting with f69f4055f7576c4b882d811e601c747bec12718b07a82c4f1c7caf8629ed4d1b not found: ID does not exist" Mar 20 07:20:53 crc kubenswrapper[4971]: I0320 07:20:53.647859 4971 scope.go:117] "RemoveContainer" containerID="6196fd736045852e54d49d69c1722264de6f657acdbf76992e58582e67c832e2" Mar 20 07:20:53 crc kubenswrapper[4971]: I0320 07:20:53.672856 4971 scope.go:117] "RemoveContainer" containerID="20bb8dee4f6eeefb1b1658b0eb47261977255f1a69893b7d980be7dacb8c4402" Mar 20 07:20:53 crc kubenswrapper[4971]: I0320 07:20:53.727708 4971 scope.go:117] "RemoveContainer" containerID="9382cb2feb2098aa15b65f8346c5de40c2576c9f1856eeea4da89cdb6f2581f4" Mar 20 07:20:56 crc kubenswrapper[4971]: I0320 07:20:56.732679 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:20:56 crc kubenswrapper[4971]: E0320 07:20:56.733320 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:21:07 crc kubenswrapper[4971]: I0320 07:21:07.732977 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:21:07 crc kubenswrapper[4971]: E0320 07:21:07.734240 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:21:20 crc kubenswrapper[4971]: I0320 07:21:20.733802 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:21:21 crc kubenswrapper[4971]: I0320 07:21:21.027389 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"67985db6300a75c1473c9a37158bf883bc3610c2404968488fdc34b9f80b08a0"} Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.168820 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566522-96fh8"] Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170426 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95091f09-6a13-4afa-a29b-5350f226342b" containerName="extract-utilities" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170462 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="95091f09-6a13-4afa-a29b-5350f226342b" containerName="extract-utilities" Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170500 4971 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0976523b-bde5-4229-ae81-a19f9e89380f" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170519 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0976523b-bde5-4229-ae81-a19f9e89380f" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170545 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0976523b-bde5-4229-ae81-a19f9e89380f" containerName="extract-content" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170563 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0976523b-bde5-4229-ae81-a19f9e89380f" containerName="extract-content" Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170590 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95091f09-6a13-4afa-a29b-5350f226342b" containerName="extract-content" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170647 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="95091f09-6a13-4afa-a29b-5350f226342b" containerName="extract-content" Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170671 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170690 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170726 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="extract-content" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170745 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="extract-content" Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170782 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0976523b-bde5-4229-ae81-a19f9e89380f" containerName="extract-utilities" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170801 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0976523b-bde5-4229-ae81-a19f9e89380f" containerName="extract-utilities" Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170827 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="extract-utilities" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170848 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="extract-utilities" Mar 20 07:22:00 crc kubenswrapper[4971]: E0320 07:22:00.170891 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95091f09-6a13-4afa-a29b-5350f226342b" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.170909 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="95091f09-6a13-4afa-a29b-5350f226342b" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.171258 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="95091f09-6a13-4afa-a29b-5350f226342b" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.171295 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a80b6c9-55ce-48a2-a70e-c29a6c61770c" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.171353 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0976523b-bde5-4229-ae81-a19f9e89380f" containerName="registry-server" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.174118 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566522-96fh8"] Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.180771 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566522-96fh8" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.183269 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.185904 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.186123 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.256205 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmmc\" (UniqueName: \"kubernetes.io/projected/e7314142-a7a2-4277-91b0-18a8e874546d-kube-api-access-smmmc\") pod \"auto-csr-approver-29566522-96fh8\" (UID: \"e7314142-a7a2-4277-91b0-18a8e874546d\") " pod="openshift-infra/auto-csr-approver-29566522-96fh8" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.357833 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmmc\" (UniqueName: \"kubernetes.io/projected/e7314142-a7a2-4277-91b0-18a8e874546d-kube-api-access-smmmc\") pod \"auto-csr-approver-29566522-96fh8\" (UID: \"e7314142-a7a2-4277-91b0-18a8e874546d\") " pod="openshift-infra/auto-csr-approver-29566522-96fh8" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.379190 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmmc\" (UniqueName: \"kubernetes.io/projected/e7314142-a7a2-4277-91b0-18a8e874546d-kube-api-access-smmmc\") pod \"auto-csr-approver-29566522-96fh8\" (UID: \"e7314142-a7a2-4277-91b0-18a8e874546d\") " pod="openshift-infra/auto-csr-approver-29566522-96fh8" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.512242 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566522-96fh8" Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.982261 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566522-96fh8"] Mar 20 07:22:00 crc kubenswrapper[4971]: I0320 07:22:00.989218 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:22:01 crc kubenswrapper[4971]: I0320 07:22:01.409639 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566522-96fh8" event={"ID":"e7314142-a7a2-4277-91b0-18a8e874546d","Type":"ContainerStarted","Data":"c2f200538bc0220e13f0f4420d6195b520f4834cc4a56bc4a2ffaf071ef4dc2f"} Mar 20 07:22:03 crc kubenswrapper[4971]: I0320 07:22:03.429077 4971 generic.go:334] "Generic (PLEG): container finished" podID="e7314142-a7a2-4277-91b0-18a8e874546d" containerID="364c7420fdd4b13846d7fea70fdb67e79c3e13127eceb646aa0ae3cd4baa0c39" exitCode=0 Mar 20 07:22:03 crc kubenswrapper[4971]: I0320 07:22:03.429187 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566522-96fh8" event={"ID":"e7314142-a7a2-4277-91b0-18a8e874546d","Type":"ContainerDied","Data":"364c7420fdd4b13846d7fea70fdb67e79c3e13127eceb646aa0ae3cd4baa0c39"} Mar 20 07:22:04 crc kubenswrapper[4971]: I0320 07:22:04.818927 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566522-96fh8" Mar 20 07:22:04 crc kubenswrapper[4971]: I0320 07:22:04.926360 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smmmc\" (UniqueName: \"kubernetes.io/projected/e7314142-a7a2-4277-91b0-18a8e874546d-kube-api-access-smmmc\") pod \"e7314142-a7a2-4277-91b0-18a8e874546d\" (UID: \"e7314142-a7a2-4277-91b0-18a8e874546d\") " Mar 20 07:22:04 crc kubenswrapper[4971]: I0320 07:22:04.933669 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7314142-a7a2-4277-91b0-18a8e874546d-kube-api-access-smmmc" (OuterVolumeSpecName: "kube-api-access-smmmc") pod "e7314142-a7a2-4277-91b0-18a8e874546d" (UID: "e7314142-a7a2-4277-91b0-18a8e874546d"). InnerVolumeSpecName "kube-api-access-smmmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:22:05 crc kubenswrapper[4971]: I0320 07:22:05.027558 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smmmc\" (UniqueName: \"kubernetes.io/projected/e7314142-a7a2-4277-91b0-18a8e874546d-kube-api-access-smmmc\") on node \"crc\" DevicePath \"\"" Mar 20 07:22:05 crc kubenswrapper[4971]: I0320 07:22:05.465343 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566522-96fh8" event={"ID":"e7314142-a7a2-4277-91b0-18a8e874546d","Type":"ContainerDied","Data":"c2f200538bc0220e13f0f4420d6195b520f4834cc4a56bc4a2ffaf071ef4dc2f"} Mar 20 07:22:05 crc kubenswrapper[4971]: I0320 07:22:05.465407 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f200538bc0220e13f0f4420d6195b520f4834cc4a56bc4a2ffaf071ef4dc2f" Mar 20 07:22:05 crc kubenswrapper[4971]: I0320 07:22:05.465417 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566522-96fh8" Mar 20 07:22:05 crc kubenswrapper[4971]: I0320 07:22:05.910277 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566516-8vx7v"] Mar 20 07:22:05 crc kubenswrapper[4971]: I0320 07:22:05.922025 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566516-8vx7v"] Mar 20 07:22:06 crc kubenswrapper[4971]: I0320 07:22:06.748378 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e1b3f7-6850-4b2e-b692-ab96d597b3ed" path="/var/lib/kubelet/pods/a0e1b3f7-6850-4b2e-b692-ab96d597b3ed/volumes" Mar 20 07:22:53 crc kubenswrapper[4971]: I0320 07:22:53.919997 4971 scope.go:117] "RemoveContainer" containerID="7831d892345d390eff12e03d1886c71737d9b5e1b0545506bc2d92be4cbe6494" Mar 20 07:23:20 crc kubenswrapper[4971]: I0320 07:23:20.162820 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:23:20 crc kubenswrapper[4971]: I0320 07:23:20.165259 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:23:50 crc kubenswrapper[4971]: I0320 07:23:50.162597 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:23:50 crc kubenswrapper[4971]: 
I0320 07:23:50.163331 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.147951 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566524-rkrcp"] Mar 20 07:24:00 crc kubenswrapper[4971]: E0320 07:24:00.149062 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7314142-a7a2-4277-91b0-18a8e874546d" containerName="oc" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.149092 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7314142-a7a2-4277-91b0-18a8e874546d" containerName="oc" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.149446 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7314142-a7a2-4277-91b0-18a8e874546d" containerName="oc" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.150408 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.155064 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.155491 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.157504 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.160344 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566524-rkrcp"] Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.322948 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhvrp\" (UniqueName: \"kubernetes.io/projected/e60a87d3-55d4-4bfa-9b11-4c105cf613cc-kube-api-access-mhvrp\") pod \"auto-csr-approver-29566524-rkrcp\" (UID: \"e60a87d3-55d4-4bfa-9b11-4c105cf613cc\") " pod="openshift-infra/auto-csr-approver-29566524-rkrcp" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.423920 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvrp\" (UniqueName: \"kubernetes.io/projected/e60a87d3-55d4-4bfa-9b11-4c105cf613cc-kube-api-access-mhvrp\") pod \"auto-csr-approver-29566524-rkrcp\" (UID: \"e60a87d3-55d4-4bfa-9b11-4c105cf613cc\") " pod="openshift-infra/auto-csr-approver-29566524-rkrcp" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.446467 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhvrp\" (UniqueName: \"kubernetes.io/projected/e60a87d3-55d4-4bfa-9b11-4c105cf613cc-kube-api-access-mhvrp\") pod \"auto-csr-approver-29566524-rkrcp\" (UID: \"e60a87d3-55d4-4bfa-9b11-4c105cf613cc\") " 
pod="openshift-infra/auto-csr-approver-29566524-rkrcp" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.494663 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" Mar 20 07:24:00 crc kubenswrapper[4971]: I0320 07:24:00.950935 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566524-rkrcp"] Mar 20 07:24:01 crc kubenswrapper[4971]: I0320 07:24:01.639993 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" event={"ID":"e60a87d3-55d4-4bfa-9b11-4c105cf613cc","Type":"ContainerStarted","Data":"c00d6523a3e27b7439b2cb5104c73fe2c312d7ac9a115c9a41520f764b9776bc"} Mar 20 07:24:02 crc kubenswrapper[4971]: I0320 07:24:02.673442 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" event={"ID":"e60a87d3-55d4-4bfa-9b11-4c105cf613cc","Type":"ContainerStarted","Data":"204a3cd7c64dc1ccfc2b91d007f10f29b87d10258c56c2733f640c25ff176670"} Mar 20 07:24:02 crc kubenswrapper[4971]: I0320 07:24:02.695928 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" podStartSLOduration=1.503539962 podStartE2EDuration="2.695909015s" podCreationTimestamp="2026-03-20 07:24:00 +0000 UTC" firstStartedPulling="2026-03-20 07:24:00.958311393 +0000 UTC m=+2062.938185571" lastFinishedPulling="2026-03-20 07:24:02.150680476 +0000 UTC m=+2064.130554624" observedRunningTime="2026-03-20 07:24:02.69184278 +0000 UTC m=+2064.671716918" watchObservedRunningTime="2026-03-20 07:24:02.695909015 +0000 UTC m=+2064.675783163" Mar 20 07:24:03 crc kubenswrapper[4971]: I0320 07:24:03.685428 4971 generic.go:334] "Generic (PLEG): container finished" podID="e60a87d3-55d4-4bfa-9b11-4c105cf613cc" containerID="204a3cd7c64dc1ccfc2b91d007f10f29b87d10258c56c2733f640c25ff176670" exitCode=0 Mar 20 07:24:03 crc 
kubenswrapper[4971]: I0320 07:24:03.685494 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" event={"ID":"e60a87d3-55d4-4bfa-9b11-4c105cf613cc","Type":"ContainerDied","Data":"204a3cd7c64dc1ccfc2b91d007f10f29b87d10258c56c2733f640c25ff176670"} Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.087188 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.112064 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhvrp\" (UniqueName: \"kubernetes.io/projected/e60a87d3-55d4-4bfa-9b11-4c105cf613cc-kube-api-access-mhvrp\") pod \"e60a87d3-55d4-4bfa-9b11-4c105cf613cc\" (UID: \"e60a87d3-55d4-4bfa-9b11-4c105cf613cc\") " Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.121708 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e60a87d3-55d4-4bfa-9b11-4c105cf613cc-kube-api-access-mhvrp" (OuterVolumeSpecName: "kube-api-access-mhvrp") pod "e60a87d3-55d4-4bfa-9b11-4c105cf613cc" (UID: "e60a87d3-55d4-4bfa-9b11-4c105cf613cc"). InnerVolumeSpecName "kube-api-access-mhvrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.212934 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhvrp\" (UniqueName: \"kubernetes.io/projected/e60a87d3-55d4-4bfa-9b11-4c105cf613cc-kube-api-access-mhvrp\") on node \"crc\" DevicePath \"\"" Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.704918 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" event={"ID":"e60a87d3-55d4-4bfa-9b11-4c105cf613cc","Type":"ContainerDied","Data":"c00d6523a3e27b7439b2cb5104c73fe2c312d7ac9a115c9a41520f764b9776bc"} Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.705347 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00d6523a3e27b7439b2cb5104c73fe2c312d7ac9a115c9a41520f764b9776bc" Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.705054 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566524-rkrcp" Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.783933 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566518-ckb5h"] Mar 20 07:24:05 crc kubenswrapper[4971]: I0320 07:24:05.793761 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566518-ckb5h"] Mar 20 07:24:06 crc kubenswrapper[4971]: I0320 07:24:06.750964 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7c341a-b390-40e0-899e-6cf925e5280d" path="/var/lib/kubelet/pods/5c7c341a-b390-40e0-899e-6cf925e5280d/volumes" Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.163065 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.164026 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.164114 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.165264 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67985db6300a75c1473c9a37158bf883bc3610c2404968488fdc34b9f80b08a0"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.165420 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://67985db6300a75c1473c9a37158bf883bc3610c2404968488fdc34b9f80b08a0" gracePeriod=600 Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.867753 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="67985db6300a75c1473c9a37158bf883bc3610c2404968488fdc34b9f80b08a0" exitCode=0 Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.867835 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"67985db6300a75c1473c9a37158bf883bc3610c2404968488fdc34b9f80b08a0"} Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.868098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"} Mar 20 07:24:20 crc kubenswrapper[4971]: I0320 07:24:20.868131 4971 scope.go:117] "RemoveContainer" containerID="0d76c711da5cb6a39ec9395f7a9d86e2671b04a8f185bea088275be6dbc9a85d" Mar 20 07:24:54 crc kubenswrapper[4971]: I0320 07:24:54.030381 4971 scope.go:117] "RemoveContainer" containerID="fa37d66caca9e3742d097cc40b4fc7d60ff365982e423295b793998c7a381052" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.152475 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566526-sh7kc"] Mar 20 07:26:00 crc kubenswrapper[4971]: E0320 07:26:00.155122 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60a87d3-55d4-4bfa-9b11-4c105cf613cc" containerName="oc" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.155244 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60a87d3-55d4-4bfa-9b11-4c105cf613cc" containerName="oc" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.155575 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e60a87d3-55d4-4bfa-9b11-4c105cf613cc" containerName="oc" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.156285 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566526-sh7kc" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.162205 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.162526 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.163303 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.220381 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566526-sh7kc"] Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.271365 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4p2v\" (UniqueName: \"kubernetes.io/projected/0ee7182c-6612-4722-903b-68fb1657e563-kube-api-access-k4p2v\") pod \"auto-csr-approver-29566526-sh7kc\" (UID: \"0ee7182c-6612-4722-903b-68fb1657e563\") " pod="openshift-infra/auto-csr-approver-29566526-sh7kc" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.372763 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4p2v\" (UniqueName: \"kubernetes.io/projected/0ee7182c-6612-4722-903b-68fb1657e563-kube-api-access-k4p2v\") pod \"auto-csr-approver-29566526-sh7kc\" (UID: \"0ee7182c-6612-4722-903b-68fb1657e563\") " pod="openshift-infra/auto-csr-approver-29566526-sh7kc" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.395852 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4p2v\" (UniqueName: \"kubernetes.io/projected/0ee7182c-6612-4722-903b-68fb1657e563-kube-api-access-k4p2v\") pod \"auto-csr-approver-29566526-sh7kc\" (UID: \"0ee7182c-6612-4722-903b-68fb1657e563\") " 
pod="openshift-infra/auto-csr-approver-29566526-sh7kc" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.539204 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566526-sh7kc" Mar 20 07:26:00 crc kubenswrapper[4971]: I0320 07:26:00.971462 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566526-sh7kc"] Mar 20 07:26:01 crc kubenswrapper[4971]: I0320 07:26:01.855014 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566526-sh7kc" event={"ID":"0ee7182c-6612-4722-903b-68fb1657e563","Type":"ContainerStarted","Data":"12be0e1e6a9bcaedf82fa3156ca67a1cd0ec1a863897231641c10f8fbf1c6abd"} Mar 20 07:26:02 crc kubenswrapper[4971]: I0320 07:26:02.863871 4971 generic.go:334] "Generic (PLEG): container finished" podID="0ee7182c-6612-4722-903b-68fb1657e563" containerID="d258479291395756cd521e2ed47075d6a91361e0022659674da12eac788898ae" exitCode=0 Mar 20 07:26:02 crc kubenswrapper[4971]: I0320 07:26:02.863961 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566526-sh7kc" event={"ID":"0ee7182c-6612-4722-903b-68fb1657e563","Type":"ContainerDied","Data":"d258479291395756cd521e2ed47075d6a91361e0022659674da12eac788898ae"} Mar 20 07:26:04 crc kubenswrapper[4971]: I0320 07:26:04.220177 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566526-sh7kc" Mar 20 07:26:04 crc kubenswrapper[4971]: I0320 07:26:04.326367 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4p2v\" (UniqueName: \"kubernetes.io/projected/0ee7182c-6612-4722-903b-68fb1657e563-kube-api-access-k4p2v\") pod \"0ee7182c-6612-4722-903b-68fb1657e563\" (UID: \"0ee7182c-6612-4722-903b-68fb1657e563\") " Mar 20 07:26:04 crc kubenswrapper[4971]: I0320 07:26:04.332288 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee7182c-6612-4722-903b-68fb1657e563-kube-api-access-k4p2v" (OuterVolumeSpecName: "kube-api-access-k4p2v") pod "0ee7182c-6612-4722-903b-68fb1657e563" (UID: "0ee7182c-6612-4722-903b-68fb1657e563"). InnerVolumeSpecName "kube-api-access-k4p2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:26:04 crc kubenswrapper[4971]: I0320 07:26:04.429176 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4p2v\" (UniqueName: \"kubernetes.io/projected/0ee7182c-6612-4722-903b-68fb1657e563-kube-api-access-k4p2v\") on node \"crc\" DevicePath \"\"" Mar 20 07:26:04 crc kubenswrapper[4971]: I0320 07:26:04.884583 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566526-sh7kc" event={"ID":"0ee7182c-6612-4722-903b-68fb1657e563","Type":"ContainerDied","Data":"12be0e1e6a9bcaedf82fa3156ca67a1cd0ec1a863897231641c10f8fbf1c6abd"} Mar 20 07:26:04 crc kubenswrapper[4971]: I0320 07:26:04.884636 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12be0e1e6a9bcaedf82fa3156ca67a1cd0ec1a863897231641c10f8fbf1c6abd" Mar 20 07:26:04 crc kubenswrapper[4971]: I0320 07:26:04.884695 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566526-sh7kc" Mar 20 07:26:05 crc kubenswrapper[4971]: I0320 07:26:05.318498 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566520-sbc7k"] Mar 20 07:26:05 crc kubenswrapper[4971]: I0320 07:26:05.326719 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566520-sbc7k"] Mar 20 07:26:06 crc kubenswrapper[4971]: I0320 07:26:06.747581 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c3f735-4efd-4d7e-a019-6cf5be7862f1" path="/var/lib/kubelet/pods/d9c3f735-4efd-4d7e-a019-6cf5be7862f1/volumes" Mar 20 07:26:20 crc kubenswrapper[4971]: I0320 07:26:20.162340 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:26:20 crc kubenswrapper[4971]: I0320 07:26:20.163075 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:26:50 crc kubenswrapper[4971]: I0320 07:26:50.162121 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:26:50 crc kubenswrapper[4971]: I0320 07:26:50.162953 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:26:54 crc kubenswrapper[4971]: I0320 07:26:54.151243 4971 scope.go:117] "RemoveContainer" containerID="007d81326095205ca7232b97e22a62bf3c678b744d564ebee485b549c04fd283" Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.162533 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.163342 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.163404 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.164127 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.164220 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" gracePeriod=600 Mar 20 07:27:20 crc kubenswrapper[4971]: E0320 07:27:20.298779 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.554499 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" exitCode=0 Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.554568 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"} Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.554651 4971 scope.go:117] "RemoveContainer" containerID="67985db6300a75c1473c9a37158bf883bc3610c2404968488fdc34b9f80b08a0" Mar 20 07:27:20 crc kubenswrapper[4971]: I0320 07:27:20.555878 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:27:20 crc kubenswrapper[4971]: E0320 07:27:20.556458 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:27:33 crc kubenswrapper[4971]: I0320 07:27:33.732848 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:27:33 crc kubenswrapper[4971]: E0320 07:27:33.733734 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:27:48 crc kubenswrapper[4971]: I0320 07:27:48.744890 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:27:48 crc kubenswrapper[4971]: E0320 07:27:48.746013 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.150333 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566528-67gds"] Mar 20 07:28:00 crc kubenswrapper[4971]: E0320 07:28:00.151328 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee7182c-6612-4722-903b-68fb1657e563" containerName="oc" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.151348 4971 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee7182c-6612-4722-903b-68fb1657e563" containerName="oc" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.151513 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee7182c-6612-4722-903b-68fb1657e563" containerName="oc" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.152116 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566528-67gds" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.154886 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.154977 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.155516 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.159991 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566528-67gds"] Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.232267 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnwdt\" (UniqueName: \"kubernetes.io/projected/19457825-8ae2-40f8-b5b3-b939be1cbbfa-kube-api-access-gnwdt\") pod \"auto-csr-approver-29566528-67gds\" (UID: \"19457825-8ae2-40f8-b5b3-b939be1cbbfa\") " pod="openshift-infra/auto-csr-approver-29566528-67gds" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.334655 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnwdt\" (UniqueName: \"kubernetes.io/projected/19457825-8ae2-40f8-b5b3-b939be1cbbfa-kube-api-access-gnwdt\") pod \"auto-csr-approver-29566528-67gds\" (UID: \"19457825-8ae2-40f8-b5b3-b939be1cbbfa\") " 
pod="openshift-infra/auto-csr-approver-29566528-67gds" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.359351 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnwdt\" (UniqueName: \"kubernetes.io/projected/19457825-8ae2-40f8-b5b3-b939be1cbbfa-kube-api-access-gnwdt\") pod \"auto-csr-approver-29566528-67gds\" (UID: \"19457825-8ae2-40f8-b5b3-b939be1cbbfa\") " pod="openshift-infra/auto-csr-approver-29566528-67gds" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.495669 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566528-67gds" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.732391 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:28:00 crc kubenswrapper[4971]: E0320 07:28:00.732599 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.973268 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566528-67gds"] Mar 20 07:28:00 crc kubenswrapper[4971]: I0320 07:28:00.986268 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:28:01 crc kubenswrapper[4971]: I0320 07:28:01.008779 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566528-67gds" event={"ID":"19457825-8ae2-40f8-b5b3-b939be1cbbfa","Type":"ContainerStarted","Data":"1af3240fd323086291108af1c3bb22da7b3f0e047374632dfabaa7719c925caf"} Mar 20 
07:28:03 crc kubenswrapper[4971]: I0320 07:28:03.024134 4971 generic.go:334] "Generic (PLEG): container finished" podID="19457825-8ae2-40f8-b5b3-b939be1cbbfa" containerID="f22b74d0d4523d5c0076fcb1fe68d737f3123f881309a7d13e8fed3cc654ce56" exitCode=0 Mar 20 07:28:03 crc kubenswrapper[4971]: I0320 07:28:03.024186 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566528-67gds" event={"ID":"19457825-8ae2-40f8-b5b3-b939be1cbbfa","Type":"ContainerDied","Data":"f22b74d0d4523d5c0076fcb1fe68d737f3123f881309a7d13e8fed3cc654ce56"} Mar 20 07:28:04 crc kubenswrapper[4971]: I0320 07:28:04.386100 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566528-67gds" Mar 20 07:28:04 crc kubenswrapper[4971]: I0320 07:28:04.496201 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnwdt\" (UniqueName: \"kubernetes.io/projected/19457825-8ae2-40f8-b5b3-b939be1cbbfa-kube-api-access-gnwdt\") pod \"19457825-8ae2-40f8-b5b3-b939be1cbbfa\" (UID: \"19457825-8ae2-40f8-b5b3-b939be1cbbfa\") " Mar 20 07:28:04 crc kubenswrapper[4971]: I0320 07:28:04.502779 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19457825-8ae2-40f8-b5b3-b939be1cbbfa-kube-api-access-gnwdt" (OuterVolumeSpecName: "kube-api-access-gnwdt") pod "19457825-8ae2-40f8-b5b3-b939be1cbbfa" (UID: "19457825-8ae2-40f8-b5b3-b939be1cbbfa"). InnerVolumeSpecName "kube-api-access-gnwdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:28:04 crc kubenswrapper[4971]: I0320 07:28:04.598096 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnwdt\" (UniqueName: \"kubernetes.io/projected/19457825-8ae2-40f8-b5b3-b939be1cbbfa-kube-api-access-gnwdt\") on node \"crc\" DevicePath \"\"" Mar 20 07:28:05 crc kubenswrapper[4971]: I0320 07:28:05.044917 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566528-67gds" event={"ID":"19457825-8ae2-40f8-b5b3-b939be1cbbfa","Type":"ContainerDied","Data":"1af3240fd323086291108af1c3bb22da7b3f0e047374632dfabaa7719c925caf"} Mar 20 07:28:05 crc kubenswrapper[4971]: I0320 07:28:05.045289 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af3240fd323086291108af1c3bb22da7b3f0e047374632dfabaa7719c925caf" Mar 20 07:28:05 crc kubenswrapper[4971]: I0320 07:28:05.045082 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566528-67gds" Mar 20 07:28:05 crc kubenswrapper[4971]: I0320 07:28:05.479921 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566522-96fh8"] Mar 20 07:28:05 crc kubenswrapper[4971]: I0320 07:28:05.487771 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566522-96fh8"] Mar 20 07:28:06 crc kubenswrapper[4971]: I0320 07:28:06.747785 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7314142-a7a2-4277-91b0-18a8e874546d" path="/var/lib/kubelet/pods/e7314142-a7a2-4277-91b0-18a8e874546d/volumes" Mar 20 07:28:13 crc kubenswrapper[4971]: I0320 07:28:13.732421 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:28:13 crc kubenswrapper[4971]: E0320 07:28:13.733553 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:28:25 crc kubenswrapper[4971]: I0320 07:28:25.732252 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:28:25 crc kubenswrapper[4971]: E0320 07:28:25.733135 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:28:30 crc kubenswrapper[4971]: I0320 07:28:30.803289 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-czgll"] Mar 20 07:28:30 crc kubenswrapper[4971]: E0320 07:28:30.804537 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19457825-8ae2-40f8-b5b3-b939be1cbbfa" containerName="oc" Mar 20 07:28:30 crc kubenswrapper[4971]: I0320 07:28:30.804560 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="19457825-8ae2-40f8-b5b3-b939be1cbbfa" containerName="oc" Mar 20 07:28:30 crc kubenswrapper[4971]: I0320 07:28:30.804856 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="19457825-8ae2-40f8-b5b3-b939be1cbbfa" containerName="oc" Mar 20 07:28:30 crc kubenswrapper[4971]: I0320 07:28:30.806774 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:30 crc kubenswrapper[4971]: I0320 07:28:30.818599 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-czgll"] Mar 20 07:28:30 crc kubenswrapper[4971]: I0320 07:28:30.934043 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-catalog-content\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:30 crc kubenswrapper[4971]: I0320 07:28:30.934100 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-utilities\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:30 crc kubenswrapper[4971]: I0320 07:28:30.934128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwz9g\" (UniqueName: \"kubernetes.io/projected/c0903fd3-20bd-448e-a264-d4e0960a10c8-kube-api-access-hwz9g\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:31 crc kubenswrapper[4971]: I0320 07:28:31.035233 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-utilities\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:31 crc kubenswrapper[4971]: I0320 07:28:31.035288 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-hwz9g\" (UniqueName: \"kubernetes.io/projected/c0903fd3-20bd-448e-a264-d4e0960a10c8-kube-api-access-hwz9g\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:31 crc kubenswrapper[4971]: I0320 07:28:31.035372 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-catalog-content\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:31 crc kubenswrapper[4971]: I0320 07:28:31.035887 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-catalog-content\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:31 crc kubenswrapper[4971]: I0320 07:28:31.036082 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-utilities\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:31 crc kubenswrapper[4971]: I0320 07:28:31.059562 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwz9g\" (UniqueName: \"kubernetes.io/projected/c0903fd3-20bd-448e-a264-d4e0960a10c8-kube-api-access-hwz9g\") pod \"redhat-marketplace-czgll\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") " pod="openshift-marketplace/redhat-marketplace-czgll" Mar 20 07:28:31 crc kubenswrapper[4971]: I0320 07:28:31.172148 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czgll"
Mar 20 07:28:31 crc kubenswrapper[4971]: I0320 07:28:31.399524 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-czgll"]
Mar 20 07:28:32 crc kubenswrapper[4971]: I0320 07:28:32.311279 4971 generic.go:334] "Generic (PLEG): container finished" podID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerID="7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba" exitCode=0
Mar 20 07:28:32 crc kubenswrapper[4971]: I0320 07:28:32.311580 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czgll" event={"ID":"c0903fd3-20bd-448e-a264-d4e0960a10c8","Type":"ContainerDied","Data":"7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba"}
Mar 20 07:28:32 crc kubenswrapper[4971]: I0320 07:28:32.311642 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czgll" event={"ID":"c0903fd3-20bd-448e-a264-d4e0960a10c8","Type":"ContainerStarted","Data":"81dd3edec81d99393552c21083bc3b8c406c9109b64e29c629fe31a807166965"}
Mar 20 07:28:33 crc kubenswrapper[4971]: I0320 07:28:33.322789 4971 generic.go:334] "Generic (PLEG): container finished" podID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerID="cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7" exitCode=0
Mar 20 07:28:33 crc kubenswrapper[4971]: I0320 07:28:33.322837 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czgll" event={"ID":"c0903fd3-20bd-448e-a264-d4e0960a10c8","Type":"ContainerDied","Data":"cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7"}
Mar 20 07:28:34 crc kubenswrapper[4971]: I0320 07:28:34.335480 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czgll"
event={"ID":"c0903fd3-20bd-448e-a264-d4e0960a10c8","Type":"ContainerStarted","Data":"c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2"}
Mar 20 07:28:34 crc kubenswrapper[4971]: I0320 07:28:34.362211 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-czgll" podStartSLOduration=2.9490945589999997 podStartE2EDuration="4.362186606s" podCreationTimestamp="2026-03-20 07:28:30 +0000 UTC" firstStartedPulling="2026-03-20 07:28:32.313697407 +0000 UTC m=+2334.293571555" lastFinishedPulling="2026-03-20 07:28:33.726789434 +0000 UTC m=+2335.706663602" observedRunningTime="2026-03-20 07:28:34.356501377 +0000 UTC m=+2336.336375555" watchObservedRunningTime="2026-03-20 07:28:34.362186606 +0000 UTC m=+2336.342060764"
Mar 20 07:28:36 crc kubenswrapper[4971]: I0320 07:28:36.732006 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"
Mar 20 07:28:36 crc kubenswrapper[4971]: E0320 07:28:36.732695 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:28:41 crc kubenswrapper[4971]: I0320 07:28:41.172947 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-czgll"
Mar 20 07:28:41 crc kubenswrapper[4971]: I0320 07:28:41.173307 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-czgll"
Mar 20 07:28:41 crc kubenswrapper[4971]: I0320 07:28:41.236106 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-marketplace/redhat-marketplace-czgll"
Mar 20 07:28:41 crc kubenswrapper[4971]: I0320 07:28:41.485663 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-czgll"
Mar 20 07:28:41 crc kubenswrapper[4971]: I0320 07:28:41.552878 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-czgll"]
Mar 20 07:28:43 crc kubenswrapper[4971]: I0320 07:28:43.430843 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-czgll" podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerName="registry-server" containerID="cri-o://c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2" gracePeriod=2
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.145246 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czgll"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.258101 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-utilities\") pod \"c0903fd3-20bd-448e-a264-d4e0960a10c8\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") "
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.258243 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwz9g\" (UniqueName: \"kubernetes.io/projected/c0903fd3-20bd-448e-a264-d4e0960a10c8-kube-api-access-hwz9g\") pod \"c0903fd3-20bd-448e-a264-d4e0960a10c8\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") "
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.258320 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-catalog-content\") pod
\"c0903fd3-20bd-448e-a264-d4e0960a10c8\" (UID: \"c0903fd3-20bd-448e-a264-d4e0960a10c8\") "
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.259027 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-utilities" (OuterVolumeSpecName: "utilities") pod "c0903fd3-20bd-448e-a264-d4e0960a10c8" (UID: "c0903fd3-20bd-448e-a264-d4e0960a10c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.267784 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0903fd3-20bd-448e-a264-d4e0960a10c8-kube-api-access-hwz9g" (OuterVolumeSpecName: "kube-api-access-hwz9g") pod "c0903fd3-20bd-448e-a264-d4e0960a10c8" (UID: "c0903fd3-20bd-448e-a264-d4e0960a10c8"). InnerVolumeSpecName "kube-api-access-hwz9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.360741 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.360812 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwz9g\" (UniqueName: \"kubernetes.io/projected/c0903fd3-20bd-448e-a264-d4e0960a10c8-kube-api-access-hwz9g\") on node \"crc\" DevicePath \"\""
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.405911 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0903fd3-20bd-448e-a264-d4e0960a10c8" (UID: "c0903fd3-20bd-448e-a264-d4e0960a10c8"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.444175 4971 generic.go:334] "Generic (PLEG): container finished" podID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerID="c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2" exitCode=0
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.444218 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czgll" event={"ID":"c0903fd3-20bd-448e-a264-d4e0960a10c8","Type":"ContainerDied","Data":"c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2"}
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.444245 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czgll" event={"ID":"c0903fd3-20bd-448e-a264-d4e0960a10c8","Type":"ContainerDied","Data":"81dd3edec81d99393552c21083bc3b8c406c9109b64e29c629fe31a807166965"}
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.444260 4971 scope.go:117] "RemoveContainer" containerID="c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.444544 4971 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czgll"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.461697 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0903fd3-20bd-448e-a264-d4e0960a10c8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.481049 4971 scope.go:117] "RemoveContainer" containerID="cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.498359 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-czgll"]
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.504774 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-czgll"]
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.519285 4971 scope.go:117] "RemoveContainer" containerID="7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.538619 4971 scope.go:117] "RemoveContainer" containerID="c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2"
Mar 20 07:28:44 crc kubenswrapper[4971]: E0320 07:28:44.539285 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2\": container with ID starting with c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2 not found: ID does not exist" containerID="c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.539323 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2"} err="failed to get container status
\"c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2\": rpc error: code = NotFound desc = could not find container \"c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2\": container with ID starting with c855589a2ef61139e905b5334b9c926e43a056e3f95f513e3a514de0891076d2 not found: ID does not exist"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.539352 4971 scope.go:117] "RemoveContainer" containerID="cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7"
Mar 20 07:28:44 crc kubenswrapper[4971]: E0320 07:28:44.539925 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7\": container with ID starting with cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7 not found: ID does not exist" containerID="cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.539970 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7"} err="failed to get container status \"cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7\": rpc error: code = NotFound desc = could not find container \"cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7\": container with ID starting with cabf7b78951fcf54442814322e806f9debcc61425f4e25723e6ed10dee41f8e7 not found: ID does not exist"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.540001 4971 scope.go:117] "RemoveContainer" containerID="7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba"
Mar 20 07:28:44 crc kubenswrapper[4971]: E0320 07:28:44.540369 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba\": container with ID starting with 7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba not found: ID does not exist" containerID="7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.540396 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba"} err="failed to get container status \"7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba\": rpc error: code = NotFound desc = could not find container \"7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba\": container with ID starting with 7cd3b01c001843ff1cf911197f74cde7b2d3159d7cad48d23abdf77436d7fbba not found: ID does not exist"
Mar 20 07:28:44 crc kubenswrapper[4971]: I0320 07:28:44.749025 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" path="/var/lib/kubelet/pods/c0903fd3-20bd-448e-a264-d4e0960a10c8/volumes"
Mar 20 07:28:51 crc kubenswrapper[4971]: I0320 07:28:51.732909 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"
Mar 20 07:28:51 crc kubenswrapper[4971]: E0320 07:28:51.733670 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:28:54 crc kubenswrapper[4971]: I0320 07:28:54.256664 4971 scope.go:117] "RemoveContainer" containerID="364c7420fdd4b13846d7fea70fdb67e79c3e13127eceb646aa0ae3cd4baa0c39"
Mar 20 07:29:03 crc
kubenswrapper[4971]: I0320 07:29:03.732482 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"
Mar 20 07:29:03 crc kubenswrapper[4971]: E0320 07:29:03.733582 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:29:17 crc kubenswrapper[4971]: I0320 07:29:17.732789 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"
Mar 20 07:29:17 crc kubenswrapper[4971]: E0320 07:29:17.734133 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:29:30 crc kubenswrapper[4971]: I0320 07:29:30.733049 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"
Mar 20 07:29:30 crc kubenswrapper[4971]: E0320 07:29:30.735398 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar
20 07:29:45 crc kubenswrapper[4971]: I0320 07:29:45.732206 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"
Mar 20 07:29:45 crc kubenswrapper[4971]: E0320 07:29:45.732914 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.154842 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"]
Mar 20 07:30:00 crc kubenswrapper[4971]: E0320 07:30:00.156284 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerName="extract-utilities"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.156305 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerName="extract-utilities"
Mar 20 07:30:00 crc kubenswrapper[4971]: E0320 07:30:00.156334 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerName="registry-server"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.156350 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerName="registry-server"
Mar 20 07:30:00 crc kubenswrapper[4971]: E0320 07:30:00.156363 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerName="extract-content"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.156373 4971 state_mem.go:107] "Deleted CPUSet assignment"
podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerName="extract-content"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.156566 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0903fd3-20bd-448e-a264-d4e0960a10c8" containerName="registry-server"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.157143 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.159727 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.159747 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.172726 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566530-vfnhc"]
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.174537 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566530-vfnhc"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.177535 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.178113 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.178122 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.184671 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"]
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.198056 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566530-vfnhc"]
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.306675 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cff34fd3-3552-416f-b2d8-059274d1667b-secret-volume\") pod \"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.306739 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cff34fd3-3552-416f-b2d8-059274d1667b-config-volume\") pod \"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.306821 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"kube-api-access-92zh6\" (UniqueName: \"kubernetes.io/projected/cff34fd3-3552-416f-b2d8-059274d1667b-kube-api-access-92zh6\") pod \"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.306907 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tshj\" (UniqueName: \"kubernetes.io/projected/26312162-6c67-44f8-92d2-9a65b8bbb4d8-kube-api-access-6tshj\") pod \"auto-csr-approver-29566530-vfnhc\" (UID: \"26312162-6c67-44f8-92d2-9a65b8bbb4d8\") " pod="openshift-infra/auto-csr-approver-29566530-vfnhc"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.408248 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cff34fd3-3552-416f-b2d8-059274d1667b-config-volume\") pod \"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.408328 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92zh6\" (UniqueName: \"kubernetes.io/projected/cff34fd3-3552-416f-b2d8-059274d1667b-kube-api-access-92zh6\") pod \"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.408354 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tshj\" (UniqueName: \"kubernetes.io/projected/26312162-6c67-44f8-92d2-9a65b8bbb4d8-kube-api-access-6tshj\") pod \"auto-csr-approver-29566530-vfnhc\" (UID: \"26312162-6c67-44f8-92d2-9a65b8bbb4d8\") "
pod="openshift-infra/auto-csr-approver-29566530-vfnhc"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.408461 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cff34fd3-3552-416f-b2d8-059274d1667b-secret-volume\") pod \"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.409243 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cff34fd3-3552-416f-b2d8-059274d1667b-config-volume\") pod \"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.423933 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cff34fd3-3552-416f-b2d8-059274d1667b-secret-volume\") pod \"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.437032 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tshj\" (UniqueName: \"kubernetes.io/projected/26312162-6c67-44f8-92d2-9a65b8bbb4d8-kube-api-access-6tshj\") pod \"auto-csr-approver-29566530-vfnhc\" (UID: \"26312162-6c67-44f8-92d2-9a65b8bbb4d8\") " pod="openshift-infra/auto-csr-approver-29566530-vfnhc"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.438208 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92zh6\" (UniqueName: \"kubernetes.io/projected/cff34fd3-3552-416f-b2d8-059274d1667b-kube-api-access-92zh6\") pod
\"collect-profiles-29566530-w24sg\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.503711 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.517649 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566530-vfnhc"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.732639 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c"
Mar 20 07:30:00 crc kubenswrapper[4971]: E0320 07:30:00.733043 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.835037 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566530-vfnhc"]
Mar 20 07:30:00 crc kubenswrapper[4971]: I0320 07:30:00.983474 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"]
Mar 20 07:30:01 crc kubenswrapper[4971]: I0320 07:30:01.173093 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg" event={"ID":"cff34fd3-3552-416f-b2d8-059274d1667b","Type":"ContainerStarted","Data":"e28cbc5493795aef965e4ddd34d2b0a50afae362a9a0391238968bf4dbd07036"}
Mar 20 07:30:01 crc kubenswrapper[4971]:
I0320 07:30:01.173373 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg" event={"ID":"cff34fd3-3552-416f-b2d8-059274d1667b","Type":"ContainerStarted","Data":"13549fd656edd0c43953f3acf4cf0feddb3bb69534cf69f9b715b5e767897a1a"}
Mar 20 07:30:01 crc kubenswrapper[4971]: I0320 07:30:01.175438 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566530-vfnhc" event={"ID":"26312162-6c67-44f8-92d2-9a65b8bbb4d8","Type":"ContainerStarted","Data":"cd2abf5f90f888277a19fe4747c054863ad520ae208739134671129ec71f9f08"}
Mar 20 07:30:01 crc kubenswrapper[4971]: I0320 07:30:01.209727 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg" podStartSLOduration=1.209704389 podStartE2EDuration="1.209704389s" podCreationTimestamp="2026-03-20 07:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:30:01.19709315 +0000 UTC m=+2423.176967318" watchObservedRunningTime="2026-03-20 07:30:01.209704389 +0000 UTC m=+2423.189578547"
Mar 20 07:30:02 crc kubenswrapper[4971]: I0320 07:30:02.185265 4971 generic.go:334] "Generic (PLEG): container finished" podID="cff34fd3-3552-416f-b2d8-059274d1667b" containerID="e28cbc5493795aef965e4ddd34d2b0a50afae362a9a0391238968bf4dbd07036" exitCode=0
Mar 20 07:30:02 crc kubenswrapper[4971]: I0320 07:30:02.185460 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg" event={"ID":"cff34fd3-3552-416f-b2d8-059274d1667b","Type":"ContainerDied","Data":"e28cbc5493795aef965e4ddd34d2b0a50afae362a9a0391238968bf4dbd07036"}
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.197204 4971 generic.go:334] "Generic (PLEG): container finished" podID="26312162-6c67-44f8-92d2-9a65b8bbb4d8"
containerID="ddfa582730f08684af4b5fc818fcfc779d6ef1674b3ed611f43eb16de8369691" exitCode=0
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.197333 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566530-vfnhc" event={"ID":"26312162-6c67-44f8-92d2-9a65b8bbb4d8","Type":"ContainerDied","Data":"ddfa582730f08684af4b5fc818fcfc779d6ef1674b3ed611f43eb16de8369691"}
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.581935 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.660199 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92zh6\" (UniqueName: \"kubernetes.io/projected/cff34fd3-3552-416f-b2d8-059274d1667b-kube-api-access-92zh6\") pod \"cff34fd3-3552-416f-b2d8-059274d1667b\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") "
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.660469 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cff34fd3-3552-416f-b2d8-059274d1667b-config-volume\") pod \"cff34fd3-3552-416f-b2d8-059274d1667b\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") "
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.660716 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cff34fd3-3552-416f-b2d8-059274d1667b-secret-volume\") pod \"cff34fd3-3552-416f-b2d8-059274d1667b\" (UID: \"cff34fd3-3552-416f-b2d8-059274d1667b\") "
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.661398 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cff34fd3-3552-416f-b2d8-059274d1667b-config-volume" (OuterVolumeSpecName: "config-volume") pod "cff34fd3-3552-416f-b2d8-059274d1667b"
(UID: "cff34fd3-3552-416f-b2d8-059274d1667b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.661949 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cff34fd3-3552-416f-b2d8-059274d1667b-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.666323 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff34fd3-3552-416f-b2d8-059274d1667b-kube-api-access-92zh6" (OuterVolumeSpecName: "kube-api-access-92zh6") pod "cff34fd3-3552-416f-b2d8-059274d1667b" (UID: "cff34fd3-3552-416f-b2d8-059274d1667b"). InnerVolumeSpecName "kube-api-access-92zh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.666449 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff34fd3-3552-416f-b2d8-059274d1667b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cff34fd3-3552-416f-b2d8-059274d1667b" (UID: "cff34fd3-3552-416f-b2d8-059274d1667b"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.763668 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cff34fd3-3552-416f-b2d8-059274d1667b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:30:03 crc kubenswrapper[4971]: I0320 07:30:03.763721 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92zh6\" (UniqueName: \"kubernetes.io/projected/cff34fd3-3552-416f-b2d8-059274d1667b-kube-api-access-92zh6\") on node \"crc\" DevicePath \"\"" Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.210276 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg" event={"ID":"cff34fd3-3552-416f-b2d8-059274d1667b","Type":"ContainerDied","Data":"13549fd656edd0c43953f3acf4cf0feddb3bb69534cf69f9b715b5e767897a1a"} Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.210359 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13549fd656edd0c43953f3acf4cf0feddb3bb69534cf69f9b715b5e767897a1a" Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.210430 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg" Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.302196 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"] Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.309418 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566485-rnzpk"] Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.555527 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566530-vfnhc" Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.677229 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tshj\" (UniqueName: \"kubernetes.io/projected/26312162-6c67-44f8-92d2-9a65b8bbb4d8-kube-api-access-6tshj\") pod \"26312162-6c67-44f8-92d2-9a65b8bbb4d8\" (UID: \"26312162-6c67-44f8-92d2-9a65b8bbb4d8\") " Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.683009 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26312162-6c67-44f8-92d2-9a65b8bbb4d8-kube-api-access-6tshj" (OuterVolumeSpecName: "kube-api-access-6tshj") pod "26312162-6c67-44f8-92d2-9a65b8bbb4d8" (UID: "26312162-6c67-44f8-92d2-9a65b8bbb4d8"). InnerVolumeSpecName "kube-api-access-6tshj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.745943 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d2e412-0a4f-44b6-a326-269ac4921dae" path="/var/lib/kubelet/pods/b7d2e412-0a4f-44b6-a326-269ac4921dae/volumes" Mar 20 07:30:04 crc kubenswrapper[4971]: I0320 07:30:04.779166 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tshj\" (UniqueName: \"kubernetes.io/projected/26312162-6c67-44f8-92d2-9a65b8bbb4d8-kube-api-access-6tshj\") on node \"crc\" DevicePath \"\"" Mar 20 07:30:05 crc kubenswrapper[4971]: I0320 07:30:05.243094 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566530-vfnhc" event={"ID":"26312162-6c67-44f8-92d2-9a65b8bbb4d8","Type":"ContainerDied","Data":"cd2abf5f90f888277a19fe4747c054863ad520ae208739134671129ec71f9f08"} Mar 20 07:30:05 crc kubenswrapper[4971]: I0320 07:30:05.243132 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd2abf5f90f888277a19fe4747c054863ad520ae208739134671129ec71f9f08" Mar 20 07:30:05 
crc kubenswrapper[4971]: I0320 07:30:05.243215 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566530-vfnhc" Mar 20 07:30:05 crc kubenswrapper[4971]: I0320 07:30:05.626529 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566524-rkrcp"] Mar 20 07:30:05 crc kubenswrapper[4971]: I0320 07:30:05.637124 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566524-rkrcp"] Mar 20 07:30:06 crc kubenswrapper[4971]: I0320 07:30:06.747582 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e60a87d3-55d4-4bfa-9b11-4c105cf613cc" path="/var/lib/kubelet/pods/e60a87d3-55d4-4bfa-9b11-4c105cf613cc/volumes" Mar 20 07:30:12 crc kubenswrapper[4971]: I0320 07:30:12.732995 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:30:12 crc kubenswrapper[4971]: E0320 07:30:12.736750 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:30:26 crc kubenswrapper[4971]: I0320 07:30:26.732999 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:30:26 crc kubenswrapper[4971]: E0320 07:30:26.734079 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:30:37 crc kubenswrapper[4971]: I0320 07:30:37.732175 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:30:37 crc kubenswrapper[4971]: E0320 07:30:37.733245 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:30:50 crc kubenswrapper[4971]: I0320 07:30:50.732602 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:30:50 crc kubenswrapper[4971]: E0320 07:30:50.733562 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:30:54 crc kubenswrapper[4971]: I0320 07:30:54.396464 4971 scope.go:117] "RemoveContainer" containerID="034740c7fd2522f3622e7a92b4698c566929791c244c307efc89fc747e087833" Mar 20 07:30:54 crc kubenswrapper[4971]: I0320 07:30:54.443360 4971 scope.go:117] "RemoveContainer" containerID="204a3cd7c64dc1ccfc2b91d007f10f29b87d10258c56c2733f640c25ff176670" Mar 20 07:31:05 crc kubenswrapper[4971]: I0320 07:31:05.732989 4971 
scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:31:05 crc kubenswrapper[4971]: E0320 07:31:05.734385 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.044599 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dbv4j"] Mar 20 07:31:09 crc kubenswrapper[4971]: E0320 07:31:09.045154 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff34fd3-3552-416f-b2d8-059274d1667b" containerName="collect-profiles" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.045180 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff34fd3-3552-416f-b2d8-059274d1667b" containerName="collect-profiles" Mar 20 07:31:09 crc kubenswrapper[4971]: E0320 07:31:09.045232 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26312162-6c67-44f8-92d2-9a65b8bbb4d8" containerName="oc" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.045254 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="26312162-6c67-44f8-92d2-9a65b8bbb4d8" containerName="oc" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.045651 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff34fd3-3552-416f-b2d8-059274d1667b" containerName="collect-profiles" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.045683 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="26312162-6c67-44f8-92d2-9a65b8bbb4d8" containerName="oc" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 
07:31:09.047726 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.057816 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbv4j"] Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.172034 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzd22\" (UniqueName: \"kubernetes.io/projected/79fe8ebc-cfb7-4b00-929c-053c573f1f24-kube-api-access-wzd22\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.172094 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-catalog-content\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.172351 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-utilities\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.274414 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzd22\" (UniqueName: \"kubernetes.io/projected/79fe8ebc-cfb7-4b00-929c-053c573f1f24-kube-api-access-wzd22\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc 
kubenswrapper[4971]: I0320 07:31:09.274498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-catalog-content\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.274549 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-utilities\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.275100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-utilities\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.275212 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-catalog-content\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.297929 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzd22\" (UniqueName: \"kubernetes.io/projected/79fe8ebc-cfb7-4b00-929c-053c573f1f24-kube-api-access-wzd22\") pod \"community-operators-dbv4j\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.380164 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:09 crc kubenswrapper[4971]: I0320 07:31:09.917682 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbv4j"] Mar 20 07:31:10 crc kubenswrapper[4971]: I0320 07:31:10.862905 4971 generic.go:334] "Generic (PLEG): container finished" podID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerID="a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5" exitCode=0 Mar 20 07:31:10 crc kubenswrapper[4971]: I0320 07:31:10.863023 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbv4j" event={"ID":"79fe8ebc-cfb7-4b00-929c-053c573f1f24","Type":"ContainerDied","Data":"a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5"} Mar 20 07:31:10 crc kubenswrapper[4971]: I0320 07:31:10.866725 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbv4j" event={"ID":"79fe8ebc-cfb7-4b00-929c-053c573f1f24","Type":"ContainerStarted","Data":"5af73a2f7a8722961da40d5c1a2f2934aebd0f926c3894fa890a5a9b3d6a3d42"} Mar 20 07:31:11 crc kubenswrapper[4971]: I0320 07:31:11.882817 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbv4j" event={"ID":"79fe8ebc-cfb7-4b00-929c-053c573f1f24","Type":"ContainerStarted","Data":"ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97"} Mar 20 07:31:12 crc kubenswrapper[4971]: I0320 07:31:12.898203 4971 generic.go:334] "Generic (PLEG): container finished" podID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerID="ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97" exitCode=0 Mar 20 07:31:12 crc kubenswrapper[4971]: I0320 07:31:12.898503 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbv4j" 
event={"ID":"79fe8ebc-cfb7-4b00-929c-053c573f1f24","Type":"ContainerDied","Data":"ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97"} Mar 20 07:31:13 crc kubenswrapper[4971]: I0320 07:31:13.912231 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbv4j" event={"ID":"79fe8ebc-cfb7-4b00-929c-053c573f1f24","Type":"ContainerStarted","Data":"949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669"} Mar 20 07:31:13 crc kubenswrapper[4971]: I0320 07:31:13.940799 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dbv4j" podStartSLOduration=2.401717787 podStartE2EDuration="4.940767434s" podCreationTimestamp="2026-03-20 07:31:09 +0000 UTC" firstStartedPulling="2026-03-20 07:31:10.866545012 +0000 UTC m=+2492.846419160" lastFinishedPulling="2026-03-20 07:31:13.405594629 +0000 UTC m=+2495.385468807" observedRunningTime="2026-03-20 07:31:13.934831839 +0000 UTC m=+2495.914705987" watchObservedRunningTime="2026-03-20 07:31:13.940767434 +0000 UTC m=+2495.920641562" Mar 20 07:31:17 crc kubenswrapper[4971]: I0320 07:31:17.733964 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:31:17 crc kubenswrapper[4971]: E0320 07:31:17.734542 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:31:19 crc kubenswrapper[4971]: I0320 07:31:19.381249 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:19 crc 
kubenswrapper[4971]: I0320 07:31:19.381346 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:19 crc kubenswrapper[4971]: I0320 07:31:19.464164 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:20 crc kubenswrapper[4971]: I0320 07:31:20.047832 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:20 crc kubenswrapper[4971]: I0320 07:31:20.120118 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbv4j"] Mar 20 07:31:21 crc kubenswrapper[4971]: I0320 07:31:21.979751 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dbv4j" podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerName="registry-server" containerID="cri-o://949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669" gracePeriod=2 Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.528808 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.659937 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-catalog-content\") pod \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.660065 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzd22\" (UniqueName: \"kubernetes.io/projected/79fe8ebc-cfb7-4b00-929c-053c573f1f24-kube-api-access-wzd22\") pod \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.660099 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-utilities\") pod \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\" (UID: \"79fe8ebc-cfb7-4b00-929c-053c573f1f24\") " Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.661084 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-utilities" (OuterVolumeSpecName: "utilities") pod "79fe8ebc-cfb7-4b00-929c-053c573f1f24" (UID: "79fe8ebc-cfb7-4b00-929c-053c573f1f24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.667764 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fe8ebc-cfb7-4b00-929c-053c573f1f24-kube-api-access-wzd22" (OuterVolumeSpecName: "kube-api-access-wzd22") pod "79fe8ebc-cfb7-4b00-929c-053c573f1f24" (UID: "79fe8ebc-cfb7-4b00-929c-053c573f1f24"). InnerVolumeSpecName "kube-api-access-wzd22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.722400 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79fe8ebc-cfb7-4b00-929c-053c573f1f24" (UID: "79fe8ebc-cfb7-4b00-929c-053c573f1f24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.761265 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzd22\" (UniqueName: \"kubernetes.io/projected/79fe8ebc-cfb7-4b00-929c-053c573f1f24-kube-api-access-wzd22\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.761306 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.761317 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fe8ebc-cfb7-4b00-929c-053c573f1f24-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.991401 4971 generic.go:334] "Generic (PLEG): container finished" podID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerID="949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669" exitCode=0 Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.991459 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbv4j" event={"ID":"79fe8ebc-cfb7-4b00-929c-053c573f1f24","Type":"ContainerDied","Data":"949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669"} Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.991543 4971 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbv4j" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.991590 4971 scope.go:117] "RemoveContainer" containerID="949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669" Mar 20 07:31:22 crc kubenswrapper[4971]: I0320 07:31:22.991565 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbv4j" event={"ID":"79fe8ebc-cfb7-4b00-929c-053c573f1f24","Type":"ContainerDied","Data":"5af73a2f7a8722961da40d5c1a2f2934aebd0f926c3894fa890a5a9b3d6a3d42"} Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.025632 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbv4j"] Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.027346 4971 scope.go:117] "RemoveContainer" containerID="ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97" Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.033007 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dbv4j"] Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.065920 4971 scope.go:117] "RemoveContainer" containerID="a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5" Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.089858 4971 scope.go:117] "RemoveContainer" containerID="949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669" Mar 20 07:31:23 crc kubenswrapper[4971]: E0320 07:31:23.090363 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669\": container with ID starting with 949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669 not found: ID does not exist" containerID="949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669" Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.090417 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669"} err="failed to get container status \"949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669\": rpc error: code = NotFound desc = could not find container \"949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669\": container with ID starting with 949bcc49d52e1f07866751ac0349598f6350eb5b45402cd953da2a65b5eca669 not found: ID does not exist" Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.090443 4971 scope.go:117] "RemoveContainer" containerID="ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97" Mar 20 07:31:23 crc kubenswrapper[4971]: E0320 07:31:23.090804 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97\": container with ID starting with ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97 not found: ID does not exist" containerID="ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97" Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.090828 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97"} err="failed to get container status \"ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97\": rpc error: code = NotFound desc = could not find container \"ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97\": container with ID starting with ca684ba92d412c346d5c5ba3e3eaf9ce721689d1198ac4ccd9dcc56b1935ea97 not found: ID does not exist" Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.090863 4971 scope.go:117] "RemoveContainer" containerID="a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5" Mar 20 07:31:23 crc kubenswrapper[4971]: E0320 
07:31:23.091188 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5\": container with ID starting with a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5 not found: ID does not exist" containerID="a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5" Mar 20 07:31:23 crc kubenswrapper[4971]: I0320 07:31:23.091224 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5"} err="failed to get container status \"a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5\": rpc error: code = NotFound desc = could not find container \"a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5\": container with ID starting with a25baede573d9d8e6fcb15e9f5fff5c55b690c607d027ca4f7545fa02ea07bd5 not found: ID does not exist" Mar 20 07:31:24 crc kubenswrapper[4971]: I0320 07:31:24.747387 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" path="/var/lib/kubelet/pods/79fe8ebc-cfb7-4b00-929c-053c573f1f24/volumes" Mar 20 07:31:27 crc kubenswrapper[4971]: I0320 07:31:27.992980 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dx5tq"] Mar 20 07:31:27 crc kubenswrapper[4971]: E0320 07:31:27.996036 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerName="registry-server" Mar 20 07:31:27 crc kubenswrapper[4971]: I0320 07:31:27.996089 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerName="registry-server" Mar 20 07:31:27 crc kubenswrapper[4971]: E0320 07:31:27.996129 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerName="extract-content" Mar 20 07:31:27 crc kubenswrapper[4971]: I0320 07:31:27.996146 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerName="extract-content" Mar 20 07:31:27 crc kubenswrapper[4971]: E0320 07:31:27.996170 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerName="extract-utilities" Mar 20 07:31:27 crc kubenswrapper[4971]: I0320 07:31:27.996186 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerName="extract-utilities" Mar 20 07:31:27 crc kubenswrapper[4971]: I0320 07:31:27.996500 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fe8ebc-cfb7-4b00-929c-053c573f1f24" containerName="registry-server" Mar 20 07:31:27 crc kubenswrapper[4971]: I0320 07:31:27.998885 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.028477 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dx5tq"] Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.142714 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-catalog-content\") pod \"certified-operators-dx5tq\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.142795 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-utilities\") pod \"certified-operators-dx5tq\" (UID: 
\"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.142864 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlchx\" (UniqueName: \"kubernetes.io/projected/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-kube-api-access-zlchx\") pod \"certified-operators-dx5tq\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.244496 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-catalog-content\") pod \"certified-operators-dx5tq\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.244545 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-utilities\") pod \"certified-operators-dx5tq\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.244585 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlchx\" (UniqueName: \"kubernetes.io/projected/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-kube-api-access-zlchx\") pod \"certified-operators-dx5tq\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.245050 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-catalog-content\") pod 
\"certified-operators-dx5tq\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.245149 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-utilities\") pod \"certified-operators-dx5tq\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.267238 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlchx\" (UniqueName: \"kubernetes.io/projected/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-kube-api-access-zlchx\") pod \"certified-operators-dx5tq\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.333143 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:28 crc kubenswrapper[4971]: I0320 07:31:28.564812 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dx5tq"] Mar 20 07:31:29 crc kubenswrapper[4971]: I0320 07:31:29.042435 4971 generic.go:334] "Generic (PLEG): container finished" podID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerID="3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18" exitCode=0 Mar 20 07:31:29 crc kubenswrapper[4971]: I0320 07:31:29.042475 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx5tq" event={"ID":"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b","Type":"ContainerDied","Data":"3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18"} Mar 20 07:31:29 crc kubenswrapper[4971]: I0320 07:31:29.042498 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx5tq" event={"ID":"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b","Type":"ContainerStarted","Data":"5aac517347ff7fb55a0d0fbd981a61a5a7ecc2f5244f9378a08cd5cdcb37fc30"} Mar 20 07:31:30 crc kubenswrapper[4971]: I0320 07:31:30.056922 4971 generic.go:334] "Generic (PLEG): container finished" podID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerID="9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165" exitCode=0 Mar 20 07:31:30 crc kubenswrapper[4971]: I0320 07:31:30.056973 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx5tq" event={"ID":"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b","Type":"ContainerDied","Data":"9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165"} Mar 20 07:31:31 crc kubenswrapper[4971]: I0320 07:31:31.066443 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx5tq" 
event={"ID":"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b","Type":"ContainerStarted","Data":"de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c"} Mar 20 07:31:31 crc kubenswrapper[4971]: I0320 07:31:31.092941 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dx5tq" podStartSLOduration=2.652841039 podStartE2EDuration="4.09292439s" podCreationTimestamp="2026-03-20 07:31:27 +0000 UTC" firstStartedPulling="2026-03-20 07:31:29.043742697 +0000 UTC m=+2511.023616835" lastFinishedPulling="2026-03-20 07:31:30.483826028 +0000 UTC m=+2512.463700186" observedRunningTime="2026-03-20 07:31:31.089800059 +0000 UTC m=+2513.069674197" watchObservedRunningTime="2026-03-20 07:31:31.09292439 +0000 UTC m=+2513.072798538" Mar 20 07:31:32 crc kubenswrapper[4971]: I0320 07:31:32.732907 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:31:32 crc kubenswrapper[4971]: E0320 07:31:32.733455 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:31:38 crc kubenswrapper[4971]: I0320 07:31:38.334232 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:38 crc kubenswrapper[4971]: I0320 07:31:38.334654 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:38 crc kubenswrapper[4971]: I0320 07:31:38.424070 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:39 crc kubenswrapper[4971]: I0320 07:31:39.190698 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:39 crc kubenswrapper[4971]: I0320 07:31:39.687692 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dx5tq"] Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.156742 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dx5tq" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerName="registry-server" containerID="cri-o://de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c" gracePeriod=2 Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.580268 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.751488 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-catalog-content\") pod \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.752015 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-utilities\") pod \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.752098 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlchx\" (UniqueName: \"kubernetes.io/projected/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-kube-api-access-zlchx\") pod 
\"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\" (UID: \"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b\") " Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.753292 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-utilities" (OuterVolumeSpecName: "utilities") pod "9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" (UID: "9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.762786 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-kube-api-access-zlchx" (OuterVolumeSpecName: "kube-api-access-zlchx") pod "9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" (UID: "9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b"). InnerVolumeSpecName "kube-api-access-zlchx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.803430 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" (UID: "9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.853721 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.853755 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:41 crc kubenswrapper[4971]: I0320 07:31:41.853770 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlchx\" (UniqueName: \"kubernetes.io/projected/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b-kube-api-access-zlchx\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.166822 4971 generic.go:334] "Generic (PLEG): container finished" podID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerID="de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c" exitCode=0 Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.166883 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx5tq" event={"ID":"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b","Type":"ContainerDied","Data":"de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c"} Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.166921 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dx5tq" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.166951 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx5tq" event={"ID":"9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b","Type":"ContainerDied","Data":"5aac517347ff7fb55a0d0fbd981a61a5a7ecc2f5244f9378a08cd5cdcb37fc30"} Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.166986 4971 scope.go:117] "RemoveContainer" containerID="de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.191656 4971 scope.go:117] "RemoveContainer" containerID="9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.206019 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dx5tq"] Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.212571 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dx5tq"] Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.227400 4971 scope.go:117] "RemoveContainer" containerID="3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.265054 4971 scope.go:117] "RemoveContainer" containerID="de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c" Mar 20 07:31:42 crc kubenswrapper[4971]: E0320 07:31:42.265698 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c\": container with ID starting with de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c not found: ID does not exist" containerID="de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.265802 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c"} err="failed to get container status \"de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c\": rpc error: code = NotFound desc = could not find container \"de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c\": container with ID starting with de9d0f7a6f48618b8f25becee46e29b78dd414ab44adcb5e8a99c7d59e45db1c not found: ID does not exist" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.265879 4971 scope.go:117] "RemoveContainer" containerID="9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165" Mar 20 07:31:42 crc kubenswrapper[4971]: E0320 07:31:42.266274 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165\": container with ID starting with 9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165 not found: ID does not exist" containerID="9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.266364 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165"} err="failed to get container status \"9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165\": rpc error: code = NotFound desc = could not find container \"9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165\": container with ID starting with 9f0f6c8cce5129a7ca399103c619e48197c9bb6dd13cfb842ca24d6f59abf165 not found: ID does not exist" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.266450 4971 scope.go:117] "RemoveContainer" containerID="3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18" Mar 20 07:31:42 crc kubenswrapper[4971]: E0320 
07:31:42.266958 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18\": container with ID starting with 3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18 not found: ID does not exist" containerID="3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.267032 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18"} err="failed to get container status \"3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18\": rpc error: code = NotFound desc = could not find container \"3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18\": container with ID starting with 3cbe20896f37667b4a6cfa623bafe788244dfd660747f90df275c77c4798dd18 not found: ID does not exist" Mar 20 07:31:42 crc kubenswrapper[4971]: I0320 07:31:42.746931 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" path="/var/lib/kubelet/pods/9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b/volumes" Mar 20 07:31:44 crc kubenswrapper[4971]: I0320 07:31:44.733034 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:31:44 crc kubenswrapper[4971]: E0320 07:31:44.735427 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.056959 
4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v8922"] Mar 20 07:31:54 crc kubenswrapper[4971]: E0320 07:31:54.057890 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerName="extract-content" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.057921 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerName="extract-content" Mar 20 07:31:54 crc kubenswrapper[4971]: E0320 07:31:54.057945 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerName="extract-utilities" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.057960 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerName="extract-utilities" Mar 20 07:31:54 crc kubenswrapper[4971]: E0320 07:31:54.058008 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerName="registry-server" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.058025 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerName="registry-server" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.058320 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ece6dd3-0cf8-4566-9ab7-14e3cb5a581b" containerName="registry-server" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.060701 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.078569 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8922"] Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.079279 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-catalog-content\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.079344 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xf7\" (UniqueName: \"kubernetes.io/projected/def96ac9-249a-4de1-8609-7b98bb4ca8b1-kube-api-access-h5xf7\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.079403 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-utilities\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.180671 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xf7\" (UniqueName: \"kubernetes.io/projected/def96ac9-249a-4de1-8609-7b98bb4ca8b1-kube-api-access-h5xf7\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.181026 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-utilities\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.181240 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-catalog-content\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.181596 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-utilities\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.181888 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-catalog-content\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.206711 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xf7\" (UniqueName: \"kubernetes.io/projected/def96ac9-249a-4de1-8609-7b98bb4ca8b1-kube-api-access-h5xf7\") pod \"redhat-operators-v8922\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.390945 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:31:54 crc kubenswrapper[4971]: I0320 07:31:54.860922 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8922"] Mar 20 07:31:55 crc kubenswrapper[4971]: I0320 07:31:55.317100 4971 generic.go:334] "Generic (PLEG): container finished" podID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerID="10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa" exitCode=0 Mar 20 07:31:55 crc kubenswrapper[4971]: I0320 07:31:55.317173 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8922" event={"ID":"def96ac9-249a-4de1-8609-7b98bb4ca8b1","Type":"ContainerDied","Data":"10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa"} Mar 20 07:31:55 crc kubenswrapper[4971]: I0320 07:31:55.317376 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8922" event={"ID":"def96ac9-249a-4de1-8609-7b98bb4ca8b1","Type":"ContainerStarted","Data":"178674e47ab73ca865c25b6ef5940cb46afa539a00c566fbef61f713ee32ce5b"} Mar 20 07:31:56 crc kubenswrapper[4971]: I0320 07:31:56.732990 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:31:56 crc kubenswrapper[4971]: E0320 07:31:56.733580 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:31:57 crc kubenswrapper[4971]: I0320 07:31:57.335046 4971 generic.go:334] "Generic (PLEG): container finished" podID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" 
containerID="84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315" exitCode=0 Mar 20 07:31:57 crc kubenswrapper[4971]: I0320 07:31:57.335134 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8922" event={"ID":"def96ac9-249a-4de1-8609-7b98bb4ca8b1","Type":"ContainerDied","Data":"84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315"} Mar 20 07:31:58 crc kubenswrapper[4971]: I0320 07:31:58.345119 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8922" event={"ID":"def96ac9-249a-4de1-8609-7b98bb4ca8b1","Type":"ContainerStarted","Data":"514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6"} Mar 20 07:31:58 crc kubenswrapper[4971]: I0320 07:31:58.373260 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v8922" podStartSLOduration=1.955263126 podStartE2EDuration="4.373236515s" podCreationTimestamp="2026-03-20 07:31:54 +0000 UTC" firstStartedPulling="2026-03-20 07:31:55.318946063 +0000 UTC m=+2537.298820201" lastFinishedPulling="2026-03-20 07:31:57.736919442 +0000 UTC m=+2539.716793590" observedRunningTime="2026-03-20 07:31:58.364146618 +0000 UTC m=+2540.344020796" watchObservedRunningTime="2026-03-20 07:31:58.373236515 +0000 UTC m=+2540.353110663" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.146492 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566532-bgsgz"] Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.147465 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566532-bgsgz" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.149898 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.150485 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.150641 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.159644 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566532-bgsgz"] Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.280875 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfd9m\" (UniqueName: \"kubernetes.io/projected/dfec0fdd-e1d3-467a-9288-17fd98bb57bf-kube-api-access-kfd9m\") pod \"auto-csr-approver-29566532-bgsgz\" (UID: \"dfec0fdd-e1d3-467a-9288-17fd98bb57bf\") " pod="openshift-infra/auto-csr-approver-29566532-bgsgz" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.382706 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfd9m\" (UniqueName: \"kubernetes.io/projected/dfec0fdd-e1d3-467a-9288-17fd98bb57bf-kube-api-access-kfd9m\") pod \"auto-csr-approver-29566532-bgsgz\" (UID: \"dfec0fdd-e1d3-467a-9288-17fd98bb57bf\") " pod="openshift-infra/auto-csr-approver-29566532-bgsgz" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.409547 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfd9m\" (UniqueName: \"kubernetes.io/projected/dfec0fdd-e1d3-467a-9288-17fd98bb57bf-kube-api-access-kfd9m\") pod \"auto-csr-approver-29566532-bgsgz\" (UID: \"dfec0fdd-e1d3-467a-9288-17fd98bb57bf\") " 
pod="openshift-infra/auto-csr-approver-29566532-bgsgz" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.463532 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566532-bgsgz" Mar 20 07:32:00 crc kubenswrapper[4971]: I0320 07:32:00.966522 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566532-bgsgz"] Mar 20 07:32:01 crc kubenswrapper[4971]: I0320 07:32:01.373588 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566532-bgsgz" event={"ID":"dfec0fdd-e1d3-467a-9288-17fd98bb57bf","Type":"ContainerStarted","Data":"2d1621fd0f8ab0580a3a7990ce774576df1fcfe7c0c6e42a68180e5bf3b9e74c"} Mar 20 07:32:02 crc kubenswrapper[4971]: I0320 07:32:02.382523 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566532-bgsgz" event={"ID":"dfec0fdd-e1d3-467a-9288-17fd98bb57bf","Type":"ContainerStarted","Data":"bb1939835f6bb5056120f2dc0ddad5a6b66d9bd46a2e39b3a669f97e55507f28"} Mar 20 07:32:03 crc kubenswrapper[4971]: I0320 07:32:03.393017 4971 generic.go:334] "Generic (PLEG): container finished" podID="dfec0fdd-e1d3-467a-9288-17fd98bb57bf" containerID="bb1939835f6bb5056120f2dc0ddad5a6b66d9bd46a2e39b3a669f97e55507f28" exitCode=0 Mar 20 07:32:03 crc kubenswrapper[4971]: I0320 07:32:03.393768 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566532-bgsgz" event={"ID":"dfec0fdd-e1d3-467a-9288-17fd98bb57bf","Type":"ContainerDied","Data":"bb1939835f6bb5056120f2dc0ddad5a6b66d9bd46a2e39b3a669f97e55507f28"} Mar 20 07:32:04 crc kubenswrapper[4971]: I0320 07:32:04.391801 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:32:04 crc kubenswrapper[4971]: I0320 07:32:04.392126 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:32:04 crc kubenswrapper[4971]: I0320 07:32:04.698257 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566532-bgsgz" Mar 20 07:32:04 crc kubenswrapper[4971]: I0320 07:32:04.790407 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfd9m\" (UniqueName: \"kubernetes.io/projected/dfec0fdd-e1d3-467a-9288-17fd98bb57bf-kube-api-access-kfd9m\") pod \"dfec0fdd-e1d3-467a-9288-17fd98bb57bf\" (UID: \"dfec0fdd-e1d3-467a-9288-17fd98bb57bf\") " Mar 20 07:32:04 crc kubenswrapper[4971]: I0320 07:32:04.796816 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfec0fdd-e1d3-467a-9288-17fd98bb57bf-kube-api-access-kfd9m" (OuterVolumeSpecName: "kube-api-access-kfd9m") pod "dfec0fdd-e1d3-467a-9288-17fd98bb57bf" (UID: "dfec0fdd-e1d3-467a-9288-17fd98bb57bf"). InnerVolumeSpecName "kube-api-access-kfd9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:04 crc kubenswrapper[4971]: I0320 07:32:04.892164 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfd9m\" (UniqueName: \"kubernetes.io/projected/dfec0fdd-e1d3-467a-9288-17fd98bb57bf-kube-api-access-kfd9m\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:05 crc kubenswrapper[4971]: I0320 07:32:05.414854 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566532-bgsgz" event={"ID":"dfec0fdd-e1d3-467a-9288-17fd98bb57bf","Type":"ContainerDied","Data":"2d1621fd0f8ab0580a3a7990ce774576df1fcfe7c0c6e42a68180e5bf3b9e74c"} Mar 20 07:32:05 crc kubenswrapper[4971]: I0320 07:32:05.414889 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d1621fd0f8ab0580a3a7990ce774576df1fcfe7c0c6e42a68180e5bf3b9e74c" Mar 20 07:32:05 crc kubenswrapper[4971]: I0320 07:32:05.414962 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566532-bgsgz" Mar 20 07:32:05 crc kubenswrapper[4971]: I0320 07:32:05.457528 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8922" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="registry-server" probeResult="failure" output=< Mar 20 07:32:05 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 07:32:05 crc kubenswrapper[4971]: > Mar 20 07:32:05 crc kubenswrapper[4971]: I0320 07:32:05.475783 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566526-sh7kc"] Mar 20 07:32:05 crc kubenswrapper[4971]: I0320 07:32:05.481231 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566526-sh7kc"] Mar 20 07:32:06 crc kubenswrapper[4971]: I0320 07:32:06.746412 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0ee7182c-6612-4722-903b-68fb1657e563" path="/var/lib/kubelet/pods/0ee7182c-6612-4722-903b-68fb1657e563/volumes" Mar 20 07:32:09 crc kubenswrapper[4971]: I0320 07:32:09.732128 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:32:09 crc kubenswrapper[4971]: E0320 07:32:09.732821 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:32:14 crc kubenswrapper[4971]: I0320 07:32:14.445496 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:32:14 crc kubenswrapper[4971]: I0320 07:32:14.502291 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:32:14 crc kubenswrapper[4971]: I0320 07:32:14.705914 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8922"] Mar 20 07:32:15 crc kubenswrapper[4971]: I0320 07:32:15.496669 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v8922" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="registry-server" containerID="cri-o://514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6" gracePeriod=2 Mar 20 07:32:15 crc kubenswrapper[4971]: I0320 07:32:15.951977 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.052597 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-utilities\") pod \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.052698 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5xf7\" (UniqueName: \"kubernetes.io/projected/def96ac9-249a-4de1-8609-7b98bb4ca8b1-kube-api-access-h5xf7\") pod \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.052776 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-catalog-content\") pod \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\" (UID: \"def96ac9-249a-4de1-8609-7b98bb4ca8b1\") " Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.055466 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-utilities" (OuterVolumeSpecName: "utilities") pod "def96ac9-249a-4de1-8609-7b98bb4ca8b1" (UID: "def96ac9-249a-4de1-8609-7b98bb4ca8b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.059389 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def96ac9-249a-4de1-8609-7b98bb4ca8b1-kube-api-access-h5xf7" (OuterVolumeSpecName: "kube-api-access-h5xf7") pod "def96ac9-249a-4de1-8609-7b98bb4ca8b1" (UID: "def96ac9-249a-4de1-8609-7b98bb4ca8b1"). InnerVolumeSpecName "kube-api-access-h5xf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.154134 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.154172 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5xf7\" (UniqueName: \"kubernetes.io/projected/def96ac9-249a-4de1-8609-7b98bb4ca8b1-kube-api-access-h5xf7\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.237233 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "def96ac9-249a-4de1-8609-7b98bb4ca8b1" (UID: "def96ac9-249a-4de1-8609-7b98bb4ca8b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.256296 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def96ac9-249a-4de1-8609-7b98bb4ca8b1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.510908 4971 generic.go:334] "Generic (PLEG): container finished" podID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerID="514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6" exitCode=0 Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.510969 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8922" event={"ID":"def96ac9-249a-4de1-8609-7b98bb4ca8b1","Type":"ContainerDied","Data":"514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6"} Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.511022 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-v8922" event={"ID":"def96ac9-249a-4de1-8609-7b98bb4ca8b1","Type":"ContainerDied","Data":"178674e47ab73ca865c25b6ef5940cb46afa539a00c566fbef61f713ee32ce5b"} Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.511056 4971 scope.go:117] "RemoveContainer" containerID="514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.511069 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8922" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.552199 4971 scope.go:117] "RemoveContainer" containerID="84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.568275 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8922"] Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.578346 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v8922"] Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.608148 4971 scope.go:117] "RemoveContainer" containerID="10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.629729 4971 scope.go:117] "RemoveContainer" containerID="514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6" Mar 20 07:32:16 crc kubenswrapper[4971]: E0320 07:32:16.630245 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6\": container with ID starting with 514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6 not found: ID does not exist" containerID="514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.630276 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6"} err="failed to get container status \"514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6\": rpc error: code = NotFound desc = could not find container \"514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6\": container with ID starting with 514cd2ce91d94b3f9cd72bb086087695e1ec59821c45f3f763dcbd0d799cf4c6 not found: ID does not exist" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.630295 4971 scope.go:117] "RemoveContainer" containerID="84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315" Mar 20 07:32:16 crc kubenswrapper[4971]: E0320 07:32:16.631112 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315\": container with ID starting with 84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315 not found: ID does not exist" containerID="84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.631174 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315"} err="failed to get container status \"84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315\": rpc error: code = NotFound desc = could not find container \"84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315\": container with ID starting with 84416819e76a9ff018407610af0112cbe8c7d505df1048a530652694fa895315 not found: ID does not exist" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.631213 4971 scope.go:117] "RemoveContainer" containerID="10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa" Mar 20 07:32:16 crc kubenswrapper[4971]: E0320 
07:32:16.631688 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa\": container with ID starting with 10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa not found: ID does not exist" containerID="10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.631865 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa"} err="failed to get container status \"10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa\": rpc error: code = NotFound desc = could not find container \"10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa\": container with ID starting with 10aa7d19dff41d6c098d50aa2297d3c27a00465f328a9995a3a1776b0e64a8aa not found: ID does not exist" Mar 20 07:32:16 crc kubenswrapper[4971]: I0320 07:32:16.742675 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" path="/var/lib/kubelet/pods/def96ac9-249a-4de1-8609-7b98bb4ca8b1/volumes" Mar 20 07:32:20 crc kubenswrapper[4971]: I0320 07:32:20.732542 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:32:21 crc kubenswrapper[4971]: I0320 07:32:21.556852 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"895dfb7e469b10223e69c2b2adeacdd2952b5bb050f2773ef438370babd44bba"} Mar 20 07:32:54 crc kubenswrapper[4971]: I0320 07:32:54.612220 4971 scope.go:117] "RemoveContainer" containerID="d258479291395756cd521e2ed47075d6a91361e0022659674da12eac788898ae" Mar 20 07:34:00 crc kubenswrapper[4971]: 
I0320 07:34:00.167528 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566534-nkrw8"] Mar 20 07:34:00 crc kubenswrapper[4971]: E0320 07:34:00.169922 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfec0fdd-e1d3-467a-9288-17fd98bb57bf" containerName="oc" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.169962 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfec0fdd-e1d3-467a-9288-17fd98bb57bf" containerName="oc" Mar 20 07:34:00 crc kubenswrapper[4971]: E0320 07:34:00.169979 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="extract-content" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.169990 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="extract-content" Mar 20 07:34:00 crc kubenswrapper[4971]: E0320 07:34:00.170072 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="extract-utilities" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.170092 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="extract-utilities" Mar 20 07:34:00 crc kubenswrapper[4971]: E0320 07:34:00.170106 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="registry-server" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.170116 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="registry-server" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.170435 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="def96ac9-249a-4de1-8609-7b98bb4ca8b1" containerName="registry-server" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.170457 4971 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="dfec0fdd-e1d3-467a-9288-17fd98bb57bf" containerName="oc" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.171588 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.174954 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.175153 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.175654 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.175907 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566534-nkrw8"] Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.257351 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcnxl\" (UniqueName: \"kubernetes.io/projected/6610a358-42ec-41ec-af3b-87ae50dbe56c-kube-api-access-gcnxl\") pod \"auto-csr-approver-29566534-nkrw8\" (UID: \"6610a358-42ec-41ec-af3b-87ae50dbe56c\") " pod="openshift-infra/auto-csr-approver-29566534-nkrw8" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.358801 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcnxl\" (UniqueName: \"kubernetes.io/projected/6610a358-42ec-41ec-af3b-87ae50dbe56c-kube-api-access-gcnxl\") pod \"auto-csr-approver-29566534-nkrw8\" (UID: \"6610a358-42ec-41ec-af3b-87ae50dbe56c\") " pod="openshift-infra/auto-csr-approver-29566534-nkrw8" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.377569 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gcnxl\" (UniqueName: \"kubernetes.io/projected/6610a358-42ec-41ec-af3b-87ae50dbe56c-kube-api-access-gcnxl\") pod \"auto-csr-approver-29566534-nkrw8\" (UID: \"6610a358-42ec-41ec-af3b-87ae50dbe56c\") " pod="openshift-infra/auto-csr-approver-29566534-nkrw8" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.497386 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.794141 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566534-nkrw8"] Mar 20 07:34:00 crc kubenswrapper[4971]: I0320 07:34:00.817262 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:34:01 crc kubenswrapper[4971]: I0320 07:34:01.480167 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" event={"ID":"6610a358-42ec-41ec-af3b-87ae50dbe56c","Type":"ContainerStarted","Data":"ecc30b51fd9a3feb822e2af1625c1e8e2b921ddc213e9e17b3b16aa85a2aaf14"} Mar 20 07:34:02 crc kubenswrapper[4971]: I0320 07:34:02.488927 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" event={"ID":"6610a358-42ec-41ec-af3b-87ae50dbe56c","Type":"ContainerStarted","Data":"60fab7dd49c29dea4bfa214904e08186d43eda2d4648af652a9367ccc044088a"} Mar 20 07:34:02 crc kubenswrapper[4971]: I0320 07:34:02.502359 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" podStartSLOduration=1.349642042 podStartE2EDuration="2.502338579s" podCreationTimestamp="2026-03-20 07:34:00 +0000 UTC" firstStartedPulling="2026-03-20 07:34:00.816998133 +0000 UTC m=+2662.796872271" lastFinishedPulling="2026-03-20 07:34:01.96969466 +0000 UTC m=+2663.949568808" observedRunningTime="2026-03-20 07:34:02.501726053 +0000 UTC 
m=+2664.481600201" watchObservedRunningTime="2026-03-20 07:34:02.502338579 +0000 UTC m=+2664.482212717" Mar 20 07:34:03 crc kubenswrapper[4971]: I0320 07:34:03.501669 4971 generic.go:334] "Generic (PLEG): container finished" podID="6610a358-42ec-41ec-af3b-87ae50dbe56c" containerID="60fab7dd49c29dea4bfa214904e08186d43eda2d4648af652a9367ccc044088a" exitCode=0 Mar 20 07:34:03 crc kubenswrapper[4971]: I0320 07:34:03.501713 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" event={"ID":"6610a358-42ec-41ec-af3b-87ae50dbe56c","Type":"ContainerDied","Data":"60fab7dd49c29dea4bfa214904e08186d43eda2d4648af652a9367ccc044088a"} Mar 20 07:34:04 crc kubenswrapper[4971]: I0320 07:34:04.914320 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" Mar 20 07:34:04 crc kubenswrapper[4971]: I0320 07:34:04.925783 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcnxl\" (UniqueName: \"kubernetes.io/projected/6610a358-42ec-41ec-af3b-87ae50dbe56c-kube-api-access-gcnxl\") pod \"6610a358-42ec-41ec-af3b-87ae50dbe56c\" (UID: \"6610a358-42ec-41ec-af3b-87ae50dbe56c\") " Mar 20 07:34:04 crc kubenswrapper[4971]: I0320 07:34:04.933337 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6610a358-42ec-41ec-af3b-87ae50dbe56c-kube-api-access-gcnxl" (OuterVolumeSpecName: "kube-api-access-gcnxl") pod "6610a358-42ec-41ec-af3b-87ae50dbe56c" (UID: "6610a358-42ec-41ec-af3b-87ae50dbe56c"). InnerVolumeSpecName "kube-api-access-gcnxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:34:05 crc kubenswrapper[4971]: I0320 07:34:05.027213 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcnxl\" (UniqueName: \"kubernetes.io/projected/6610a358-42ec-41ec-af3b-87ae50dbe56c-kube-api-access-gcnxl\") on node \"crc\" DevicePath \"\"" Mar 20 07:34:05 crc kubenswrapper[4971]: I0320 07:34:05.522410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" event={"ID":"6610a358-42ec-41ec-af3b-87ae50dbe56c","Type":"ContainerDied","Data":"ecc30b51fd9a3feb822e2af1625c1e8e2b921ddc213e9e17b3b16aa85a2aaf14"} Mar 20 07:34:05 crc kubenswrapper[4971]: I0320 07:34:05.522974 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc30b51fd9a3feb822e2af1625c1e8e2b921ddc213e9e17b3b16aa85a2aaf14" Mar 20 07:34:05 crc kubenswrapper[4971]: I0320 07:34:05.522521 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566534-nkrw8" Mar 20 07:34:05 crc kubenswrapper[4971]: I0320 07:34:05.607271 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566528-67gds"] Mar 20 07:34:05 crc kubenswrapper[4971]: I0320 07:34:05.616538 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566528-67gds"] Mar 20 07:34:06 crc kubenswrapper[4971]: I0320 07:34:06.748388 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19457825-8ae2-40f8-b5b3-b939be1cbbfa" path="/var/lib/kubelet/pods/19457825-8ae2-40f8-b5b3-b939be1cbbfa/volumes" Mar 20 07:34:20 crc kubenswrapper[4971]: I0320 07:34:20.162004 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 07:34:20 crc kubenswrapper[4971]: I0320 07:34:20.162746 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:34:50 crc kubenswrapper[4971]: I0320 07:34:50.162101 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:34:50 crc kubenswrapper[4971]: I0320 07:34:50.162718 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:34:54 crc kubenswrapper[4971]: I0320 07:34:54.776510 4971 scope.go:117] "RemoveContainer" containerID="f22b74d0d4523d5c0076fcb1fe68d737f3123f881309a7d13e8fed3cc654ce56" Mar 20 07:35:20 crc kubenswrapper[4971]: I0320 07:35:20.161968 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:35:20 crc kubenswrapper[4971]: I0320 07:35:20.162678 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:35:20 crc kubenswrapper[4971]: I0320 07:35:20.162738 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:35:20 crc kubenswrapper[4971]: I0320 07:35:20.163543 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"895dfb7e469b10223e69c2b2adeacdd2952b5bb050f2773ef438370babd44bba"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:35:20 crc kubenswrapper[4971]: I0320 07:35:20.163677 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://895dfb7e469b10223e69c2b2adeacdd2952b5bb050f2773ef438370babd44bba" gracePeriod=600 Mar 20 07:35:21 crc kubenswrapper[4971]: I0320 07:35:21.262314 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="895dfb7e469b10223e69c2b2adeacdd2952b5bb050f2773ef438370babd44bba" exitCode=0 Mar 20 07:35:21 crc kubenswrapper[4971]: I0320 07:35:21.262415 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"895dfb7e469b10223e69c2b2adeacdd2952b5bb050f2773ef438370babd44bba"} Mar 20 07:35:21 crc kubenswrapper[4971]: I0320 07:35:21.262745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809"} Mar 20 07:35:21 crc kubenswrapper[4971]: I0320 07:35:21.262774 4971 scope.go:117] "RemoveContainer" containerID="0fb36bb8261cef13ec2408924ecbe7eee5da04e46c14e79a7fb3e0e821bf601c" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.163517 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566536-mrhvj"] Mar 20 07:36:00 crc kubenswrapper[4971]: E0320 07:36:00.164319 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6610a358-42ec-41ec-af3b-87ae50dbe56c" containerName="oc" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.164336 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6610a358-42ec-41ec-af3b-87ae50dbe56c" containerName="oc" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.164526 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6610a358-42ec-41ec-af3b-87ae50dbe56c" containerName="oc" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.165175 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566536-mrhvj" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.169176 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.169429 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.178379 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.178856 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566536-mrhvj"] Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.324736 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq4sq\" (UniqueName: \"kubernetes.io/projected/8afdbf46-868e-4d6a-92ab-efe6212d89de-kube-api-access-pq4sq\") pod \"auto-csr-approver-29566536-mrhvj\" (UID: \"8afdbf46-868e-4d6a-92ab-efe6212d89de\") " pod="openshift-infra/auto-csr-approver-29566536-mrhvj" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.427477 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq4sq\" (UniqueName: \"kubernetes.io/projected/8afdbf46-868e-4d6a-92ab-efe6212d89de-kube-api-access-pq4sq\") pod \"auto-csr-approver-29566536-mrhvj\" (UID: \"8afdbf46-868e-4d6a-92ab-efe6212d89de\") " pod="openshift-infra/auto-csr-approver-29566536-mrhvj" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.458626 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq4sq\" (UniqueName: \"kubernetes.io/projected/8afdbf46-868e-4d6a-92ab-efe6212d89de-kube-api-access-pq4sq\") pod \"auto-csr-approver-29566536-mrhvj\" (UID: \"8afdbf46-868e-4d6a-92ab-efe6212d89de\") " 
pod="openshift-infra/auto-csr-approver-29566536-mrhvj" Mar 20 07:36:00 crc kubenswrapper[4971]: I0320 07:36:00.516956 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566536-mrhvj" Mar 20 07:36:01 crc kubenswrapper[4971]: I0320 07:36:01.020283 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566536-mrhvj"] Mar 20 07:36:01 crc kubenswrapper[4971]: I0320 07:36:01.643675 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566536-mrhvj" event={"ID":"8afdbf46-868e-4d6a-92ab-efe6212d89de","Type":"ContainerStarted","Data":"a5373d1fced8b93d0f4d1415b64a8c552d8d8bc68d258995dde3e075835bb8e6"} Mar 20 07:36:02 crc kubenswrapper[4971]: I0320 07:36:02.655675 4971 generic.go:334] "Generic (PLEG): container finished" podID="8afdbf46-868e-4d6a-92ab-efe6212d89de" containerID="f11dc4b5f798331b479b0e5d22e2922af37342f1091aa5cd55d8149a0c066dab" exitCode=0 Mar 20 07:36:02 crc kubenswrapper[4971]: I0320 07:36:02.655735 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566536-mrhvj" event={"ID":"8afdbf46-868e-4d6a-92ab-efe6212d89de","Type":"ContainerDied","Data":"f11dc4b5f798331b479b0e5d22e2922af37342f1091aa5cd55d8149a0c066dab"} Mar 20 07:36:04 crc kubenswrapper[4971]: I0320 07:36:04.091420 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566536-mrhvj" Mar 20 07:36:04 crc kubenswrapper[4971]: I0320 07:36:04.202869 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq4sq\" (UniqueName: \"kubernetes.io/projected/8afdbf46-868e-4d6a-92ab-efe6212d89de-kube-api-access-pq4sq\") pod \"8afdbf46-868e-4d6a-92ab-efe6212d89de\" (UID: \"8afdbf46-868e-4d6a-92ab-efe6212d89de\") " Mar 20 07:36:04 crc kubenswrapper[4971]: I0320 07:36:04.212585 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8afdbf46-868e-4d6a-92ab-efe6212d89de-kube-api-access-pq4sq" (OuterVolumeSpecName: "kube-api-access-pq4sq") pod "8afdbf46-868e-4d6a-92ab-efe6212d89de" (UID: "8afdbf46-868e-4d6a-92ab-efe6212d89de"). InnerVolumeSpecName "kube-api-access-pq4sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:36:04 crc kubenswrapper[4971]: I0320 07:36:04.304996 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq4sq\" (UniqueName: \"kubernetes.io/projected/8afdbf46-868e-4d6a-92ab-efe6212d89de-kube-api-access-pq4sq\") on node \"crc\" DevicePath \"\"" Mar 20 07:36:04 crc kubenswrapper[4971]: I0320 07:36:04.681157 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566536-mrhvj" event={"ID":"8afdbf46-868e-4d6a-92ab-efe6212d89de","Type":"ContainerDied","Data":"a5373d1fced8b93d0f4d1415b64a8c552d8d8bc68d258995dde3e075835bb8e6"} Mar 20 07:36:04 crc kubenswrapper[4971]: I0320 07:36:04.681215 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5373d1fced8b93d0f4d1415b64a8c552d8d8bc68d258995dde3e075835bb8e6" Mar 20 07:36:04 crc kubenswrapper[4971]: I0320 07:36:04.681333 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566536-mrhvj" Mar 20 07:36:05 crc kubenswrapper[4971]: I0320 07:36:05.177801 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566530-vfnhc"] Mar 20 07:36:05 crc kubenswrapper[4971]: I0320 07:36:05.183794 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566530-vfnhc"] Mar 20 07:36:06 crc kubenswrapper[4971]: I0320 07:36:06.750227 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26312162-6c67-44f8-92d2-9a65b8bbb4d8" path="/var/lib/kubelet/pods/26312162-6c67-44f8-92d2-9a65b8bbb4d8/volumes" Mar 20 07:36:54 crc kubenswrapper[4971]: I0320 07:36:54.890579 4971 scope.go:117] "RemoveContainer" containerID="ddfa582730f08684af4b5fc818fcfc779d6ef1674b3ed611f43eb16de8369691" Mar 20 07:37:20 crc kubenswrapper[4971]: I0320 07:37:20.162938 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:37:20 crc kubenswrapper[4971]: I0320 07:37:20.165513 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:37:50 crc kubenswrapper[4971]: I0320 07:37:50.163110 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:37:50 crc kubenswrapper[4971]: 
I0320 07:37:50.163513 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.131757 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566538-mtkzh"] Mar 20 07:38:00 crc kubenswrapper[4971]: E0320 07:38:00.132497 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8afdbf46-868e-4d6a-92ab-efe6212d89de" containerName="oc" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.132509 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afdbf46-868e-4d6a-92ab-efe6212d89de" containerName="oc" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.132661 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8afdbf46-868e-4d6a-92ab-efe6212d89de" containerName="oc" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.133057 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566538-mtkzh" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.134882 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.135186 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.135857 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.147471 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566538-mtkzh"] Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.209655 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6765\" (UniqueName: \"kubernetes.io/projected/0f83c433-e498-48f0-8e6d-b0ccf8cf41a4-kube-api-access-m6765\") pod \"auto-csr-approver-29566538-mtkzh\" (UID: \"0f83c433-e498-48f0-8e6d-b0ccf8cf41a4\") " pod="openshift-infra/auto-csr-approver-29566538-mtkzh" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.311113 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6765\" (UniqueName: \"kubernetes.io/projected/0f83c433-e498-48f0-8e6d-b0ccf8cf41a4-kube-api-access-m6765\") pod \"auto-csr-approver-29566538-mtkzh\" (UID: \"0f83c433-e498-48f0-8e6d-b0ccf8cf41a4\") " pod="openshift-infra/auto-csr-approver-29566538-mtkzh" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.336348 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6765\" (UniqueName: \"kubernetes.io/projected/0f83c433-e498-48f0-8e6d-b0ccf8cf41a4-kube-api-access-m6765\") pod \"auto-csr-approver-29566538-mtkzh\" (UID: \"0f83c433-e498-48f0-8e6d-b0ccf8cf41a4\") " 
pod="openshift-infra/auto-csr-approver-29566538-mtkzh" Mar 20 07:38:00 crc kubenswrapper[4971]: I0320 07:38:00.456217 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566538-mtkzh" Mar 20 07:38:01 crc kubenswrapper[4971]: I0320 07:38:01.020273 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566538-mtkzh"] Mar 20 07:38:01 crc kubenswrapper[4971]: I0320 07:38:01.662533 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566538-mtkzh" event={"ID":"0f83c433-e498-48f0-8e6d-b0ccf8cf41a4","Type":"ContainerStarted","Data":"98faeaa7b069d2e40e570f603269bb7d3fc0b28f95c2ab8e9b69c8b1acfb5682"} Mar 20 07:38:02 crc kubenswrapper[4971]: I0320 07:38:02.671096 4971 generic.go:334] "Generic (PLEG): container finished" podID="0f83c433-e498-48f0-8e6d-b0ccf8cf41a4" containerID="1a9e9869e3bd317cdb8512ff559deeef4e0e83fb6b1da3fb62f9a246fb8d92e0" exitCode=0 Mar 20 07:38:02 crc kubenswrapper[4971]: I0320 07:38:02.671206 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566538-mtkzh" event={"ID":"0f83c433-e498-48f0-8e6d-b0ccf8cf41a4","Type":"ContainerDied","Data":"1a9e9869e3bd317cdb8512ff559deeef4e0e83fb6b1da3fb62f9a246fb8d92e0"} Mar 20 07:38:04 crc kubenswrapper[4971]: I0320 07:38:04.002514 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566538-mtkzh" Mar 20 07:38:04 crc kubenswrapper[4971]: I0320 07:38:04.092523 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6765\" (UniqueName: \"kubernetes.io/projected/0f83c433-e498-48f0-8e6d-b0ccf8cf41a4-kube-api-access-m6765\") pod \"0f83c433-e498-48f0-8e6d-b0ccf8cf41a4\" (UID: \"0f83c433-e498-48f0-8e6d-b0ccf8cf41a4\") " Mar 20 07:38:04 crc kubenswrapper[4971]: I0320 07:38:04.098374 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f83c433-e498-48f0-8e6d-b0ccf8cf41a4-kube-api-access-m6765" (OuterVolumeSpecName: "kube-api-access-m6765") pod "0f83c433-e498-48f0-8e6d-b0ccf8cf41a4" (UID: "0f83c433-e498-48f0-8e6d-b0ccf8cf41a4"). InnerVolumeSpecName "kube-api-access-m6765". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:38:04 crc kubenswrapper[4971]: I0320 07:38:04.193984 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6765\" (UniqueName: \"kubernetes.io/projected/0f83c433-e498-48f0-8e6d-b0ccf8cf41a4-kube-api-access-m6765\") on node \"crc\" DevicePath \"\"" Mar 20 07:38:04 crc kubenswrapper[4971]: I0320 07:38:04.695193 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566538-mtkzh" event={"ID":"0f83c433-e498-48f0-8e6d-b0ccf8cf41a4","Type":"ContainerDied","Data":"98faeaa7b069d2e40e570f603269bb7d3fc0b28f95c2ab8e9b69c8b1acfb5682"} Mar 20 07:38:04 crc kubenswrapper[4971]: I0320 07:38:04.695245 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98faeaa7b069d2e40e570f603269bb7d3fc0b28f95c2ab8e9b69c8b1acfb5682" Mar 20 07:38:04 crc kubenswrapper[4971]: I0320 07:38:04.695329 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566538-mtkzh" Mar 20 07:38:05 crc kubenswrapper[4971]: I0320 07:38:05.099143 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566532-bgsgz"] Mar 20 07:38:05 crc kubenswrapper[4971]: I0320 07:38:05.112395 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566532-bgsgz"] Mar 20 07:38:06 crc kubenswrapper[4971]: I0320 07:38:06.758518 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfec0fdd-e1d3-467a-9288-17fd98bb57bf" path="/var/lib/kubelet/pods/dfec0fdd-e1d3-467a-9288-17fd98bb57bf/volumes" Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.163079 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.165336 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.165553 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.166681 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.166934 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" gracePeriod=600 Mar 20 07:38:20 crc kubenswrapper[4971]: E0320 07:38:20.301071 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.834276 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" exitCode=0 Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.834322 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809"} Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.834355 4971 scope.go:117] "RemoveContainer" containerID="895dfb7e469b10223e69c2b2adeacdd2952b5bb050f2773ef438370babd44bba" Mar 20 07:38:20 crc kubenswrapper[4971]: I0320 07:38:20.835215 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:38:20 crc kubenswrapper[4971]: E0320 07:38:20.835668 4971 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:38:33 crc kubenswrapper[4971]: I0320 07:38:33.732643 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:38:33 crc kubenswrapper[4971]: E0320 07:38:33.734017 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:38:48 crc kubenswrapper[4971]: I0320 07:38:48.739987 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:38:48 crc kubenswrapper[4971]: E0320 07:38:48.742146 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:38:52 crc kubenswrapper[4971]: I0320 07:38:52.826793 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mlpf8"] Mar 20 07:38:52 crc kubenswrapper[4971]: E0320 
07:38:52.827531 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f83c433-e498-48f0-8e6d-b0ccf8cf41a4" containerName="oc" Mar 20 07:38:52 crc kubenswrapper[4971]: I0320 07:38:52.827552 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f83c433-e498-48f0-8e6d-b0ccf8cf41a4" containerName="oc" Mar 20 07:38:52 crc kubenswrapper[4971]: I0320 07:38:52.827830 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f83c433-e498-48f0-8e6d-b0ccf8cf41a4" containerName="oc" Mar 20 07:38:52 crc kubenswrapper[4971]: I0320 07:38:52.829340 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:52 crc kubenswrapper[4971]: I0320 07:38:52.845236 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlpf8"] Mar 20 07:38:52 crc kubenswrapper[4971]: I0320 07:38:52.943242 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-utilities\") pod \"redhat-marketplace-mlpf8\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:52 crc kubenswrapper[4971]: I0320 07:38:52.943701 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-catalog-content\") pod \"redhat-marketplace-mlpf8\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:52 crc kubenswrapper[4971]: I0320 07:38:52.943770 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6q4s\" (UniqueName: \"kubernetes.io/projected/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-kube-api-access-g6q4s\") pod 
\"redhat-marketplace-mlpf8\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:53 crc kubenswrapper[4971]: I0320 07:38:53.045046 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-utilities\") pod \"redhat-marketplace-mlpf8\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:53 crc kubenswrapper[4971]: I0320 07:38:53.045132 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-catalog-content\") pod \"redhat-marketplace-mlpf8\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:53 crc kubenswrapper[4971]: I0320 07:38:53.045213 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6q4s\" (UniqueName: \"kubernetes.io/projected/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-kube-api-access-g6q4s\") pod \"redhat-marketplace-mlpf8\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:53 crc kubenswrapper[4971]: I0320 07:38:53.045789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-catalog-content\") pod \"redhat-marketplace-mlpf8\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:53 crc kubenswrapper[4971]: I0320 07:38:53.046176 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-utilities\") pod \"redhat-marketplace-mlpf8\" (UID: 
\"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:53 crc kubenswrapper[4971]: I0320 07:38:53.077791 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6q4s\" (UniqueName: \"kubernetes.io/projected/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-kube-api-access-g6q4s\") pod \"redhat-marketplace-mlpf8\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:53 crc kubenswrapper[4971]: I0320 07:38:53.155509 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:38:53 crc kubenswrapper[4971]: I0320 07:38:53.626224 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlpf8"] Mar 20 07:38:54 crc kubenswrapper[4971]: I0320 07:38:54.347369 4971 generic.go:334] "Generic (PLEG): container finished" podID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerID="d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3" exitCode=0 Mar 20 07:38:54 crc kubenswrapper[4971]: I0320 07:38:54.347718 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlpf8" event={"ID":"80f6b755-a40f-4bda-92d9-ce8d13eaaf86","Type":"ContainerDied","Data":"d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3"} Mar 20 07:38:54 crc kubenswrapper[4971]: I0320 07:38:54.347829 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlpf8" event={"ID":"80f6b755-a40f-4bda-92d9-ce8d13eaaf86","Type":"ContainerStarted","Data":"d2cd1d9df483644d95255fc77a7d3a982701d3f38bf6878252fe80f1072dde40"} Mar 20 07:38:55 crc kubenswrapper[4971]: I0320 07:38:55.006010 4971 scope.go:117] "RemoveContainer" containerID="bb1939835f6bb5056120f2dc0ddad5a6b66d9bd46a2e39b3a669f97e55507f28" Mar 20 07:38:55 crc kubenswrapper[4971]: I0320 
07:38:55.361501 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlpf8" event={"ID":"80f6b755-a40f-4bda-92d9-ce8d13eaaf86","Type":"ContainerStarted","Data":"b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36"} Mar 20 07:38:56 crc kubenswrapper[4971]: I0320 07:38:56.372099 4971 generic.go:334] "Generic (PLEG): container finished" podID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerID="b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36" exitCode=0 Mar 20 07:38:56 crc kubenswrapper[4971]: I0320 07:38:56.372350 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlpf8" event={"ID":"80f6b755-a40f-4bda-92d9-ce8d13eaaf86","Type":"ContainerDied","Data":"b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36"} Mar 20 07:38:57 crc kubenswrapper[4971]: I0320 07:38:57.382555 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlpf8" event={"ID":"80f6b755-a40f-4bda-92d9-ce8d13eaaf86","Type":"ContainerStarted","Data":"8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c"} Mar 20 07:38:57 crc kubenswrapper[4971]: I0320 07:38:57.412374 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mlpf8" podStartSLOduration=2.945236057 podStartE2EDuration="5.412353558s" podCreationTimestamp="2026-03-20 07:38:52 +0000 UTC" firstStartedPulling="2026-03-20 07:38:54.351836122 +0000 UTC m=+2956.331710270" lastFinishedPulling="2026-03-20 07:38:56.818953593 +0000 UTC m=+2958.798827771" observedRunningTime="2026-03-20 07:38:57.408741984 +0000 UTC m=+2959.388616152" watchObservedRunningTime="2026-03-20 07:38:57.412353558 +0000 UTC m=+2959.392227706" Mar 20 07:38:59 crc kubenswrapper[4971]: I0320 07:38:59.732447 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 
07:38:59 crc kubenswrapper[4971]: E0320 07:38:59.734215 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:39:03 crc kubenswrapper[4971]: I0320 07:39:03.156683 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:39:03 crc kubenswrapper[4971]: I0320 07:39:03.157078 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:39:03 crc kubenswrapper[4971]: I0320 07:39:03.226560 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:39:03 crc kubenswrapper[4971]: I0320 07:39:03.488327 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:39:03 crc kubenswrapper[4971]: I0320 07:39:03.541102 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlpf8"] Mar 20 07:39:05 crc kubenswrapper[4971]: I0320 07:39:05.456723 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mlpf8" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerName="registry-server" containerID="cri-o://8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c" gracePeriod=2 Mar 20 07:39:05 crc kubenswrapper[4971]: I0320 07:39:05.932449 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.043739 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-utilities\") pod \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.043862 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6q4s\" (UniqueName: \"kubernetes.io/projected/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-kube-api-access-g6q4s\") pod \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.043897 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-catalog-content\") pod \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\" (UID: \"80f6b755-a40f-4bda-92d9-ce8d13eaaf86\") " Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.045126 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-utilities" (OuterVolumeSpecName: "utilities") pod "80f6b755-a40f-4bda-92d9-ce8d13eaaf86" (UID: "80f6b755-a40f-4bda-92d9-ce8d13eaaf86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.050165 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-kube-api-access-g6q4s" (OuterVolumeSpecName: "kube-api-access-g6q4s") pod "80f6b755-a40f-4bda-92d9-ce8d13eaaf86" (UID: "80f6b755-a40f-4bda-92d9-ce8d13eaaf86"). InnerVolumeSpecName "kube-api-access-g6q4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.089134 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80f6b755-a40f-4bda-92d9-ce8d13eaaf86" (UID: "80f6b755-a40f-4bda-92d9-ce8d13eaaf86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.145940 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.145984 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6q4s\" (UniqueName: \"kubernetes.io/projected/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-kube-api-access-g6q4s\") on node \"crc\" DevicePath \"\"" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.146005 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f6b755-a40f-4bda-92d9-ce8d13eaaf86-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.467480 4971 generic.go:334] "Generic (PLEG): container finished" podID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerID="8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c" exitCode=0 Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.467541 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mlpf8" event={"ID":"80f6b755-a40f-4bda-92d9-ce8d13eaaf86","Type":"ContainerDied","Data":"8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c"} Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.467581 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mlpf8" event={"ID":"80f6b755-a40f-4bda-92d9-ce8d13eaaf86","Type":"ContainerDied","Data":"d2cd1d9df483644d95255fc77a7d3a982701d3f38bf6878252fe80f1072dde40"} Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.467609 4971 scope.go:117] "RemoveContainer" containerID="8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.467678 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mlpf8" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.503330 4971 scope.go:117] "RemoveContainer" containerID="b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.531689 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlpf8"] Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.540522 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mlpf8"] Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.553983 4971 scope.go:117] "RemoveContainer" containerID="d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.574797 4971 scope.go:117] "RemoveContainer" containerID="8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c" Mar 20 07:39:06 crc kubenswrapper[4971]: E0320 07:39:06.575332 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c\": container with ID starting with 8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c not found: ID does not exist" containerID="8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.575367 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c"} err="failed to get container status \"8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c\": rpc error: code = NotFound desc = could not find container \"8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c\": container with ID starting with 8aef987267565c81b271c3b91571f9ab2091f4c6c604362542f07499c31d7e2c not found: ID does not exist" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.575386 4971 scope.go:117] "RemoveContainer" containerID="b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36" Mar 20 07:39:06 crc kubenswrapper[4971]: E0320 07:39:06.575949 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36\": container with ID starting with b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36 not found: ID does not exist" containerID="b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.576014 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36"} err="failed to get container status \"b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36\": rpc error: code = NotFound desc = could not find container \"b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36\": container with ID starting with b5337654061b34ce133533f8fa3d6589e0cadbe37e647fa95de2b25ce2909a36 not found: ID does not exist" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.576055 4971 scope.go:117] "RemoveContainer" containerID="d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3" Mar 20 07:39:06 crc kubenswrapper[4971]: E0320 
07:39:06.576449 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3\": container with ID starting with d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3 not found: ID does not exist" containerID="d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.576494 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3"} err="failed to get container status \"d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3\": rpc error: code = NotFound desc = could not find container \"d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3\": container with ID starting with d143795d8d952c9470d0b9d379680f99da187603244cd5f652f598aa9cac80b3 not found: ID does not exist" Mar 20 07:39:06 crc kubenswrapper[4971]: I0320 07:39:06.748266 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" path="/var/lib/kubelet/pods/80f6b755-a40f-4bda-92d9-ce8d13eaaf86/volumes" Mar 20 07:39:14 crc kubenswrapper[4971]: I0320 07:39:14.731894 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:39:14 crc kubenswrapper[4971]: E0320 07:39:14.732834 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:39:26 crc kubenswrapper[4971]: I0320 07:39:26.738542 
4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:39:26 crc kubenswrapper[4971]: E0320 07:39:26.740362 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:39:37 crc kubenswrapper[4971]: I0320 07:39:37.732369 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:39:37 crc kubenswrapper[4971]: E0320 07:39:37.733428 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:39:51 crc kubenswrapper[4971]: I0320 07:39:51.732535 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:39:51 crc kubenswrapper[4971]: E0320 07:39:51.733155 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 
07:40:00.158042 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566540-frh76"] Mar 20 07:40:00 crc kubenswrapper[4971]: E0320 07:40:00.159420 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerName="extract-content" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.159451 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerName="extract-content" Mar 20 07:40:00 crc kubenswrapper[4971]: E0320 07:40:00.159477 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerName="extract-utilities" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.159499 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerName="extract-utilities" Mar 20 07:40:00 crc kubenswrapper[4971]: E0320 07:40:00.159543 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerName="registry-server" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.159561 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerName="registry-server" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.159998 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f6b755-a40f-4bda-92d9-ce8d13eaaf86" containerName="registry-server" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.161045 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566540-frh76" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.163359 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.163507 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.165501 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.165699 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566540-frh76"] Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.316411 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpdzd\" (UniqueName: \"kubernetes.io/projected/1647ca10-1c76-45f9-82ad-4b89294b13f5-kube-api-access-fpdzd\") pod \"auto-csr-approver-29566540-frh76\" (UID: \"1647ca10-1c76-45f9-82ad-4b89294b13f5\") " pod="openshift-infra/auto-csr-approver-29566540-frh76" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.418112 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpdzd\" (UniqueName: \"kubernetes.io/projected/1647ca10-1c76-45f9-82ad-4b89294b13f5-kube-api-access-fpdzd\") pod \"auto-csr-approver-29566540-frh76\" (UID: \"1647ca10-1c76-45f9-82ad-4b89294b13f5\") " pod="openshift-infra/auto-csr-approver-29566540-frh76" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.439370 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpdzd\" (UniqueName: \"kubernetes.io/projected/1647ca10-1c76-45f9-82ad-4b89294b13f5-kube-api-access-fpdzd\") pod \"auto-csr-approver-29566540-frh76\" (UID: \"1647ca10-1c76-45f9-82ad-4b89294b13f5\") " 
pod="openshift-infra/auto-csr-approver-29566540-frh76" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.488527 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566540-frh76" Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.962850 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566540-frh76"] Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.970894 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:40:00 crc kubenswrapper[4971]: I0320 07:40:00.989375 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566540-frh76" event={"ID":"1647ca10-1c76-45f9-82ad-4b89294b13f5","Type":"ContainerStarted","Data":"646c740fe9065134103a3d5860abbc5f104519aab536aa04514b8bbbd6702a45"} Mar 20 07:40:03 crc kubenswrapper[4971]: I0320 07:40:03.009794 4971 generic.go:334] "Generic (PLEG): container finished" podID="1647ca10-1c76-45f9-82ad-4b89294b13f5" containerID="5fffa80226cf7c07e20384152a482dc061e11ae8a2088c022e9f51a8d5f4a519" exitCode=0 Mar 20 07:40:03 crc kubenswrapper[4971]: I0320 07:40:03.009901 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566540-frh76" event={"ID":"1647ca10-1c76-45f9-82ad-4b89294b13f5","Type":"ContainerDied","Data":"5fffa80226cf7c07e20384152a482dc061e11ae8a2088c022e9f51a8d5f4a519"} Mar 20 07:40:03 crc kubenswrapper[4971]: I0320 07:40:03.733178 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:40:03 crc kubenswrapper[4971]: E0320 07:40:03.733692 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:40:04 crc kubenswrapper[4971]: I0320 07:40:04.358302 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566540-frh76" Mar 20 07:40:04 crc kubenswrapper[4971]: I0320 07:40:04.476484 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpdzd\" (UniqueName: \"kubernetes.io/projected/1647ca10-1c76-45f9-82ad-4b89294b13f5-kube-api-access-fpdzd\") pod \"1647ca10-1c76-45f9-82ad-4b89294b13f5\" (UID: \"1647ca10-1c76-45f9-82ad-4b89294b13f5\") " Mar 20 07:40:04 crc kubenswrapper[4971]: I0320 07:40:04.481275 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1647ca10-1c76-45f9-82ad-4b89294b13f5-kube-api-access-fpdzd" (OuterVolumeSpecName: "kube-api-access-fpdzd") pod "1647ca10-1c76-45f9-82ad-4b89294b13f5" (UID: "1647ca10-1c76-45f9-82ad-4b89294b13f5"). InnerVolumeSpecName "kube-api-access-fpdzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:40:04 crc kubenswrapper[4971]: I0320 07:40:04.578255 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpdzd\" (UniqueName: \"kubernetes.io/projected/1647ca10-1c76-45f9-82ad-4b89294b13f5-kube-api-access-fpdzd\") on node \"crc\" DevicePath \"\"" Mar 20 07:40:05 crc kubenswrapper[4971]: I0320 07:40:05.025664 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566540-frh76" event={"ID":"1647ca10-1c76-45f9-82ad-4b89294b13f5","Type":"ContainerDied","Data":"646c740fe9065134103a3d5860abbc5f104519aab536aa04514b8bbbd6702a45"} Mar 20 07:40:05 crc kubenswrapper[4971]: I0320 07:40:05.025704 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="646c740fe9065134103a3d5860abbc5f104519aab536aa04514b8bbbd6702a45" Mar 20 07:40:05 crc kubenswrapper[4971]: I0320 07:40:05.025741 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566540-frh76" Mar 20 07:40:05 crc kubenswrapper[4971]: I0320 07:40:05.416786 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566534-nkrw8"] Mar 20 07:40:05 crc kubenswrapper[4971]: I0320 07:40:05.424184 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566534-nkrw8"] Mar 20 07:40:06 crc kubenswrapper[4971]: I0320 07:40:06.747928 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6610a358-42ec-41ec-af3b-87ae50dbe56c" path="/var/lib/kubelet/pods/6610a358-42ec-41ec-af3b-87ae50dbe56c/volumes" Mar 20 07:40:14 crc kubenswrapper[4971]: I0320 07:40:14.732266 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:40:14 crc kubenswrapper[4971]: E0320 07:40:14.733085 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:40:25 crc kubenswrapper[4971]: I0320 07:40:25.732090 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:40:25 crc kubenswrapper[4971]: E0320 07:40:25.732831 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:40:39 crc kubenswrapper[4971]: I0320 07:40:39.733632 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:40:39 crc kubenswrapper[4971]: E0320 07:40:39.734404 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:40:50 crc kubenswrapper[4971]: I0320 07:40:50.732986 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:40:50 crc kubenswrapper[4971]: E0320 07:40:50.736032 4971 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:40:55 crc kubenswrapper[4971]: I0320 07:40:55.154589 4971 scope.go:117] "RemoveContainer" containerID="60fab7dd49c29dea4bfa214904e08186d43eda2d4648af652a9367ccc044088a" Mar 20 07:41:05 crc kubenswrapper[4971]: I0320 07:41:05.732238 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:41:05 crc kubenswrapper[4971]: E0320 07:41:05.733197 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:41:20 crc kubenswrapper[4971]: I0320 07:41:20.732655 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:41:20 crc kubenswrapper[4971]: E0320 07:41:20.733378 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:41:35 crc kubenswrapper[4971]: I0320 07:41:35.732558 4971 scope.go:117] 
"RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:41:35 crc kubenswrapper[4971]: E0320 07:41:35.733717 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:41:48 crc kubenswrapper[4971]: I0320 07:41:48.742275 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:41:48 crc kubenswrapper[4971]: E0320 07:41:48.744727 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.342174 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-znt5t"] Mar 20 07:41:54 crc kubenswrapper[4971]: E0320 07:41:54.350458 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1647ca10-1c76-45f9-82ad-4b89294b13f5" containerName="oc" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.350501 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1647ca10-1c76-45f9-82ad-4b89294b13f5" containerName="oc" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.350940 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1647ca10-1c76-45f9-82ad-4b89294b13f5" containerName="oc" Mar 20 
07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.353515 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.374827 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-znt5t"] Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.455443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-utilities\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.455716 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz7qv\" (UniqueName: \"kubernetes.io/projected/70b6ac49-013b-4983-8f67-f6a498f3c317-kube-api-access-jz7qv\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.455908 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-catalog-content\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.557098 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz7qv\" (UniqueName: \"kubernetes.io/projected/70b6ac49-013b-4983-8f67-f6a498f3c317-kube-api-access-jz7qv\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " 
pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.557194 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-catalog-content\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.557241 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-utilities\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.557943 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-catalog-content\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.557994 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-utilities\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.578204 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz7qv\" (UniqueName: \"kubernetes.io/projected/70b6ac49-013b-4983-8f67-f6a498f3c317-kube-api-access-jz7qv\") pod \"certified-operators-znt5t\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " 
pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:54 crc kubenswrapper[4971]: I0320 07:41:54.700304 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:41:55 crc kubenswrapper[4971]: I0320 07:41:55.217797 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-znt5t"] Mar 20 07:41:56 crc kubenswrapper[4971]: I0320 07:41:56.036766 4971 generic.go:334] "Generic (PLEG): container finished" podID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerID="78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4" exitCode=0 Mar 20 07:41:56 crc kubenswrapper[4971]: I0320 07:41:56.036910 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znt5t" event={"ID":"70b6ac49-013b-4983-8f67-f6a498f3c317","Type":"ContainerDied","Data":"78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4"} Mar 20 07:41:56 crc kubenswrapper[4971]: I0320 07:41:56.037375 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znt5t" event={"ID":"70b6ac49-013b-4983-8f67-f6a498f3c317","Type":"ContainerStarted","Data":"c2fa9e1905798e4095df92e8ef28224737e63aa7e9a7e650aa5735dde9035676"} Mar 20 07:41:57 crc kubenswrapper[4971]: I0320 07:41:57.045559 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znt5t" event={"ID":"70b6ac49-013b-4983-8f67-f6a498f3c317","Type":"ContainerStarted","Data":"0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14"} Mar 20 07:41:58 crc kubenswrapper[4971]: I0320 07:41:58.057590 4971 generic.go:334] "Generic (PLEG): container finished" podID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerID="0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14" exitCode=0 Mar 20 07:41:58 crc kubenswrapper[4971]: I0320 07:41:58.057727 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-znt5t" event={"ID":"70b6ac49-013b-4983-8f67-f6a498f3c317","Type":"ContainerDied","Data":"0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14"} Mar 20 07:41:59 crc kubenswrapper[4971]: I0320 07:41:59.072362 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znt5t" event={"ID":"70b6ac49-013b-4983-8f67-f6a498f3c317","Type":"ContainerStarted","Data":"b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1"} Mar 20 07:41:59 crc kubenswrapper[4971]: I0320 07:41:59.103343 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-znt5t" podStartSLOduration=2.54609574 podStartE2EDuration="5.103318519s" podCreationTimestamp="2026-03-20 07:41:54 +0000 UTC" firstStartedPulling="2026-03-20 07:41:56.041249714 +0000 UTC m=+3138.021123892" lastFinishedPulling="2026-03-20 07:41:58.598472493 +0000 UTC m=+3140.578346671" observedRunningTime="2026-03-20 07:41:59.102816636 +0000 UTC m=+3141.082690864" watchObservedRunningTime="2026-03-20 07:41:59.103318519 +0000 UTC m=+3141.083192697" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.149781 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566542-z5c9s"] Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.151176 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566542-z5c9s" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.153565 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.154101 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.156526 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.160800 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566542-z5c9s"] Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.258834 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbtq\" (UniqueName: \"kubernetes.io/projected/092750ba-9445-418f-ba6a-f4d4456593e4-kube-api-access-2qbtq\") pod \"auto-csr-approver-29566542-z5c9s\" (UID: \"092750ba-9445-418f-ba6a-f4d4456593e4\") " pod="openshift-infra/auto-csr-approver-29566542-z5c9s" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.359694 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbtq\" (UniqueName: \"kubernetes.io/projected/092750ba-9445-418f-ba6a-f4d4456593e4-kube-api-access-2qbtq\") pod \"auto-csr-approver-29566542-z5c9s\" (UID: \"092750ba-9445-418f-ba6a-f4d4456593e4\") " pod="openshift-infra/auto-csr-approver-29566542-z5c9s" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.392649 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbtq\" (UniqueName: \"kubernetes.io/projected/092750ba-9445-418f-ba6a-f4d4456593e4-kube-api-access-2qbtq\") pod \"auto-csr-approver-29566542-z5c9s\" (UID: \"092750ba-9445-418f-ba6a-f4d4456593e4\") " 
pod="openshift-infra/auto-csr-approver-29566542-z5c9s" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.474927 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566542-z5c9s" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.734372 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:42:00 crc kubenswrapper[4971]: E0320 07:42:00.735046 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:42:00 crc kubenswrapper[4971]: I0320 07:42:00.923318 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566542-z5c9s"] Mar 20 07:42:00 crc kubenswrapper[4971]: W0320 07:42:00.929033 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod092750ba_9445_418f_ba6a_f4d4456593e4.slice/crio-7e544efd7638b96b8fa0318517d90ea016fcd8140e77605f1c7a3da0f3b58125 WatchSource:0}: Error finding container 7e544efd7638b96b8fa0318517d90ea016fcd8140e77605f1c7a3da0f3b58125: Status 404 returned error can't find the container with id 7e544efd7638b96b8fa0318517d90ea016fcd8140e77605f1c7a3da0f3b58125 Mar 20 07:42:01 crc kubenswrapper[4971]: I0320 07:42:01.090299 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566542-z5c9s" event={"ID":"092750ba-9445-418f-ba6a-f4d4456593e4","Type":"ContainerStarted","Data":"7e544efd7638b96b8fa0318517d90ea016fcd8140e77605f1c7a3da0f3b58125"} Mar 20 07:42:03 crc 
kubenswrapper[4971]: I0320 07:42:03.110082 4971 generic.go:334] "Generic (PLEG): container finished" podID="092750ba-9445-418f-ba6a-f4d4456593e4" containerID="79b43a743e05a6dd05c236f045b0497dae8384dc655396e5e94b8a847b2926d3" exitCode=0 Mar 20 07:42:03 crc kubenswrapper[4971]: I0320 07:42:03.110170 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566542-z5c9s" event={"ID":"092750ba-9445-418f-ba6a-f4d4456593e4","Type":"ContainerDied","Data":"79b43a743e05a6dd05c236f045b0497dae8384dc655396e5e94b8a847b2926d3"} Mar 20 07:42:04 crc kubenswrapper[4971]: I0320 07:42:04.490001 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566542-z5c9s" Mar 20 07:42:04 crc kubenswrapper[4971]: I0320 07:42:04.631443 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qbtq\" (UniqueName: \"kubernetes.io/projected/092750ba-9445-418f-ba6a-f4d4456593e4-kube-api-access-2qbtq\") pod \"092750ba-9445-418f-ba6a-f4d4456593e4\" (UID: \"092750ba-9445-418f-ba6a-f4d4456593e4\") " Mar 20 07:42:04 crc kubenswrapper[4971]: I0320 07:42:04.639329 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092750ba-9445-418f-ba6a-f4d4456593e4-kube-api-access-2qbtq" (OuterVolumeSpecName: "kube-api-access-2qbtq") pod "092750ba-9445-418f-ba6a-f4d4456593e4" (UID: "092750ba-9445-418f-ba6a-f4d4456593e4"). InnerVolumeSpecName "kube-api-access-2qbtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:42:04 crc kubenswrapper[4971]: I0320 07:42:04.702096 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:42:04 crc kubenswrapper[4971]: I0320 07:42:04.702147 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:42:04 crc kubenswrapper[4971]: I0320 07:42:04.734016 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qbtq\" (UniqueName: \"kubernetes.io/projected/092750ba-9445-418f-ba6a-f4d4456593e4-kube-api-access-2qbtq\") on node \"crc\" DevicePath \"\"" Mar 20 07:42:04 crc kubenswrapper[4971]: I0320 07:42:04.761909 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:42:05 crc kubenswrapper[4971]: I0320 07:42:05.132797 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566542-z5c9s" event={"ID":"092750ba-9445-418f-ba6a-f4d4456593e4","Type":"ContainerDied","Data":"7e544efd7638b96b8fa0318517d90ea016fcd8140e77605f1c7a3da0f3b58125"} Mar 20 07:42:05 crc kubenswrapper[4971]: I0320 07:42:05.132875 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566542-z5c9s" Mar 20 07:42:05 crc kubenswrapper[4971]: I0320 07:42:05.133529 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e544efd7638b96b8fa0318517d90ea016fcd8140e77605f1c7a3da0f3b58125" Mar 20 07:42:05 crc kubenswrapper[4971]: I0320 07:42:05.195953 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:42:05 crc kubenswrapper[4971]: I0320 07:42:05.267365 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-znt5t"] Mar 20 07:42:05 crc kubenswrapper[4971]: I0320 07:42:05.584717 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566536-mrhvj"] Mar 20 07:42:05 crc kubenswrapper[4971]: I0320 07:42:05.594734 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566536-mrhvj"] Mar 20 07:42:06 crc kubenswrapper[4971]: I0320 07:42:06.752082 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8afdbf46-868e-4d6a-92ab-efe6212d89de" path="/var/lib/kubelet/pods/8afdbf46-868e-4d6a-92ab-efe6212d89de/volumes" Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.151009 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-znt5t" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerName="registry-server" containerID="cri-o://b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1" gracePeriod=2 Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.733896 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.886016 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-catalog-content\") pod \"70b6ac49-013b-4983-8f67-f6a498f3c317\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.886084 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-utilities\") pod \"70b6ac49-013b-4983-8f67-f6a498f3c317\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.886191 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz7qv\" (UniqueName: \"kubernetes.io/projected/70b6ac49-013b-4983-8f67-f6a498f3c317-kube-api-access-jz7qv\") pod \"70b6ac49-013b-4983-8f67-f6a498f3c317\" (UID: \"70b6ac49-013b-4983-8f67-f6a498f3c317\") " Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.888181 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-utilities" (OuterVolumeSpecName: "utilities") pod "70b6ac49-013b-4983-8f67-f6a498f3c317" (UID: "70b6ac49-013b-4983-8f67-f6a498f3c317"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.897374 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b6ac49-013b-4983-8f67-f6a498f3c317-kube-api-access-jz7qv" (OuterVolumeSpecName: "kube-api-access-jz7qv") pod "70b6ac49-013b-4983-8f67-f6a498f3c317" (UID: "70b6ac49-013b-4983-8f67-f6a498f3c317"). InnerVolumeSpecName "kube-api-access-jz7qv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.988649 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz7qv\" (UniqueName: \"kubernetes.io/projected/70b6ac49-013b-4983-8f67-f6a498f3c317-kube-api-access-jz7qv\") on node \"crc\" DevicePath \"\"" Mar 20 07:42:07 crc kubenswrapper[4971]: I0320 07:42:07.988693 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.002337 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70b6ac49-013b-4983-8f67-f6a498f3c317" (UID: "70b6ac49-013b-4983-8f67-f6a498f3c317"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.089988 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b6ac49-013b-4983-8f67-f6a498f3c317-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.169655 4971 generic.go:334] "Generic (PLEG): container finished" podID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerID="b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1" exitCode=0 Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.169775 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znt5t" event={"ID":"70b6ac49-013b-4983-8f67-f6a498f3c317","Type":"ContainerDied","Data":"b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1"} Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.169826 4971 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-znt5t" event={"ID":"70b6ac49-013b-4983-8f67-f6a498f3c317","Type":"ContainerDied","Data":"c2fa9e1905798e4095df92e8ef28224737e63aa7e9a7e650aa5735dde9035676"} Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.169863 4971 scope.go:117] "RemoveContainer" containerID="b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.170385 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-znt5t" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.215014 4971 scope.go:117] "RemoveContainer" containerID="0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.229651 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-znt5t"] Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.238969 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-znt5t"] Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.249498 4971 scope.go:117] "RemoveContainer" containerID="78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.285733 4971 scope.go:117] "RemoveContainer" containerID="b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1" Mar 20 07:42:08 crc kubenswrapper[4971]: E0320 07:42:08.286566 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1\": container with ID starting with b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1 not found: ID does not exist" containerID="b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 
07:42:08.286699 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1"} err="failed to get container status \"b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1\": rpc error: code = NotFound desc = could not find container \"b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1\": container with ID starting with b2567a9071058e7582891f0ba7eb76ce5a357af261b1bce6caba2569399c40d1 not found: ID does not exist" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.286740 4971 scope.go:117] "RemoveContainer" containerID="0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14" Mar 20 07:42:08 crc kubenswrapper[4971]: E0320 07:42:08.287249 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14\": container with ID starting with 0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14 not found: ID does not exist" containerID="0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.287320 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14"} err="failed to get container status \"0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14\": rpc error: code = NotFound desc = could not find container \"0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14\": container with ID starting with 0103ba473bb6accab86b265cd520ccf4b6ac2b5e7373f8351d548095ae5a6c14 not found: ID does not exist" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.287369 4971 scope.go:117] "RemoveContainer" containerID="78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4" Mar 20 07:42:08 crc 
kubenswrapper[4971]: E0320 07:42:08.287937 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4\": container with ID starting with 78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4 not found: ID does not exist" containerID="78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.287978 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4"} err="failed to get container status \"78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4\": rpc error: code = NotFound desc = could not find container \"78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4\": container with ID starting with 78c5fdd4ee5b29bb29507fbfd957d3716500fee845d3849bda14ba26700565c4 not found: ID does not exist" Mar 20 07:42:08 crc kubenswrapper[4971]: I0320 07:42:08.757133 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" path="/var/lib/kubelet/pods/70b6ac49-013b-4983-8f67-f6a498f3c317/volumes" Mar 20 07:42:13 crc kubenswrapper[4971]: I0320 07:42:13.732455 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:42:13 crc kubenswrapper[4971]: E0320 07:42:13.733420 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:42:28 crc 
kubenswrapper[4971]: I0320 07:42:28.740507 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:42:28 crc kubenswrapper[4971]: E0320 07:42:28.741666 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:42:42 crc kubenswrapper[4971]: I0320 07:42:42.732540 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:42:42 crc kubenswrapper[4971]: E0320 07:42:42.733457 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:42:53 crc kubenswrapper[4971]: I0320 07:42:53.732458 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:42:53 crc kubenswrapper[4971]: E0320 07:42:53.733154 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 
20 07:42:55 crc kubenswrapper[4971]: I0320 07:42:55.281818 4971 scope.go:117] "RemoveContainer" containerID="f11dc4b5f798331b479b0e5d22e2922af37342f1091aa5cd55d8149a0c066dab" Mar 20 07:43:06 crc kubenswrapper[4971]: I0320 07:43:06.732773 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:43:06 crc kubenswrapper[4971]: E0320 07:43:06.734039 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:43:20 crc kubenswrapper[4971]: I0320 07:43:20.739933 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:43:21 crc kubenswrapper[4971]: I0320 07:43:21.880477 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"9b0eb5d686da2e08fdc8fe5d8bad3b653b6ff89379afb776c56d2edbf4453130"} Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.157297 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566544-gj7nc"] Mar 20 07:44:00 crc kubenswrapper[4971]: E0320 07:44:00.158256 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerName="extract-utilities" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.158276 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerName="extract-utilities" Mar 20 07:44:00 crc kubenswrapper[4971]: E0320 07:44:00.158293 
4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerName="registry-server" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.158301 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerName="registry-server" Mar 20 07:44:00 crc kubenswrapper[4971]: E0320 07:44:00.158322 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerName="extract-content" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.158330 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerName="extract-content" Mar 20 07:44:00 crc kubenswrapper[4971]: E0320 07:44:00.158355 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092750ba-9445-418f-ba6a-f4d4456593e4" containerName="oc" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.158366 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="092750ba-9445-418f-ba6a-f4d4456593e4" containerName="oc" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.158546 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="092750ba-9445-418f-ba6a-f4d4456593e4" containerName="oc" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.158565 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b6ac49-013b-4983-8f67-f6a498f3c317" containerName="registry-server" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.159210 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.162821 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.165881 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.170052 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.177785 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566544-gj7nc"] Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.191785 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52csk\" (UniqueName: \"kubernetes.io/projected/d789ba76-0170-4d11-905a-afe2ddf78f14-kube-api-access-52csk\") pod \"auto-csr-approver-29566544-gj7nc\" (UID: \"d789ba76-0170-4d11-905a-afe2ddf78f14\") " pod="openshift-infra/auto-csr-approver-29566544-gj7nc" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.295412 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52csk\" (UniqueName: \"kubernetes.io/projected/d789ba76-0170-4d11-905a-afe2ddf78f14-kube-api-access-52csk\") pod \"auto-csr-approver-29566544-gj7nc\" (UID: \"d789ba76-0170-4d11-905a-afe2ddf78f14\") " pod="openshift-infra/auto-csr-approver-29566544-gj7nc" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.317973 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52csk\" (UniqueName: \"kubernetes.io/projected/d789ba76-0170-4d11-905a-afe2ddf78f14-kube-api-access-52csk\") pod \"auto-csr-approver-29566544-gj7nc\" (UID: \"d789ba76-0170-4d11-905a-afe2ddf78f14\") " 
pod="openshift-infra/auto-csr-approver-29566544-gj7nc" Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.483498 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" Mar 20 07:44:00 crc kubenswrapper[4971]: W0320 07:44:00.740723 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd789ba76_0170_4d11_905a_afe2ddf78f14.slice/crio-91ac1842a6481e481a9e1e6d71d1b5a2d4c5046972052fb4f642a20643a58911 WatchSource:0}: Error finding container 91ac1842a6481e481a9e1e6d71d1b5a2d4c5046972052fb4f642a20643a58911: Status 404 returned error can't find the container with id 91ac1842a6481e481a9e1e6d71d1b5a2d4c5046972052fb4f642a20643a58911 Mar 20 07:44:00 crc kubenswrapper[4971]: I0320 07:44:00.743432 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566544-gj7nc"] Mar 20 07:44:01 crc kubenswrapper[4971]: I0320 07:44:01.198334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" event={"ID":"d789ba76-0170-4d11-905a-afe2ddf78f14","Type":"ContainerStarted","Data":"91ac1842a6481e481a9e1e6d71d1b5a2d4c5046972052fb4f642a20643a58911"} Mar 20 07:44:02 crc kubenswrapper[4971]: I0320 07:44:02.205381 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" event={"ID":"d789ba76-0170-4d11-905a-afe2ddf78f14","Type":"ContainerStarted","Data":"256217ee7227a11698e46a294091dced7833eb4ea30a7288d38cd9ef7f447794"} Mar 20 07:44:02 crc kubenswrapper[4971]: I0320 07:44:02.223077 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" podStartSLOduration=1.283469979 podStartE2EDuration="2.223062045s" podCreationTimestamp="2026-03-20 07:44:00 +0000 UTC" firstStartedPulling="2026-03-20 07:44:00.743727844 +0000 UTC 
m=+3262.723601982" lastFinishedPulling="2026-03-20 07:44:01.68331991 +0000 UTC m=+3263.663194048" observedRunningTime="2026-03-20 07:44:02.218750602 +0000 UTC m=+3264.198624740" watchObservedRunningTime="2026-03-20 07:44:02.223062045 +0000 UTC m=+3264.202936183" Mar 20 07:44:03 crc kubenswrapper[4971]: I0320 07:44:03.215319 4971 generic.go:334] "Generic (PLEG): container finished" podID="d789ba76-0170-4d11-905a-afe2ddf78f14" containerID="256217ee7227a11698e46a294091dced7833eb4ea30a7288d38cd9ef7f447794" exitCode=0 Mar 20 07:44:03 crc kubenswrapper[4971]: I0320 07:44:03.215434 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" event={"ID":"d789ba76-0170-4d11-905a-afe2ddf78f14","Type":"ContainerDied","Data":"256217ee7227a11698e46a294091dced7833eb4ea30a7288d38cd9ef7f447794"} Mar 20 07:44:04 crc kubenswrapper[4971]: I0320 07:44:04.592256 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" Mar 20 07:44:04 crc kubenswrapper[4971]: I0320 07:44:04.690738 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52csk\" (UniqueName: \"kubernetes.io/projected/d789ba76-0170-4d11-905a-afe2ddf78f14-kube-api-access-52csk\") pod \"d789ba76-0170-4d11-905a-afe2ddf78f14\" (UID: \"d789ba76-0170-4d11-905a-afe2ddf78f14\") " Mar 20 07:44:04 crc kubenswrapper[4971]: I0320 07:44:04.698925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d789ba76-0170-4d11-905a-afe2ddf78f14-kube-api-access-52csk" (OuterVolumeSpecName: "kube-api-access-52csk") pod "d789ba76-0170-4d11-905a-afe2ddf78f14" (UID: "d789ba76-0170-4d11-905a-afe2ddf78f14"). InnerVolumeSpecName "kube-api-access-52csk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:44:04 crc kubenswrapper[4971]: I0320 07:44:04.793160 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52csk\" (UniqueName: \"kubernetes.io/projected/d789ba76-0170-4d11-905a-afe2ddf78f14-kube-api-access-52csk\") on node \"crc\" DevicePath \"\"" Mar 20 07:44:05 crc kubenswrapper[4971]: I0320 07:44:05.232563 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" event={"ID":"d789ba76-0170-4d11-905a-afe2ddf78f14","Type":"ContainerDied","Data":"91ac1842a6481e481a9e1e6d71d1b5a2d4c5046972052fb4f642a20643a58911"} Mar 20 07:44:05 crc kubenswrapper[4971]: I0320 07:44:05.232630 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ac1842a6481e481a9e1e6d71d1b5a2d4c5046972052fb4f642a20643a58911" Mar 20 07:44:05 crc kubenswrapper[4971]: I0320 07:44:05.232682 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566544-gj7nc" Mar 20 07:44:05 crc kubenswrapper[4971]: I0320 07:44:05.310223 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566538-mtkzh"] Mar 20 07:44:05 crc kubenswrapper[4971]: I0320 07:44:05.320686 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566538-mtkzh"] Mar 20 07:44:06 crc kubenswrapper[4971]: I0320 07:44:06.745476 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f83c433-e498-48f0-8e6d-b0ccf8cf41a4" path="/var/lib/kubelet/pods/0f83c433-e498-48f0-8e6d-b0ccf8cf41a4/volumes" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.351576 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2kjvp"] Mar 20 07:44:39 crc kubenswrapper[4971]: E0320 07:44:39.352885 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d789ba76-0170-4d11-905a-afe2ddf78f14" containerName="oc" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.352916 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d789ba76-0170-4d11-905a-afe2ddf78f14" containerName="oc" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.353215 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d789ba76-0170-4d11-905a-afe2ddf78f14" containerName="oc" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.355304 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.408785 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-utilities\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.408899 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-catalog-content\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.408938 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wts\" (UniqueName: \"kubernetes.io/projected/b32e59f7-9098-4b36-bd54-793975f9d73a-kube-api-access-j2wts\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.420873 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-2kjvp"] Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.510010 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-catalog-content\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.510069 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wts\" (UniqueName: \"kubernetes.io/projected/b32e59f7-9098-4b36-bd54-793975f9d73a-kube-api-access-j2wts\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.510141 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-utilities\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.510633 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-catalog-content\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.510659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-utilities\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " 
pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.533458 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wts\" (UniqueName: \"kubernetes.io/projected/b32e59f7-9098-4b36-bd54-793975f9d73a-kube-api-access-j2wts\") pod \"community-operators-2kjvp\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:39 crc kubenswrapper[4971]: I0320 07:44:39.746393 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:40 crc kubenswrapper[4971]: I0320 07:44:40.262039 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2kjvp"] Mar 20 07:44:40 crc kubenswrapper[4971]: I0320 07:44:40.547194 4971 generic.go:334] "Generic (PLEG): container finished" podID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerID="eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88" exitCode=0 Mar 20 07:44:40 crc kubenswrapper[4971]: I0320 07:44:40.547247 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kjvp" event={"ID":"b32e59f7-9098-4b36-bd54-793975f9d73a","Type":"ContainerDied","Data":"eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88"} Mar 20 07:44:40 crc kubenswrapper[4971]: I0320 07:44:40.547294 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kjvp" event={"ID":"b32e59f7-9098-4b36-bd54-793975f9d73a","Type":"ContainerStarted","Data":"e5fb22f7f08b535bb6fcc704ffe49d068e7b00edae085174bfd1d78d06936d0b"} Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.139457 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b6xr5"] Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.142827 4971 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.152486 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-catalog-content\") pod \"redhat-operators-b6xr5\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.152565 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-utilities\") pod \"redhat-operators-b6xr5\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.152667 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v2xp\" (UniqueName: \"kubernetes.io/projected/440ff08e-8ef8-4f32-931f-36bd34d9a936-kube-api-access-9v2xp\") pod \"redhat-operators-b6xr5\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.253855 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-catalog-content\") pod \"redhat-operators-b6xr5\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.253911 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-utilities\") pod \"redhat-operators-b6xr5\" (UID: 
\"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.253955 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2xp\" (UniqueName: \"kubernetes.io/projected/440ff08e-8ef8-4f32-931f-36bd34d9a936-kube-api-access-9v2xp\") pod \"redhat-operators-b6xr5\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.254818 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-utilities\") pod \"redhat-operators-b6xr5\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.255362 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-catalog-content\") pod \"redhat-operators-b6xr5\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.268167 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6xr5"] Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.281235 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v2xp\" (UniqueName: \"kubernetes.io/projected/440ff08e-8ef8-4f32-931f-36bd34d9a936-kube-api-access-9v2xp\") pod \"redhat-operators-b6xr5\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.473215 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.570474 4971 generic.go:334] "Generic (PLEG): container finished" podID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerID="25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390" exitCode=0 Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.570728 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kjvp" event={"ID":"b32e59f7-9098-4b36-bd54-793975f9d73a","Type":"ContainerDied","Data":"25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390"} Mar 20 07:44:42 crc kubenswrapper[4971]: I0320 07:44:42.904383 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6xr5"] Mar 20 07:44:42 crc kubenswrapper[4971]: W0320 07:44:42.913589 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440ff08e_8ef8_4f32_931f_36bd34d9a936.slice/crio-af07d2cb22380544766ec93083e2a20210cae80eb229c88dd999fb3b361f8693 WatchSource:0}: Error finding container af07d2cb22380544766ec93083e2a20210cae80eb229c88dd999fb3b361f8693: Status 404 returned error can't find the container with id af07d2cb22380544766ec93083e2a20210cae80eb229c88dd999fb3b361f8693 Mar 20 07:44:43 crc kubenswrapper[4971]: I0320 07:44:43.579716 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kjvp" event={"ID":"b32e59f7-9098-4b36-bd54-793975f9d73a","Type":"ContainerStarted","Data":"ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3"} Mar 20 07:44:43 crc kubenswrapper[4971]: I0320 07:44:43.581492 4971 generic.go:334] "Generic (PLEG): container finished" podID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerID="c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6" exitCode=0 Mar 20 07:44:43 crc kubenswrapper[4971]: I0320 
07:44:43.581537 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6xr5" event={"ID":"440ff08e-8ef8-4f32-931f-36bd34d9a936","Type":"ContainerDied","Data":"c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6"} Mar 20 07:44:43 crc kubenswrapper[4971]: I0320 07:44:43.581564 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6xr5" event={"ID":"440ff08e-8ef8-4f32-931f-36bd34d9a936","Type":"ContainerStarted","Data":"af07d2cb22380544766ec93083e2a20210cae80eb229c88dd999fb3b361f8693"} Mar 20 07:44:43 crc kubenswrapper[4971]: I0320 07:44:43.607261 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2kjvp" podStartSLOduration=2.130783245 podStartE2EDuration="4.607243047s" podCreationTimestamp="2026-03-20 07:44:39 +0000 UTC" firstStartedPulling="2026-03-20 07:44:40.548463876 +0000 UTC m=+3302.528338014" lastFinishedPulling="2026-03-20 07:44:43.024923678 +0000 UTC m=+3305.004797816" observedRunningTime="2026-03-20 07:44:43.599949016 +0000 UTC m=+3305.579823154" watchObservedRunningTime="2026-03-20 07:44:43.607243047 +0000 UTC m=+3305.587117195" Mar 20 07:44:44 crc kubenswrapper[4971]: I0320 07:44:44.591451 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6xr5" event={"ID":"440ff08e-8ef8-4f32-931f-36bd34d9a936","Type":"ContainerStarted","Data":"f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0"} Mar 20 07:44:45 crc kubenswrapper[4971]: I0320 07:44:45.604641 4971 generic.go:334] "Generic (PLEG): container finished" podID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerID="f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0" exitCode=0 Mar 20 07:44:45 crc kubenswrapper[4971]: I0320 07:44:45.604686 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6xr5" 
event={"ID":"440ff08e-8ef8-4f32-931f-36bd34d9a936","Type":"ContainerDied","Data":"f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0"} Mar 20 07:44:46 crc kubenswrapper[4971]: I0320 07:44:46.629400 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6xr5" event={"ID":"440ff08e-8ef8-4f32-931f-36bd34d9a936","Type":"ContainerStarted","Data":"668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46"} Mar 20 07:44:49 crc kubenswrapper[4971]: I0320 07:44:49.747652 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:49 crc kubenswrapper[4971]: I0320 07:44:49.748559 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:49 crc kubenswrapper[4971]: I0320 07:44:49.797657 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:49 crc kubenswrapper[4971]: I0320 07:44:49.837231 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b6xr5" podStartSLOduration=5.310670036 podStartE2EDuration="7.837190656s" podCreationTimestamp="2026-03-20 07:44:42 +0000 UTC" firstStartedPulling="2026-03-20 07:44:43.582724566 +0000 UTC m=+3305.562598704" lastFinishedPulling="2026-03-20 07:44:46.109245186 +0000 UTC m=+3308.089119324" observedRunningTime="2026-03-20 07:44:46.659296962 +0000 UTC m=+3308.639171110" watchObservedRunningTime="2026-03-20 07:44:49.837190656 +0000 UTC m=+3311.817064834" Mar 20 07:44:50 crc kubenswrapper[4971]: I0320 07:44:50.750600 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:50 crc kubenswrapper[4971]: I0320 07:44:50.810997 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-2kjvp"] Mar 20 07:44:52 crc kubenswrapper[4971]: I0320 07:44:52.473990 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:52 crc kubenswrapper[4971]: I0320 07:44:52.474334 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:44:52 crc kubenswrapper[4971]: I0320 07:44:52.689799 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2kjvp" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerName="registry-server" containerID="cri-o://ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3" gracePeriod=2 Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.201270 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.348661 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-catalog-content\") pod \"b32e59f7-9098-4b36-bd54-793975f9d73a\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.348752 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-utilities\") pod \"b32e59f7-9098-4b36-bd54-793975f9d73a\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.348816 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2wts\" (UniqueName: \"kubernetes.io/projected/b32e59f7-9098-4b36-bd54-793975f9d73a-kube-api-access-j2wts\") pod 
\"b32e59f7-9098-4b36-bd54-793975f9d73a\" (UID: \"b32e59f7-9098-4b36-bd54-793975f9d73a\") " Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.350021 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-utilities" (OuterVolumeSpecName: "utilities") pod "b32e59f7-9098-4b36-bd54-793975f9d73a" (UID: "b32e59f7-9098-4b36-bd54-793975f9d73a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.366009 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32e59f7-9098-4b36-bd54-793975f9d73a-kube-api-access-j2wts" (OuterVolumeSpecName: "kube-api-access-j2wts") pod "b32e59f7-9098-4b36-bd54-793975f9d73a" (UID: "b32e59f7-9098-4b36-bd54-793975f9d73a"). InnerVolumeSpecName "kube-api-access-j2wts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.429324 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b32e59f7-9098-4b36-bd54-793975f9d73a" (UID: "b32e59f7-9098-4b36-bd54-793975f9d73a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.450070 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2wts\" (UniqueName: \"kubernetes.io/projected/b32e59f7-9098-4b36-bd54-793975f9d73a-kube-api-access-j2wts\") on node \"crc\" DevicePath \"\"" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.450108 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.450158 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32e59f7-9098-4b36-bd54-793975f9d73a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.525323 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b6xr5" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="registry-server" probeResult="failure" output=< Mar 20 07:44:53 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 07:44:53 crc kubenswrapper[4971]: > Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.700637 4971 generic.go:334] "Generic (PLEG): container finished" podID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerID="ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3" exitCode=0 Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.700701 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kjvp" event={"ID":"b32e59f7-9098-4b36-bd54-793975f9d73a","Type":"ContainerDied","Data":"ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3"} Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.700747 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-2kjvp" event={"ID":"b32e59f7-9098-4b36-bd54-793975f9d73a","Type":"ContainerDied","Data":"e5fb22f7f08b535bb6fcc704ffe49d068e7b00edae085174bfd1d78d06936d0b"} Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.700758 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kjvp" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.700773 4971 scope.go:117] "RemoveContainer" containerID="ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.728334 4971 scope.go:117] "RemoveContainer" containerID="25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.757040 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2kjvp"] Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.758471 4971 scope.go:117] "RemoveContainer" containerID="eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.770598 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2kjvp"] Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.787726 4971 scope.go:117] "RemoveContainer" containerID="ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3" Mar 20 07:44:53 crc kubenswrapper[4971]: E0320 07:44:53.788270 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3\": container with ID starting with ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3 not found: ID does not exist" containerID="ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.788344 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3"} err="failed to get container status \"ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3\": rpc error: code = NotFound desc = could not find container \"ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3\": container with ID starting with ab85fc249920bd303577c8c83101adc53e3d511f3bb2db4033c25262f1f155a3 not found: ID does not exist" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.788389 4971 scope.go:117] "RemoveContainer" containerID="25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390" Mar 20 07:44:53 crc kubenswrapper[4971]: E0320 07:44:53.789060 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390\": container with ID starting with 25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390 not found: ID does not exist" containerID="25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.789108 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390"} err="failed to get container status \"25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390\": rpc error: code = NotFound desc = could not find container \"25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390\": container with ID starting with 25a812791988bc29b331c685d6ac046c5bac84f80757b85955475e4ad383b390 not found: ID does not exist" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.789135 4971 scope.go:117] "RemoveContainer" containerID="eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88" Mar 20 07:44:53 crc kubenswrapper[4971]: E0320 
07:44:53.789555 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88\": container with ID starting with eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88 not found: ID does not exist" containerID="eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88" Mar 20 07:44:53 crc kubenswrapper[4971]: I0320 07:44:53.789670 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88"} err="failed to get container status \"eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88\": rpc error: code = NotFound desc = could not find container \"eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88\": container with ID starting with eb1c4229db6651b40813e2a7b4c28dc7d808287ccb03809f5603659ac400cf88 not found: ID does not exist" Mar 20 07:44:54 crc kubenswrapper[4971]: I0320 07:44:54.742653 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" path="/var/lib/kubelet/pods/b32e59f7-9098-4b36-bd54-793975f9d73a/volumes" Mar 20 07:44:55 crc kubenswrapper[4971]: I0320 07:44:55.416571 4971 scope.go:117] "RemoveContainer" containerID="1a9e9869e3bd317cdb8512ff559deeef4e0e83fb6b1da3fb62f9a246fb8d92e0" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.235732 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq"] Mar 20 07:45:00 crc kubenswrapper[4971]: E0320 07:45:00.236076 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerName="extract-utilities" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.236090 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" 
containerName="extract-utilities" Mar 20 07:45:00 crc kubenswrapper[4971]: E0320 07:45:00.236119 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerName="registry-server" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.236128 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerName="registry-server" Mar 20 07:45:00 crc kubenswrapper[4971]: E0320 07:45:00.236148 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerName="extract-content" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.236156 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerName="extract-content" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.236324 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32e59f7-9098-4b36-bd54-793975f9d73a" containerName="registry-server" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.236891 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.239162 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.239248 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.248767 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq"] Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.367567 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkr7f\" (UniqueName: \"kubernetes.io/projected/894a3a5d-c688-41ea-8b99-578fde0702d5-kube-api-access-mkr7f\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.367631 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894a3a5d-c688-41ea-8b99-578fde0702d5-config-volume\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.367685 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894a3a5d-c688-41ea-8b99-578fde0702d5-secret-volume\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.469268 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894a3a5d-c688-41ea-8b99-578fde0702d5-config-volume\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.469356 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894a3a5d-c688-41ea-8b99-578fde0702d5-secret-volume\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.469423 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkr7f\" (UniqueName: \"kubernetes.io/projected/894a3a5d-c688-41ea-8b99-578fde0702d5-kube-api-access-mkr7f\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.470196 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894a3a5d-c688-41ea-8b99-578fde0702d5-config-volume\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.476262 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/894a3a5d-c688-41ea-8b99-578fde0702d5-secret-volume\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.489882 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkr7f\" (UniqueName: \"kubernetes.io/projected/894a3a5d-c688-41ea-8b99-578fde0702d5-kube-api-access-mkr7f\") pod \"collect-profiles-29566545-8rrzq\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:00 crc kubenswrapper[4971]: I0320 07:45:00.566375 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:01 crc kubenswrapper[4971]: I0320 07:45:01.037864 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq"] Mar 20 07:45:01 crc kubenswrapper[4971]: W0320 07:45:01.047541 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod894a3a5d_c688_41ea_8b99_578fde0702d5.slice/crio-5ebfd064571b54bf0d5e5b5f5c619ab6cf32f382caa390751b470b4073aa6387 WatchSource:0}: Error finding container 5ebfd064571b54bf0d5e5b5f5c619ab6cf32f382caa390751b470b4073aa6387: Status 404 returned error can't find the container with id 5ebfd064571b54bf0d5e5b5f5c619ab6cf32f382caa390751b470b4073aa6387 Mar 20 07:45:01 crc kubenswrapper[4971]: I0320 07:45:01.795073 4971 generic.go:334] "Generic (PLEG): container finished" podID="894a3a5d-c688-41ea-8b99-578fde0702d5" containerID="9c40445e2e759b79bd7e1ad9c220e706a67500424068e12b695171e762e58b44" exitCode=0 Mar 20 07:45:01 crc kubenswrapper[4971]: I0320 07:45:01.795145 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" event={"ID":"894a3a5d-c688-41ea-8b99-578fde0702d5","Type":"ContainerDied","Data":"9c40445e2e759b79bd7e1ad9c220e706a67500424068e12b695171e762e58b44"} Mar 20 07:45:01 crc kubenswrapper[4971]: I0320 07:45:01.795181 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" event={"ID":"894a3a5d-c688-41ea-8b99-578fde0702d5","Type":"ContainerStarted","Data":"5ebfd064571b54bf0d5e5b5f5c619ab6cf32f382caa390751b470b4073aa6387"} Mar 20 07:45:02 crc kubenswrapper[4971]: I0320 07:45:02.555107 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:45:02 crc kubenswrapper[4971]: I0320 07:45:02.641873 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:45:02 crc kubenswrapper[4971]: I0320 07:45:02.811236 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6xr5"] Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.093298 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.216234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkr7f\" (UniqueName: \"kubernetes.io/projected/894a3a5d-c688-41ea-8b99-578fde0702d5-kube-api-access-mkr7f\") pod \"894a3a5d-c688-41ea-8b99-578fde0702d5\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.216488 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894a3a5d-c688-41ea-8b99-578fde0702d5-secret-volume\") pod \"894a3a5d-c688-41ea-8b99-578fde0702d5\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.216740 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894a3a5d-c688-41ea-8b99-578fde0702d5-config-volume\") pod \"894a3a5d-c688-41ea-8b99-578fde0702d5\" (UID: \"894a3a5d-c688-41ea-8b99-578fde0702d5\") " Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.217909 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894a3a5d-c688-41ea-8b99-578fde0702d5-config-volume" (OuterVolumeSpecName: "config-volume") pod "894a3a5d-c688-41ea-8b99-578fde0702d5" (UID: "894a3a5d-c688-41ea-8b99-578fde0702d5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.223065 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894a3a5d-c688-41ea-8b99-578fde0702d5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "894a3a5d-c688-41ea-8b99-578fde0702d5" (UID: "894a3a5d-c688-41ea-8b99-578fde0702d5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.223123 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894a3a5d-c688-41ea-8b99-578fde0702d5-kube-api-access-mkr7f" (OuterVolumeSpecName: "kube-api-access-mkr7f") pod "894a3a5d-c688-41ea-8b99-578fde0702d5" (UID: "894a3a5d-c688-41ea-8b99-578fde0702d5"). InnerVolumeSpecName "kube-api-access-mkr7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.319365 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/894a3a5d-c688-41ea-8b99-578fde0702d5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.319417 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/894a3a5d-c688-41ea-8b99-578fde0702d5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.319432 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkr7f\" (UniqueName: \"kubernetes.io/projected/894a3a5d-c688-41ea-8b99-578fde0702d5-kube-api-access-mkr7f\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.818970 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" event={"ID":"894a3a5d-c688-41ea-8b99-578fde0702d5","Type":"ContainerDied","Data":"5ebfd064571b54bf0d5e5b5f5c619ab6cf32f382caa390751b470b4073aa6387"} Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.819011 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ebfd064571b54bf0d5e5b5f5c619ab6cf32f382caa390751b470b4073aa6387" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.819006 4971 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq" Mar 20 07:45:03 crc kubenswrapper[4971]: I0320 07:45:03.819196 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b6xr5" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="registry-server" containerID="cri-o://668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46" gracePeriod=2 Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.179309 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn"] Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.183384 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566500-lqrrn"] Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.276594 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.443940 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-catalog-content\") pod \"440ff08e-8ef8-4f32-931f-36bd34d9a936\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.443996 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v2xp\" (UniqueName: \"kubernetes.io/projected/440ff08e-8ef8-4f32-931f-36bd34d9a936-kube-api-access-9v2xp\") pod \"440ff08e-8ef8-4f32-931f-36bd34d9a936\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.444031 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-utilities\") pod \"440ff08e-8ef8-4f32-931f-36bd34d9a936\" (UID: \"440ff08e-8ef8-4f32-931f-36bd34d9a936\") " Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.446184 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-utilities" (OuterVolumeSpecName: "utilities") pod "440ff08e-8ef8-4f32-931f-36bd34d9a936" (UID: "440ff08e-8ef8-4f32-931f-36bd34d9a936"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.452850 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440ff08e-8ef8-4f32-931f-36bd34d9a936-kube-api-access-9v2xp" (OuterVolumeSpecName: "kube-api-access-9v2xp") pod "440ff08e-8ef8-4f32-931f-36bd34d9a936" (UID: "440ff08e-8ef8-4f32-931f-36bd34d9a936"). InnerVolumeSpecName "kube-api-access-9v2xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.545544 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v2xp\" (UniqueName: \"kubernetes.io/projected/440ff08e-8ef8-4f32-931f-36bd34d9a936-kube-api-access-9v2xp\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.545629 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.643889 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "440ff08e-8ef8-4f32-931f-36bd34d9a936" (UID: "440ff08e-8ef8-4f32-931f-36bd34d9a936"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.646985 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440ff08e-8ef8-4f32-931f-36bd34d9a936-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.745391 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da4e6ff-eae8-421a-81e8-694388cbb4d3" path="/var/lib/kubelet/pods/6da4e6ff-eae8-421a-81e8-694388cbb4d3/volumes" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.831757 4971 generic.go:334] "Generic (PLEG): container finished" podID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerID="668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46" exitCode=0 Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.831815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6xr5" event={"ID":"440ff08e-8ef8-4f32-931f-36bd34d9a936","Type":"ContainerDied","Data":"668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46"} Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.831857 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6xr5" event={"ID":"440ff08e-8ef8-4f32-931f-36bd34d9a936","Type":"ContainerDied","Data":"af07d2cb22380544766ec93083e2a20210cae80eb229c88dd999fb3b361f8693"} Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.831855 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b6xr5" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.831877 4971 scope.go:117] "RemoveContainer" containerID="668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.853087 4971 scope.go:117] "RemoveContainer" containerID="f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.855838 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6xr5"] Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.860720 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b6xr5"] Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.882415 4971 scope.go:117] "RemoveContainer" containerID="c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.912248 4971 scope.go:117] "RemoveContainer" containerID="668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46" Mar 20 07:45:04 crc kubenswrapper[4971]: E0320 07:45:04.912915 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46\": container with ID starting with 668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46 not found: ID does not exist" containerID="668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.912951 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46"} err="failed to get container status \"668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46\": rpc error: code = NotFound desc = could not find container 
\"668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46\": container with ID starting with 668b8881301c2138150435ade26af079505ebdb5d23e638a0ad7084123417f46 not found: ID does not exist" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.912974 4971 scope.go:117] "RemoveContainer" containerID="f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0" Mar 20 07:45:04 crc kubenswrapper[4971]: E0320 07:45:04.923917 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0\": container with ID starting with f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0 not found: ID does not exist" containerID="f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.923956 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0"} err="failed to get container status \"f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0\": rpc error: code = NotFound desc = could not find container \"f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0\": container with ID starting with f07941cf712d0a09bd239c44d352fcbc9592b6e6d89403a0c2828676c22922b0 not found: ID does not exist" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.923975 4971 scope.go:117] "RemoveContainer" containerID="c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6" Mar 20 07:45:04 crc kubenswrapper[4971]: E0320 07:45:04.926508 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6\": container with ID starting with c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6 not found: ID does not exist" 
containerID="c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6" Mar 20 07:45:04 crc kubenswrapper[4971]: I0320 07:45:04.926577 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6"} err="failed to get container status \"c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6\": rpc error: code = NotFound desc = could not find container \"c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6\": container with ID starting with c73d537b198c1c212e6e9ecfed6436ed54f69e37fad13e0a3bdd3501842720f6 not found: ID does not exist" Mar 20 07:45:06 crc kubenswrapper[4971]: I0320 07:45:06.739643 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" path="/var/lib/kubelet/pods/440ff08e-8ef8-4f32-931f-36bd34d9a936/volumes" Mar 20 07:45:20 crc kubenswrapper[4971]: I0320 07:45:20.162519 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:45:20 crc kubenswrapper[4971]: I0320 07:45:20.162868 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:45:50 crc kubenswrapper[4971]: I0320 07:45:50.162845 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 20 07:45:50 crc kubenswrapper[4971]: I0320 07:45:50.164022 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:45:55 crc kubenswrapper[4971]: I0320 07:45:55.513329 4971 scope.go:117] "RemoveContainer" containerID="54c16493493392922f01a08812121dfe702a5cd7ce82156b39195a9f80a7e0bf" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.168146 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566546-jwq59"] Mar 20 07:46:00 crc kubenswrapper[4971]: E0320 07:46:00.169278 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="extract-content" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.169302 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="extract-content" Mar 20 07:46:00 crc kubenswrapper[4971]: E0320 07:46:00.169330 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="extract-utilities" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.169346 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="extract-utilities" Mar 20 07:46:00 crc kubenswrapper[4971]: E0320 07:46:00.169365 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="registry-server" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.169378 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="registry-server" Mar 20 07:46:00 crc 
kubenswrapper[4971]: E0320 07:46:00.169402 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894a3a5d-c688-41ea-8b99-578fde0702d5" containerName="collect-profiles" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.169414 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="894a3a5d-c688-41ea-8b99-578fde0702d5" containerName="collect-profiles" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.169887 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="894a3a5d-c688-41ea-8b99-578fde0702d5" containerName="collect-profiles" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.169924 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="440ff08e-8ef8-4f32-931f-36bd34d9a936" containerName="registry-server" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.170781 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566546-jwq59" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.174177 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.177590 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.177992 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.187355 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566546-jwq59"] Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.254128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cfxp\" (UniqueName: \"kubernetes.io/projected/0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f-kube-api-access-4cfxp\") pod 
\"auto-csr-approver-29566546-jwq59\" (UID: \"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f\") " pod="openshift-infra/auto-csr-approver-29566546-jwq59" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.356436 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cfxp\" (UniqueName: \"kubernetes.io/projected/0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f-kube-api-access-4cfxp\") pod \"auto-csr-approver-29566546-jwq59\" (UID: \"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f\") " pod="openshift-infra/auto-csr-approver-29566546-jwq59" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.394840 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cfxp\" (UniqueName: \"kubernetes.io/projected/0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f-kube-api-access-4cfxp\") pod \"auto-csr-approver-29566546-jwq59\" (UID: \"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f\") " pod="openshift-infra/auto-csr-approver-29566546-jwq59" Mar 20 07:46:00 crc kubenswrapper[4971]: I0320 07:46:00.515522 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566546-jwq59" Mar 20 07:46:01 crc kubenswrapper[4971]: I0320 07:46:01.021084 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566546-jwq59"] Mar 20 07:46:01 crc kubenswrapper[4971]: I0320 07:46:01.033201 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:46:01 crc kubenswrapper[4971]: I0320 07:46:01.408977 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566546-jwq59" event={"ID":"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f","Type":"ContainerStarted","Data":"28364d652abe7d909a970d0724de820f8488bf6bfd746c05fb41bc1c15fe564e"} Mar 20 07:46:02 crc kubenswrapper[4971]: I0320 07:46:02.418410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566546-jwq59" event={"ID":"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f","Type":"ContainerStarted","Data":"760433e90992d9ddfe656fbb318be7aeb6a3c29c555f6b9b319a75075391e3c9"} Mar 20 07:46:02 crc kubenswrapper[4971]: I0320 07:46:02.435823 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566546-jwq59" podStartSLOduration=1.560305342 podStartE2EDuration="2.435804274s" podCreationTimestamp="2026-03-20 07:46:00 +0000 UTC" firstStartedPulling="2026-03-20 07:46:01.032835407 +0000 UTC m=+3383.012709565" lastFinishedPulling="2026-03-20 07:46:01.908334369 +0000 UTC m=+3383.888208497" observedRunningTime="2026-03-20 07:46:02.43217927 +0000 UTC m=+3384.412053408" watchObservedRunningTime="2026-03-20 07:46:02.435804274 +0000 UTC m=+3384.415678412" Mar 20 07:46:03 crc kubenswrapper[4971]: I0320 07:46:03.430687 4971 generic.go:334] "Generic (PLEG): container finished" podID="0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f" containerID="760433e90992d9ddfe656fbb318be7aeb6a3c29c555f6b9b319a75075391e3c9" exitCode=0 Mar 20 07:46:03 crc kubenswrapper[4971]: 
I0320 07:46:03.430815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566546-jwq59" event={"ID":"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f","Type":"ContainerDied","Data":"760433e90992d9ddfe656fbb318be7aeb6a3c29c555f6b9b319a75075391e3c9"} Mar 20 07:46:04 crc kubenswrapper[4971]: I0320 07:46:04.777251 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566546-jwq59" Mar 20 07:46:04 crc kubenswrapper[4971]: I0320 07:46:04.827532 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cfxp\" (UniqueName: \"kubernetes.io/projected/0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f-kube-api-access-4cfxp\") pod \"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f\" (UID: \"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f\") " Mar 20 07:46:04 crc kubenswrapper[4971]: I0320 07:46:04.840635 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f-kube-api-access-4cfxp" (OuterVolumeSpecName: "kube-api-access-4cfxp") pod "0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f" (UID: "0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f"). InnerVolumeSpecName "kube-api-access-4cfxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:46:04 crc kubenswrapper[4971]: I0320 07:46:04.929321 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cfxp\" (UniqueName: \"kubernetes.io/projected/0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f-kube-api-access-4cfxp\") on node \"crc\" DevicePath \"\"" Mar 20 07:46:05 crc kubenswrapper[4971]: I0320 07:46:05.449644 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566546-jwq59" event={"ID":"0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f","Type":"ContainerDied","Data":"28364d652abe7d909a970d0724de820f8488bf6bfd746c05fb41bc1c15fe564e"} Mar 20 07:46:05 crc kubenswrapper[4971]: I0320 07:46:05.449720 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28364d652abe7d909a970d0724de820f8488bf6bfd746c05fb41bc1c15fe564e" Mar 20 07:46:05 crc kubenswrapper[4971]: I0320 07:46:05.449745 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566546-jwq59" Mar 20 07:46:05 crc kubenswrapper[4971]: I0320 07:46:05.511798 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566540-frh76"] Mar 20 07:46:05 crc kubenswrapper[4971]: I0320 07:46:05.518810 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566540-frh76"] Mar 20 07:46:06 crc kubenswrapper[4971]: I0320 07:46:06.750918 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1647ca10-1c76-45f9-82ad-4b89294b13f5" path="/var/lib/kubelet/pods/1647ca10-1c76-45f9-82ad-4b89294b13f5/volumes" Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.162349 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.163156 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.163242 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.164420 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b0eb5d686da2e08fdc8fe5d8bad3b653b6ff89379afb776c56d2edbf4453130"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.164551 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://9b0eb5d686da2e08fdc8fe5d8bad3b653b6ff89379afb776c56d2edbf4453130" gracePeriod=600 Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.581207 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="9b0eb5d686da2e08fdc8fe5d8bad3b653b6ff89379afb776c56d2edbf4453130" exitCode=0 Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.581289 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"9b0eb5d686da2e08fdc8fe5d8bad3b653b6ff89379afb776c56d2edbf4453130"} Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.581568 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"} Mar 20 07:46:20 crc kubenswrapper[4971]: I0320 07:46:20.581589 4971 scope.go:117] "RemoveContainer" containerID="01da59be6249845923e82657099ea463d3e7161471e1e125f547ba6a1aa77809" Mar 20 07:46:55 crc kubenswrapper[4971]: I0320 07:46:55.624948 4971 scope.go:117] "RemoveContainer" containerID="5fffa80226cf7c07e20384152a482dc061e11ae8a2088c022e9f51a8d5f4a519" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.158098 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566548-nzsf8"] Mar 20 07:48:00 crc kubenswrapper[4971]: E0320 07:48:00.159326 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f" containerName="oc" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.159356 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f" containerName="oc" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.159725 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f" containerName="oc" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.160820 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566548-nzsf8" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.163843 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.165014 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.167280 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.171913 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566548-nzsf8"] Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.248091 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqgs4\" (UniqueName: \"kubernetes.io/projected/fc72c18d-9df1-455f-94a7-fff6a10ccb21-kube-api-access-vqgs4\") pod \"auto-csr-approver-29566548-nzsf8\" (UID: \"fc72c18d-9df1-455f-94a7-fff6a10ccb21\") " pod="openshift-infra/auto-csr-approver-29566548-nzsf8" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.349144 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqgs4\" (UniqueName: \"kubernetes.io/projected/fc72c18d-9df1-455f-94a7-fff6a10ccb21-kube-api-access-vqgs4\") pod \"auto-csr-approver-29566548-nzsf8\" (UID: \"fc72c18d-9df1-455f-94a7-fff6a10ccb21\") " pod="openshift-infra/auto-csr-approver-29566548-nzsf8" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.376162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqgs4\" (UniqueName: \"kubernetes.io/projected/fc72c18d-9df1-455f-94a7-fff6a10ccb21-kube-api-access-vqgs4\") pod \"auto-csr-approver-29566548-nzsf8\" (UID: \"fc72c18d-9df1-455f-94a7-fff6a10ccb21\") " 
pod="openshift-infra/auto-csr-approver-29566548-nzsf8" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.492100 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566548-nzsf8" Mar 20 07:48:00 crc kubenswrapper[4971]: I0320 07:48:00.954905 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566548-nzsf8"] Mar 20 07:48:00 crc kubenswrapper[4971]: W0320 07:48:00.959284 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc72c18d_9df1_455f_94a7_fff6a10ccb21.slice/crio-383b98a1c41fa9300ca62a87ada17fb6f8baecdebfa424480c84ad94fe0cdb1a WatchSource:0}: Error finding container 383b98a1c41fa9300ca62a87ada17fb6f8baecdebfa424480c84ad94fe0cdb1a: Status 404 returned error can't find the container with id 383b98a1c41fa9300ca62a87ada17fb6f8baecdebfa424480c84ad94fe0cdb1a Mar 20 07:48:01 crc kubenswrapper[4971]: I0320 07:48:01.532002 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566548-nzsf8" event={"ID":"fc72c18d-9df1-455f-94a7-fff6a10ccb21","Type":"ContainerStarted","Data":"383b98a1c41fa9300ca62a87ada17fb6f8baecdebfa424480c84ad94fe0cdb1a"} Mar 20 07:48:02 crc kubenswrapper[4971]: I0320 07:48:02.541781 4971 generic.go:334] "Generic (PLEG): container finished" podID="fc72c18d-9df1-455f-94a7-fff6a10ccb21" containerID="10250ebdde278ab84a6b88e597b253ce0a54f7eb624e62fc04611859042a874a" exitCode=0 Mar 20 07:48:02 crc kubenswrapper[4971]: I0320 07:48:02.542779 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566548-nzsf8" event={"ID":"fc72c18d-9df1-455f-94a7-fff6a10ccb21","Type":"ContainerDied","Data":"10250ebdde278ab84a6b88e597b253ce0a54f7eb624e62fc04611859042a874a"} Mar 20 07:48:03 crc kubenswrapper[4971]: I0320 07:48:03.915665 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566548-nzsf8" Mar 20 07:48:04 crc kubenswrapper[4971]: I0320 07:48:04.004459 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqgs4\" (UniqueName: \"kubernetes.io/projected/fc72c18d-9df1-455f-94a7-fff6a10ccb21-kube-api-access-vqgs4\") pod \"fc72c18d-9df1-455f-94a7-fff6a10ccb21\" (UID: \"fc72c18d-9df1-455f-94a7-fff6a10ccb21\") " Mar 20 07:48:04 crc kubenswrapper[4971]: I0320 07:48:04.010059 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc72c18d-9df1-455f-94a7-fff6a10ccb21-kube-api-access-vqgs4" (OuterVolumeSpecName: "kube-api-access-vqgs4") pod "fc72c18d-9df1-455f-94a7-fff6a10ccb21" (UID: "fc72c18d-9df1-455f-94a7-fff6a10ccb21"). InnerVolumeSpecName "kube-api-access-vqgs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:48:04 crc kubenswrapper[4971]: I0320 07:48:04.106318 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqgs4\" (UniqueName: \"kubernetes.io/projected/fc72c18d-9df1-455f-94a7-fff6a10ccb21-kube-api-access-vqgs4\") on node \"crc\" DevicePath \"\"" Mar 20 07:48:04 crc kubenswrapper[4971]: I0320 07:48:04.563396 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566548-nzsf8" event={"ID":"fc72c18d-9df1-455f-94a7-fff6a10ccb21","Type":"ContainerDied","Data":"383b98a1c41fa9300ca62a87ada17fb6f8baecdebfa424480c84ad94fe0cdb1a"} Mar 20 07:48:04 crc kubenswrapper[4971]: I0320 07:48:04.563460 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566548-nzsf8" Mar 20 07:48:04 crc kubenswrapper[4971]: I0320 07:48:04.563463 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="383b98a1c41fa9300ca62a87ada17fb6f8baecdebfa424480c84ad94fe0cdb1a" Mar 20 07:48:05 crc kubenswrapper[4971]: I0320 07:48:05.005022 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566542-z5c9s"] Mar 20 07:48:05 crc kubenswrapper[4971]: I0320 07:48:05.016391 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566542-z5c9s"] Mar 20 07:48:06 crc kubenswrapper[4971]: I0320 07:48:06.744291 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092750ba-9445-418f-ba6a-f4d4456593e4" path="/var/lib/kubelet/pods/092750ba-9445-418f-ba6a-f4d4456593e4/volumes" Mar 20 07:48:20 crc kubenswrapper[4971]: I0320 07:48:20.162900 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:48:20 crc kubenswrapper[4971]: I0320 07:48:20.163644 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:48:50 crc kubenswrapper[4971]: I0320 07:48:50.162471 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Mar 20 07:48:50 crc kubenswrapper[4971]: I0320 07:48:50.163217 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:48:55 crc kubenswrapper[4971]: I0320 07:48:55.743391 4971 scope.go:117] "RemoveContainer" containerID="79b43a743e05a6dd05c236f045b0497dae8384dc655396e5e94b8a847b2926d3" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.288542 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2lx7z"] Mar 20 07:49:09 crc kubenswrapper[4971]: E0320 07:49:09.290852 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc72c18d-9df1-455f-94a7-fff6a10ccb21" containerName="oc" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.290962 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc72c18d-9df1-455f-94a7-fff6a10ccb21" containerName="oc" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.291262 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc72c18d-9df1-455f-94a7-fff6a10ccb21" containerName="oc" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.292704 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.314510 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lx7z"] Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.397140 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-catalog-content\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.397194 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-utilities\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.397347 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lnkb\" (UniqueName: \"kubernetes.io/projected/c71d2868-b22e-4586-b811-223083cd4e9d-kube-api-access-7lnkb\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.498582 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lnkb\" (UniqueName: \"kubernetes.io/projected/c71d2868-b22e-4586-b811-223083cd4e9d-kube-api-access-7lnkb\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.498890 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-catalog-content\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.499037 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-utilities\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.499497 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-catalog-content\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.499517 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-utilities\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.517451 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lnkb\" (UniqueName: \"kubernetes.io/projected/c71d2868-b22e-4586-b811-223083cd4e9d-kube-api-access-7lnkb\") pod \"redhat-marketplace-2lx7z\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:09 crc kubenswrapper[4971]: I0320 07:49:09.667578 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:10 crc kubenswrapper[4971]: I0320 07:49:10.114753 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lx7z"] Mar 20 07:49:10 crc kubenswrapper[4971]: I0320 07:49:10.497312 4971 generic.go:334] "Generic (PLEG): container finished" podID="c71d2868-b22e-4586-b811-223083cd4e9d" containerID="b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042" exitCode=0 Mar 20 07:49:10 crc kubenswrapper[4971]: I0320 07:49:10.497449 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lx7z" event={"ID":"c71d2868-b22e-4586-b811-223083cd4e9d","Type":"ContainerDied","Data":"b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042"} Mar 20 07:49:10 crc kubenswrapper[4971]: I0320 07:49:10.498906 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lx7z" event={"ID":"c71d2868-b22e-4586-b811-223083cd4e9d","Type":"ContainerStarted","Data":"d54be4271cb4d5ff2f5bef5f175c7de025870d32a24e1a8259789cebd9a83636"} Mar 20 07:49:11 crc kubenswrapper[4971]: I0320 07:49:11.509794 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lx7z" event={"ID":"c71d2868-b22e-4586-b811-223083cd4e9d","Type":"ContainerStarted","Data":"cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81"} Mar 20 07:49:12 crc kubenswrapper[4971]: I0320 07:49:12.519711 4971 generic.go:334] "Generic (PLEG): container finished" podID="c71d2868-b22e-4586-b811-223083cd4e9d" containerID="cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81" exitCode=0 Mar 20 07:49:12 crc kubenswrapper[4971]: I0320 07:49:12.519997 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lx7z" 
event={"ID":"c71d2868-b22e-4586-b811-223083cd4e9d","Type":"ContainerDied","Data":"cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81"} Mar 20 07:49:13 crc kubenswrapper[4971]: I0320 07:49:13.531873 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lx7z" event={"ID":"c71d2868-b22e-4586-b811-223083cd4e9d","Type":"ContainerStarted","Data":"2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7"} Mar 20 07:49:13 crc kubenswrapper[4971]: I0320 07:49:13.563083 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2lx7z" podStartSLOduration=2.132396204 podStartE2EDuration="4.563055409s" podCreationTimestamp="2026-03-20 07:49:09 +0000 UTC" firstStartedPulling="2026-03-20 07:49:10.499476453 +0000 UTC m=+3572.479350601" lastFinishedPulling="2026-03-20 07:49:12.930135648 +0000 UTC m=+3574.910009806" observedRunningTime="2026-03-20 07:49:13.555972894 +0000 UTC m=+3575.535847032" watchObservedRunningTime="2026-03-20 07:49:13.563055409 +0000 UTC m=+3575.542929587" Mar 20 07:49:19 crc kubenswrapper[4971]: I0320 07:49:19.668596 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:19 crc kubenswrapper[4971]: I0320 07:49:19.669201 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:19 crc kubenswrapper[4971]: I0320 07:49:19.745281 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.162424 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.162497 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.162550 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.163257 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.163354 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" gracePeriod=600 Mar 20 07:49:20 crc kubenswrapper[4971]: E0320 07:49:20.321633 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:49:20 crc kubenswrapper[4971]: 
I0320 07:49:20.587523 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" exitCode=0 Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.587618 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"} Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.587668 4971 scope.go:117] "RemoveContainer" containerID="9b0eb5d686da2e08fdc8fe5d8bad3b653b6ff89379afb776c56d2edbf4453130" Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.588176 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:49:20 crc kubenswrapper[4971]: E0320 07:49:20.588375 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.709661 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:20 crc kubenswrapper[4971]: I0320 07:49:20.768126 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lx7z"] Mar 20 07:49:22 crc kubenswrapper[4971]: I0320 07:49:22.603756 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2lx7z" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" 
containerName="registry-server" containerID="cri-o://2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7" gracePeriod=2 Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.030581 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.098243 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-utilities\") pod \"c71d2868-b22e-4586-b811-223083cd4e9d\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.098322 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-catalog-content\") pod \"c71d2868-b22e-4586-b811-223083cd4e9d\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.098367 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lnkb\" (UniqueName: \"kubernetes.io/projected/c71d2868-b22e-4586-b811-223083cd4e9d-kube-api-access-7lnkb\") pod \"c71d2868-b22e-4586-b811-223083cd4e9d\" (UID: \"c71d2868-b22e-4586-b811-223083cd4e9d\") " Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.101175 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-utilities" (OuterVolumeSpecName: "utilities") pod "c71d2868-b22e-4586-b811-223083cd4e9d" (UID: "c71d2868-b22e-4586-b811-223083cd4e9d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.111955 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71d2868-b22e-4586-b811-223083cd4e9d-kube-api-access-7lnkb" (OuterVolumeSpecName: "kube-api-access-7lnkb") pod "c71d2868-b22e-4586-b811-223083cd4e9d" (UID: "c71d2868-b22e-4586-b811-223083cd4e9d"). InnerVolumeSpecName "kube-api-access-7lnkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.152074 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c71d2868-b22e-4586-b811-223083cd4e9d" (UID: "c71d2868-b22e-4586-b811-223083cd4e9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.200370 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.200408 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c71d2868-b22e-4586-b811-223083cd4e9d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.200422 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lnkb\" (UniqueName: \"kubernetes.io/projected/c71d2868-b22e-4586-b811-223083cd4e9d-kube-api-access-7lnkb\") on node \"crc\" DevicePath \"\"" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.615635 4971 generic.go:334] "Generic (PLEG): container finished" podID="c71d2868-b22e-4586-b811-223083cd4e9d" 
containerID="2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7" exitCode=0 Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.615686 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2lx7z" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.615715 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lx7z" event={"ID":"c71d2868-b22e-4586-b811-223083cd4e9d","Type":"ContainerDied","Data":"2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7"} Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.615755 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lx7z" event={"ID":"c71d2868-b22e-4586-b811-223083cd4e9d","Type":"ContainerDied","Data":"d54be4271cb4d5ff2f5bef5f175c7de025870d32a24e1a8259789cebd9a83636"} Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.615786 4971 scope.go:117] "RemoveContainer" containerID="2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.652106 4971 scope.go:117] "RemoveContainer" containerID="cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.661238 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lx7z"] Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.669831 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lx7z"] Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.688935 4971 scope.go:117] "RemoveContainer" containerID="b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.724679 4971 scope.go:117] "RemoveContainer" containerID="2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7" Mar 20 
07:49:23 crc kubenswrapper[4971]: E0320 07:49:23.725589 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7\": container with ID starting with 2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7 not found: ID does not exist" containerID="2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.725671 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7"} err="failed to get container status \"2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7\": rpc error: code = NotFound desc = could not find container \"2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7\": container with ID starting with 2584ef6530ff9615a67cd014eac84b5c49eb36253c888c5f67361559cb3026a7 not found: ID does not exist" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.725711 4971 scope.go:117] "RemoveContainer" containerID="cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81" Mar 20 07:49:23 crc kubenswrapper[4971]: E0320 07:49:23.726815 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81\": container with ID starting with cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81 not found: ID does not exist" containerID="cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.726859 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81"} err="failed to get container status 
\"cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81\": rpc error: code = NotFound desc = could not find container \"cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81\": container with ID starting with cd67f18a486c314836f542ad2cf50c25a799ef70bf4336f4a25749a670b54b81 not found: ID does not exist" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.726890 4971 scope.go:117] "RemoveContainer" containerID="b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042" Mar 20 07:49:23 crc kubenswrapper[4971]: E0320 07:49:23.727303 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042\": container with ID starting with b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042 not found: ID does not exist" containerID="b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042" Mar 20 07:49:23 crc kubenswrapper[4971]: I0320 07:49:23.727349 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042"} err="failed to get container status \"b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042\": rpc error: code = NotFound desc = could not find container \"b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042\": container with ID starting with b80159c91b5d89a914a8aa9f7afc33d2de786f4efdb7c75b4f472b5b4e7ec042 not found: ID does not exist" Mar 20 07:49:24 crc kubenswrapper[4971]: I0320 07:49:24.748956 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" path="/var/lib/kubelet/pods/c71d2868-b22e-4586-b811-223083cd4e9d/volumes" Mar 20 07:49:31 crc kubenswrapper[4971]: I0320 07:49:31.732208 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 
07:49:31 crc kubenswrapper[4971]: E0320 07:49:31.733354 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:49:46 crc kubenswrapper[4971]: I0320 07:49:46.732808 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:49:46 crc kubenswrapper[4971]: E0320 07:49:46.733759 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:49:59 crc kubenswrapper[4971]: I0320 07:49:59.732369 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:49:59 crc kubenswrapper[4971]: E0320 07:49:59.733066 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.153163 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29566550-c7h6d"] Mar 20 07:50:00 crc kubenswrapper[4971]: E0320 07:50:00.153708 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" containerName="extract-content" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.153741 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" containerName="extract-content" Mar 20 07:50:00 crc kubenswrapper[4971]: E0320 07:50:00.153784 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" containerName="registry-server" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.153796 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" containerName="registry-server" Mar 20 07:50:00 crc kubenswrapper[4971]: E0320 07:50:00.153812 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" containerName="extract-utilities" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.153823 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" containerName="extract-utilities" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.154058 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71d2868-b22e-4586-b811-223083cd4e9d" containerName="registry-server" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.154943 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566550-c7h6d" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.158182 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.158177 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.165538 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566550-c7h6d"] Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.167132 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.217699 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zfgd\" (UniqueName: \"kubernetes.io/projected/ac972304-92a0-4afb-b1bf-2c859becff62-kube-api-access-7zfgd\") pod \"auto-csr-approver-29566550-c7h6d\" (UID: \"ac972304-92a0-4afb-b1bf-2c859becff62\") " pod="openshift-infra/auto-csr-approver-29566550-c7h6d" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.318900 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zfgd\" (UniqueName: \"kubernetes.io/projected/ac972304-92a0-4afb-b1bf-2c859becff62-kube-api-access-7zfgd\") pod \"auto-csr-approver-29566550-c7h6d\" (UID: \"ac972304-92a0-4afb-b1bf-2c859becff62\") " pod="openshift-infra/auto-csr-approver-29566550-c7h6d" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.340958 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zfgd\" (UniqueName: \"kubernetes.io/projected/ac972304-92a0-4afb-b1bf-2c859becff62-kube-api-access-7zfgd\") pod \"auto-csr-approver-29566550-c7h6d\" (UID: \"ac972304-92a0-4afb-b1bf-2c859becff62\") " 
pod="openshift-infra/auto-csr-approver-29566550-c7h6d" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.484512 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566550-c7h6d" Mar 20 07:50:00 crc kubenswrapper[4971]: I0320 07:50:00.975221 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566550-c7h6d"] Mar 20 07:50:01 crc kubenswrapper[4971]: I0320 07:50:01.931184 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566550-c7h6d" event={"ID":"ac972304-92a0-4afb-b1bf-2c859becff62","Type":"ContainerStarted","Data":"f5ec5fc4506e9f6923f1601bc07ab905d01f25e467e1210b6dab0a7fa5319a8c"} Mar 20 07:50:02 crc kubenswrapper[4971]: I0320 07:50:02.942038 4971 generic.go:334] "Generic (PLEG): container finished" podID="ac972304-92a0-4afb-b1bf-2c859becff62" containerID="a73a29342b7da3702d3ab797f62e043ae6e127163d853f13f0d354296b6fcff9" exitCode=0 Mar 20 07:50:02 crc kubenswrapper[4971]: I0320 07:50:02.942155 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566550-c7h6d" event={"ID":"ac972304-92a0-4afb-b1bf-2c859becff62","Type":"ContainerDied","Data":"a73a29342b7da3702d3ab797f62e043ae6e127163d853f13f0d354296b6fcff9"} Mar 20 07:50:04 crc kubenswrapper[4971]: I0320 07:50:04.274828 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566550-c7h6d" Mar 20 07:50:04 crc kubenswrapper[4971]: I0320 07:50:04.381643 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zfgd\" (UniqueName: \"kubernetes.io/projected/ac972304-92a0-4afb-b1bf-2c859becff62-kube-api-access-7zfgd\") pod \"ac972304-92a0-4afb-b1bf-2c859becff62\" (UID: \"ac972304-92a0-4afb-b1bf-2c859becff62\") " Mar 20 07:50:04 crc kubenswrapper[4971]: I0320 07:50:04.388462 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac972304-92a0-4afb-b1bf-2c859becff62-kube-api-access-7zfgd" (OuterVolumeSpecName: "kube-api-access-7zfgd") pod "ac972304-92a0-4afb-b1bf-2c859becff62" (UID: "ac972304-92a0-4afb-b1bf-2c859becff62"). InnerVolumeSpecName "kube-api-access-7zfgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:50:04 crc kubenswrapper[4971]: I0320 07:50:04.483976 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zfgd\" (UniqueName: \"kubernetes.io/projected/ac972304-92a0-4afb-b1bf-2c859becff62-kube-api-access-7zfgd\") on node \"crc\" DevicePath \"\"" Mar 20 07:50:04 crc kubenswrapper[4971]: I0320 07:50:04.960476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566550-c7h6d" event={"ID":"ac972304-92a0-4afb-b1bf-2c859becff62","Type":"ContainerDied","Data":"f5ec5fc4506e9f6923f1601bc07ab905d01f25e467e1210b6dab0a7fa5319a8c"} Mar 20 07:50:04 crc kubenswrapper[4971]: I0320 07:50:04.960838 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ec5fc4506e9f6923f1601bc07ab905d01f25e467e1210b6dab0a7fa5319a8c" Mar 20 07:50:04 crc kubenswrapper[4971]: I0320 07:50:04.960538 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566550-c7h6d" Mar 20 07:50:05 crc kubenswrapper[4971]: I0320 07:50:05.342774 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566544-gj7nc"] Mar 20 07:50:05 crc kubenswrapper[4971]: I0320 07:50:05.349152 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566544-gj7nc"] Mar 20 07:50:06 crc kubenswrapper[4971]: I0320 07:50:06.748442 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d789ba76-0170-4d11-905a-afe2ddf78f14" path="/var/lib/kubelet/pods/d789ba76-0170-4d11-905a-afe2ddf78f14/volumes" Mar 20 07:50:14 crc kubenswrapper[4971]: I0320 07:50:14.736223 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:50:14 crc kubenswrapper[4971]: E0320 07:50:14.738647 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:50:29 crc kubenswrapper[4971]: I0320 07:50:29.732147 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:50:29 crc kubenswrapper[4971]: E0320 07:50:29.732954 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:50:43 crc kubenswrapper[4971]: I0320 07:50:43.732511 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:50:43 crc kubenswrapper[4971]: E0320 07:50:43.734093 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:50:55 crc kubenswrapper[4971]: I0320 07:50:55.863065 4971 scope.go:117] "RemoveContainer" containerID="256217ee7227a11698e46a294091dced7833eb4ea30a7288d38cd9ef7f447794" Mar 20 07:50:56 crc kubenswrapper[4971]: I0320 07:50:56.733468 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:50:56 crc kubenswrapper[4971]: E0320 07:50:56.734136 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:51:11 crc kubenswrapper[4971]: I0320 07:51:11.733398 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:51:11 crc kubenswrapper[4971]: E0320 07:51:11.734578 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:51:22 crc kubenswrapper[4971]: I0320 07:51:22.732846 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:51:22 crc kubenswrapper[4971]: E0320 07:51:22.733789 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:51:37 crc kubenswrapper[4971]: I0320 07:51:37.733102 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:51:37 crc kubenswrapper[4971]: E0320 07:51:37.734112 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:51:48 crc kubenswrapper[4971]: I0320 07:51:48.739892 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:51:48 crc kubenswrapper[4971]: E0320 07:51:48.740743 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.161663 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566552-4sq4p"] Mar 20 07:52:00 crc kubenswrapper[4971]: E0320 07:52:00.162845 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac972304-92a0-4afb-b1bf-2c859becff62" containerName="oc" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.162872 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac972304-92a0-4afb-b1bf-2c859becff62" containerName="oc" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.163148 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac972304-92a0-4afb-b1bf-2c859becff62" containerName="oc" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.163933 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566552-4sq4p" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.167410 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.167838 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.168056 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.179187 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566552-4sq4p"] Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.280170 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrh5\" (UniqueName: \"kubernetes.io/projected/1d9d3837-87c1-4349-96d8-503d340a566a-kube-api-access-rvrh5\") pod \"auto-csr-approver-29566552-4sq4p\" (UID: \"1d9d3837-87c1-4349-96d8-503d340a566a\") " pod="openshift-infra/auto-csr-approver-29566552-4sq4p" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.382281 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrh5\" (UniqueName: \"kubernetes.io/projected/1d9d3837-87c1-4349-96d8-503d340a566a-kube-api-access-rvrh5\") pod \"auto-csr-approver-29566552-4sq4p\" (UID: \"1d9d3837-87c1-4349-96d8-503d340a566a\") " pod="openshift-infra/auto-csr-approver-29566552-4sq4p" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.412259 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrh5\" (UniqueName: \"kubernetes.io/projected/1d9d3837-87c1-4349-96d8-503d340a566a-kube-api-access-rvrh5\") pod \"auto-csr-approver-29566552-4sq4p\" (UID: \"1d9d3837-87c1-4349-96d8-503d340a566a\") " 
pod="openshift-infra/auto-csr-approver-29566552-4sq4p" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.498012 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566552-4sq4p" Mar 20 07:52:00 crc kubenswrapper[4971]: I0320 07:52:00.732498 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:52:00 crc kubenswrapper[4971]: E0320 07:52:00.733098 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:52:01 crc kubenswrapper[4971]: I0320 07:52:01.036910 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566552-4sq4p"] Mar 20 07:52:01 crc kubenswrapper[4971]: I0320 07:52:01.042620 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:52:02 crc kubenswrapper[4971]: I0320 07:52:02.024777 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566552-4sq4p" event={"ID":"1d9d3837-87c1-4349-96d8-503d340a566a","Type":"ContainerStarted","Data":"38987baa5012f5619098f47a61c6e4e6cb22d05bac2ba44dffd10ea122e06de4"} Mar 20 07:52:03 crc kubenswrapper[4971]: I0320 07:52:03.037890 4971 generic.go:334] "Generic (PLEG): container finished" podID="1d9d3837-87c1-4349-96d8-503d340a566a" containerID="6c0d33e4b6a80a8877b32eaebedd068385f8ccc8d3c63ac58ebb4b2c074e02f9" exitCode=0 Mar 20 07:52:03 crc kubenswrapper[4971]: I0320 07:52:03.038008 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566552-4sq4p" event={"ID":"1d9d3837-87c1-4349-96d8-503d340a566a","Type":"ContainerDied","Data":"6c0d33e4b6a80a8877b32eaebedd068385f8ccc8d3c63ac58ebb4b2c074e02f9"} Mar 20 07:52:04 crc kubenswrapper[4971]: I0320 07:52:04.387965 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566552-4sq4p" Mar 20 07:52:04 crc kubenswrapper[4971]: I0320 07:52:04.546082 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvrh5\" (UniqueName: \"kubernetes.io/projected/1d9d3837-87c1-4349-96d8-503d340a566a-kube-api-access-rvrh5\") pod \"1d9d3837-87c1-4349-96d8-503d340a566a\" (UID: \"1d9d3837-87c1-4349-96d8-503d340a566a\") " Mar 20 07:52:04 crc kubenswrapper[4971]: I0320 07:52:04.553997 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9d3837-87c1-4349-96d8-503d340a566a-kube-api-access-rvrh5" (OuterVolumeSpecName: "kube-api-access-rvrh5") pod "1d9d3837-87c1-4349-96d8-503d340a566a" (UID: "1d9d3837-87c1-4349-96d8-503d340a566a"). InnerVolumeSpecName "kube-api-access-rvrh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:52:04 crc kubenswrapper[4971]: I0320 07:52:04.648148 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvrh5\" (UniqueName: \"kubernetes.io/projected/1d9d3837-87c1-4349-96d8-503d340a566a-kube-api-access-rvrh5\") on node \"crc\" DevicePath \"\"" Mar 20 07:52:05 crc kubenswrapper[4971]: I0320 07:52:05.059471 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566552-4sq4p" event={"ID":"1d9d3837-87c1-4349-96d8-503d340a566a","Type":"ContainerDied","Data":"38987baa5012f5619098f47a61c6e4e6cb22d05bac2ba44dffd10ea122e06de4"} Mar 20 07:52:05 crc kubenswrapper[4971]: I0320 07:52:05.059933 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38987baa5012f5619098f47a61c6e4e6cb22d05bac2ba44dffd10ea122e06de4" Mar 20 07:52:05 crc kubenswrapper[4971]: I0320 07:52:05.059635 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566552-4sq4p" Mar 20 07:52:05 crc kubenswrapper[4971]: I0320 07:52:05.471791 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566546-jwq59"] Mar 20 07:52:05 crc kubenswrapper[4971]: I0320 07:52:05.476499 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566546-jwq59"] Mar 20 07:52:06 crc kubenswrapper[4971]: I0320 07:52:06.750710 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f" path="/var/lib/kubelet/pods/0ac6f1ad-5ed5-4a1a-b3f9-c108c4e26e3f/volumes" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.401130 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pkmd4"] Mar 20 07:52:07 crc kubenswrapper[4971]: E0320 07:52:07.402111 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1d9d3837-87c1-4349-96d8-503d340a566a" containerName="oc" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.402148 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9d3837-87c1-4349-96d8-503d340a566a" containerName="oc" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.402584 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9d3837-87c1-4349-96d8-503d340a566a" containerName="oc" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.405101 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.414853 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pkmd4"] Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.606124 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rq6l\" (UniqueName: \"kubernetes.io/projected/7e29ac05-c22d-489c-8a42-a29892dec62f-kube-api-access-4rq6l\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.606356 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-catalog-content\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.606429 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-utilities\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " 
pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.707663 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-utilities\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.707754 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rq6l\" (UniqueName: \"kubernetes.io/projected/7e29ac05-c22d-489c-8a42-a29892dec62f-kube-api-access-4rq6l\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.707791 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-catalog-content\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.708247 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-utilities\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.708334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-catalog-content\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " 
pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.726727 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rq6l\" (UniqueName: \"kubernetes.io/projected/7e29ac05-c22d-489c-8a42-a29892dec62f-kube-api-access-4rq6l\") pod \"certified-operators-pkmd4\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:07 crc kubenswrapper[4971]: I0320 07:52:07.743891 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:08 crc kubenswrapper[4971]: I0320 07:52:08.184650 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pkmd4"] Mar 20 07:52:09 crc kubenswrapper[4971]: I0320 07:52:09.107280 4971 generic.go:334] "Generic (PLEG): container finished" podID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerID="654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0" exitCode=0 Mar 20 07:52:09 crc kubenswrapper[4971]: I0320 07:52:09.107342 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkmd4" event={"ID":"7e29ac05-c22d-489c-8a42-a29892dec62f","Type":"ContainerDied","Data":"654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0"} Mar 20 07:52:09 crc kubenswrapper[4971]: I0320 07:52:09.107378 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkmd4" event={"ID":"7e29ac05-c22d-489c-8a42-a29892dec62f","Type":"ContainerStarted","Data":"b6a23dfe96aed1dfa2a2b7bd81468c2b21fb793c58d97fceaf5cf616517316e9"} Mar 20 07:52:10 crc kubenswrapper[4971]: I0320 07:52:10.116621 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkmd4" 
event={"ID":"7e29ac05-c22d-489c-8a42-a29892dec62f","Type":"ContainerStarted","Data":"dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46"} Mar 20 07:52:11 crc kubenswrapper[4971]: I0320 07:52:11.136466 4971 generic.go:334] "Generic (PLEG): container finished" podID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerID="dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46" exitCode=0 Mar 20 07:52:11 crc kubenswrapper[4971]: I0320 07:52:11.136660 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkmd4" event={"ID":"7e29ac05-c22d-489c-8a42-a29892dec62f","Type":"ContainerDied","Data":"dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46"} Mar 20 07:52:12 crc kubenswrapper[4971]: I0320 07:52:12.150016 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkmd4" event={"ID":"7e29ac05-c22d-489c-8a42-a29892dec62f","Type":"ContainerStarted","Data":"483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4"} Mar 20 07:52:12 crc kubenswrapper[4971]: I0320 07:52:12.180650 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pkmd4" podStartSLOduration=2.618447787 podStartE2EDuration="5.180591072s" podCreationTimestamp="2026-03-20 07:52:07 +0000 UTC" firstStartedPulling="2026-03-20 07:52:09.1114203 +0000 UTC m=+3751.091294468" lastFinishedPulling="2026-03-20 07:52:11.673563615 +0000 UTC m=+3753.653437753" observedRunningTime="2026-03-20 07:52:12.167450821 +0000 UTC m=+3754.147324999" watchObservedRunningTime="2026-03-20 07:52:12.180591072 +0000 UTC m=+3754.160465250" Mar 20 07:52:14 crc kubenswrapper[4971]: I0320 07:52:14.732331 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:52:14 crc kubenswrapper[4971]: E0320 07:52:14.733198 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:52:17 crc kubenswrapper[4971]: I0320 07:52:17.744328 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:17 crc kubenswrapper[4971]: I0320 07:52:17.744449 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:17 crc kubenswrapper[4971]: I0320 07:52:17.860819 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:18 crc kubenswrapper[4971]: I0320 07:52:18.278828 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:18 crc kubenswrapper[4971]: I0320 07:52:18.349045 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pkmd4"] Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.228287 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pkmd4" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerName="registry-server" containerID="cri-o://483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4" gracePeriod=2 Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.710956 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.734566 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-catalog-content\") pod \"7e29ac05-c22d-489c-8a42-a29892dec62f\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.734635 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-utilities\") pod \"7e29ac05-c22d-489c-8a42-a29892dec62f\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.734697 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rq6l\" (UniqueName: \"kubernetes.io/projected/7e29ac05-c22d-489c-8a42-a29892dec62f-kube-api-access-4rq6l\") pod \"7e29ac05-c22d-489c-8a42-a29892dec62f\" (UID: \"7e29ac05-c22d-489c-8a42-a29892dec62f\") " Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.736671 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-utilities" (OuterVolumeSpecName: "utilities") pod "7e29ac05-c22d-489c-8a42-a29892dec62f" (UID: "7e29ac05-c22d-489c-8a42-a29892dec62f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.748009 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e29ac05-c22d-489c-8a42-a29892dec62f-kube-api-access-4rq6l" (OuterVolumeSpecName: "kube-api-access-4rq6l") pod "7e29ac05-c22d-489c-8a42-a29892dec62f" (UID: "7e29ac05-c22d-489c-8a42-a29892dec62f"). InnerVolumeSpecName "kube-api-access-4rq6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.836447 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:52:20 crc kubenswrapper[4971]: I0320 07:52:20.836496 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rq6l\" (UniqueName: \"kubernetes.io/projected/7e29ac05-c22d-489c-8a42-a29892dec62f-kube-api-access-4rq6l\") on node \"crc\" DevicePath \"\"" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.011905 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e29ac05-c22d-489c-8a42-a29892dec62f" (UID: "7e29ac05-c22d-489c-8a42-a29892dec62f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.038523 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e29ac05-c22d-489c-8a42-a29892dec62f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.239330 4971 generic.go:334] "Generic (PLEG): container finished" podID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerID="483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4" exitCode=0 Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.239462 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pkmd4" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.240044 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkmd4" event={"ID":"7e29ac05-c22d-489c-8a42-a29892dec62f","Type":"ContainerDied","Data":"483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4"} Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.240146 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkmd4" event={"ID":"7e29ac05-c22d-489c-8a42-a29892dec62f","Type":"ContainerDied","Data":"b6a23dfe96aed1dfa2a2b7bd81468c2b21fb793c58d97fceaf5cf616517316e9"} Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.240192 4971 scope.go:117] "RemoveContainer" containerID="483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.277432 4971 scope.go:117] "RemoveContainer" containerID="dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.280983 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pkmd4"] Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.293166 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pkmd4"] Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.318486 4971 scope.go:117] "RemoveContainer" containerID="654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.350712 4971 scope.go:117] "RemoveContainer" containerID="483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4" Mar 20 07:52:21 crc kubenswrapper[4971]: E0320 07:52:21.351130 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4\": container with ID starting with 483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4 not found: ID does not exist" containerID="483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.351167 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4"} err="failed to get container status \"483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4\": rpc error: code = NotFound desc = could not find container \"483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4\": container with ID starting with 483a6a61b477913024424799a0fbc1b6cf22cc517e29a37d979931f1845063d4 not found: ID does not exist" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.351193 4971 scope.go:117] "RemoveContainer" containerID="dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46" Mar 20 07:52:21 crc kubenswrapper[4971]: E0320 07:52:21.351504 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46\": container with ID starting with dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46 not found: ID does not exist" containerID="dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.351550 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46"} err="failed to get container status \"dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46\": rpc error: code = NotFound desc = could not find container \"dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46\": container with ID 
starting with dab74a4aee27c9aea6097e3531aa6c080d43c87c2777a0bf2cd41223fb0aaa46 not found: ID does not exist" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.351577 4971 scope.go:117] "RemoveContainer" containerID="654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0" Mar 20 07:52:21 crc kubenswrapper[4971]: E0320 07:52:21.352061 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0\": container with ID starting with 654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0 not found: ID does not exist" containerID="654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0" Mar 20 07:52:21 crc kubenswrapper[4971]: I0320 07:52:21.352098 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0"} err="failed to get container status \"654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0\": rpc error: code = NotFound desc = could not find container \"654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0\": container with ID starting with 654f060dfd24db580632257302eb876b3ff4689d47f94107e9a6d751d2895fc0 not found: ID does not exist" Mar 20 07:52:22 crc kubenswrapper[4971]: I0320 07:52:22.745874 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" path="/var/lib/kubelet/pods/7e29ac05-c22d-489c-8a42-a29892dec62f/volumes" Mar 20 07:52:29 crc kubenswrapper[4971]: I0320 07:52:29.732511 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:52:29 crc kubenswrapper[4971]: E0320 07:52:29.733353 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:52:42 crc kubenswrapper[4971]: I0320 07:52:42.732926 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:52:42 crc kubenswrapper[4971]: E0320 07:52:42.733786 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:52:55 crc kubenswrapper[4971]: I0320 07:52:55.732970 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:52:55 crc kubenswrapper[4971]: E0320 07:52:55.733992 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 07:52:55 crc kubenswrapper[4971]: I0320 07:52:55.977539 4971 scope.go:117] "RemoveContainer" containerID="760433e90992d9ddfe656fbb318be7aeb6a3c29c555f6b9b319a75075391e3c9" Mar 20 07:53:10 crc kubenswrapper[4971]: I0320 07:53:10.732270 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:53:10 crc 
kubenswrapper[4971]: E0320 07:53:10.733391 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:53:21 crc kubenswrapper[4971]: I0320 07:53:21.735482 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"
Mar 20 07:53:21 crc kubenswrapper[4971]: E0320 07:53:21.736895 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:53:35 crc kubenswrapper[4971]: I0320 07:53:35.732250 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"
Mar 20 07:53:35 crc kubenswrapper[4971]: E0320 07:53:35.732900 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:53:48 crc kubenswrapper[4971]: I0320 07:53:48.739921 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"
Mar 20 07:53:48 crc kubenswrapper[4971]: E0320 07:53:48.740918 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.170101 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566554-dcp9x"]
Mar 20 07:54:00 crc kubenswrapper[4971]: E0320 07:54:00.171395 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerName="extract-content"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.171642 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerName="extract-content"
Mar 20 07:54:00 crc kubenswrapper[4971]: E0320 07:54:00.171696 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerName="extract-utilities"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.171716 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerName="extract-utilities"
Mar 20 07:54:00 crc kubenswrapper[4971]: E0320 07:54:00.171743 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerName="registry-server"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.171762 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerName="registry-server"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.172124 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e29ac05-c22d-489c-8a42-a29892dec62f" containerName="registry-server"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.173199 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566554-dcp9x"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.175531 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.176322 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.177383 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566554-dcp9x"]
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.181224 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.197251 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv8qq\" (UniqueName: \"kubernetes.io/projected/d06aea70-0ef4-4194-887f-65c565e47798-kube-api-access-qv8qq\") pod \"auto-csr-approver-29566554-dcp9x\" (UID: \"d06aea70-0ef4-4194-887f-65c565e47798\") " pod="openshift-infra/auto-csr-approver-29566554-dcp9x"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.299169 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8qq\" (UniqueName: \"kubernetes.io/projected/d06aea70-0ef4-4194-887f-65c565e47798-kube-api-access-qv8qq\") pod \"auto-csr-approver-29566554-dcp9x\" (UID: \"d06aea70-0ef4-4194-887f-65c565e47798\") " pod="openshift-infra/auto-csr-approver-29566554-dcp9x"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.330528 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8qq\" (UniqueName: \"kubernetes.io/projected/d06aea70-0ef4-4194-887f-65c565e47798-kube-api-access-qv8qq\") pod \"auto-csr-approver-29566554-dcp9x\" (UID: \"d06aea70-0ef4-4194-887f-65c565e47798\") " pod="openshift-infra/auto-csr-approver-29566554-dcp9x"
Mar 20 07:54:00 crc kubenswrapper[4971]: I0320 07:54:00.496719 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566554-dcp9x"
Mar 20 07:54:01 crc kubenswrapper[4971]: I0320 07:54:01.028178 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566554-dcp9x"]
Mar 20 07:54:01 crc kubenswrapper[4971]: I0320 07:54:01.447124 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566554-dcp9x" event={"ID":"d06aea70-0ef4-4194-887f-65c565e47798","Type":"ContainerStarted","Data":"efce85b6244b793858cb0ecad7b88c8b5db54d7407c1bb0f4482f448f27ac2db"}
Mar 20 07:54:01 crc kubenswrapper[4971]: I0320 07:54:01.732833 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"
Mar 20 07:54:01 crc kubenswrapper[4971]: E0320 07:54:01.733134 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:54:02 crc kubenswrapper[4971]: I0320 07:54:02.454123 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566554-dcp9x" event={"ID":"d06aea70-0ef4-4194-887f-65c565e47798","Type":"ContainerStarted","Data":"fa3f08d261fa3e16e592eed0b5d13d958e5ad9941a0d19e31fc58faafd2266fb"}
Mar 20 07:54:02 crc kubenswrapper[4971]: I0320 07:54:02.471715 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566554-dcp9x" podStartSLOduration=1.536024908 podStartE2EDuration="2.471696869s" podCreationTimestamp="2026-03-20 07:54:00 +0000 UTC" firstStartedPulling="2026-03-20 07:54:01.03896121 +0000 UTC m=+3863.018835388" lastFinishedPulling="2026-03-20 07:54:01.974633181 +0000 UTC m=+3863.954507349" observedRunningTime="2026-03-20 07:54:02.468570018 +0000 UTC m=+3864.448444166" watchObservedRunningTime="2026-03-20 07:54:02.471696869 +0000 UTC m=+3864.451571007"
Mar 20 07:54:03 crc kubenswrapper[4971]: I0320 07:54:03.466688 4971 generic.go:334] "Generic (PLEG): container finished" podID="d06aea70-0ef4-4194-887f-65c565e47798" containerID="fa3f08d261fa3e16e592eed0b5d13d958e5ad9941a0d19e31fc58faafd2266fb" exitCode=0
Mar 20 07:54:03 crc kubenswrapper[4971]: I0320 07:54:03.466914 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566554-dcp9x" event={"ID":"d06aea70-0ef4-4194-887f-65c565e47798","Type":"ContainerDied","Data":"fa3f08d261fa3e16e592eed0b5d13d958e5ad9941a0d19e31fc58faafd2266fb"}
Mar 20 07:54:04 crc kubenswrapper[4971]: I0320 07:54:04.869192 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566554-dcp9x"
Mar 20 07:54:04 crc kubenswrapper[4971]: I0320 07:54:04.988763 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv8qq\" (UniqueName: \"kubernetes.io/projected/d06aea70-0ef4-4194-887f-65c565e47798-kube-api-access-qv8qq\") pod \"d06aea70-0ef4-4194-887f-65c565e47798\" (UID: \"d06aea70-0ef4-4194-887f-65c565e47798\") "
Mar 20 07:54:05 crc kubenswrapper[4971]: I0320 07:54:05.000544 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06aea70-0ef4-4194-887f-65c565e47798-kube-api-access-qv8qq" (OuterVolumeSpecName: "kube-api-access-qv8qq") pod "d06aea70-0ef4-4194-887f-65c565e47798" (UID: "d06aea70-0ef4-4194-887f-65c565e47798"). InnerVolumeSpecName "kube-api-access-qv8qq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:54:05 crc kubenswrapper[4971]: I0320 07:54:05.091895 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv8qq\" (UniqueName: \"kubernetes.io/projected/d06aea70-0ef4-4194-887f-65c565e47798-kube-api-access-qv8qq\") on node \"crc\" DevicePath \"\""
Mar 20 07:54:05 crc kubenswrapper[4971]: I0320 07:54:05.501640 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566554-dcp9x" event={"ID":"d06aea70-0ef4-4194-887f-65c565e47798","Type":"ContainerDied","Data":"efce85b6244b793858cb0ecad7b88c8b5db54d7407c1bb0f4482f448f27ac2db"}
Mar 20 07:54:05 crc kubenswrapper[4971]: I0320 07:54:05.502312 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efce85b6244b793858cb0ecad7b88c8b5db54d7407c1bb0f4482f448f27ac2db"
Mar 20 07:54:05 crc kubenswrapper[4971]: I0320 07:54:05.501771 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566554-dcp9x"
Mar 20 07:54:05 crc kubenswrapper[4971]: I0320 07:54:05.577866 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566548-nzsf8"]
Mar 20 07:54:05 crc kubenswrapper[4971]: I0320 07:54:05.587377 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566548-nzsf8"]
Mar 20 07:54:06 crc kubenswrapper[4971]: I0320 07:54:06.745232 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc72c18d-9df1-455f-94a7-fff6a10ccb21" path="/var/lib/kubelet/pods/fc72c18d-9df1-455f-94a7-fff6a10ccb21/volumes"
Mar 20 07:54:15 crc kubenswrapper[4971]: I0320 07:54:15.732441 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"
Mar 20 07:54:15 crc kubenswrapper[4971]: E0320 07:54:15.733403 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 07:54:27 crc kubenswrapper[4971]: I0320 07:54:27.732975 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e"
Mar 20 07:54:28 crc kubenswrapper[4971]: I0320 07:54:28.726470 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"f5757dab665bec4f1034edb8e2c85da8a22c711cf9230d598981b106f2238f62"}
Mar 20 07:54:56 crc kubenswrapper[4971]: I0320 07:54:56.118291 4971 scope.go:117] "RemoveContainer" containerID="10250ebdde278ab84a6b88e597b253ce0a54f7eb624e62fc04611859042a874a"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.252865 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dfhq9"]
Mar 20 07:55:40 crc kubenswrapper[4971]: E0320 07:55:40.253839 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06aea70-0ef4-4194-887f-65c565e47798" containerName="oc"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.253861 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06aea70-0ef4-4194-887f-65c565e47798" containerName="oc"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.254106 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06aea70-0ef4-4194-887f-65c565e47798" containerName="oc"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.255982 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.262261 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfhq9"]
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.413116 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp526\" (UniqueName: \"kubernetes.io/projected/65e9cb3c-4705-4e93-bb16-a0a56de56d62-kube-api-access-qp526\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.413482 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-utilities\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.413671 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-catalog-content\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.514349 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-catalog-content\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.514466 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp526\" (UniqueName: \"kubernetes.io/projected/65e9cb3c-4705-4e93-bb16-a0a56de56d62-kube-api-access-qp526\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.514520 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-utilities\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.515190 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-utilities\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.515403 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-catalog-content\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.701069 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp526\" (UniqueName: \"kubernetes.io/projected/65e9cb3c-4705-4e93-bb16-a0a56de56d62-kube-api-access-qp526\") pod \"community-operators-dfhq9\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:40 crc kubenswrapper[4971]: I0320 07:55:40.884780 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:41 crc kubenswrapper[4971]: I0320 07:55:41.907257 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfhq9"]
Mar 20 07:55:42 crc kubenswrapper[4971]: I0320 07:55:42.363700 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhq9" event={"ID":"65e9cb3c-4705-4e93-bb16-a0a56de56d62","Type":"ContainerStarted","Data":"061ce4a386e20f1a27e975a4f7b80168e523c21aa55ac56c3c7ada0ec88c2990"}
Mar 20 07:55:43 crc kubenswrapper[4971]: I0320 07:55:43.376358 4971 generic.go:334] "Generic (PLEG): container finished" podID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerID="02d9ef4b17ed88eeb779e2c2030d7a8a51c9adc5b84475a66d4370958d757fda" exitCode=0
Mar 20 07:55:43 crc kubenswrapper[4971]: I0320 07:55:43.376419 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhq9" event={"ID":"65e9cb3c-4705-4e93-bb16-a0a56de56d62","Type":"ContainerDied","Data":"02d9ef4b17ed88eeb779e2c2030d7a8a51c9adc5b84475a66d4370958d757fda"}
Mar 20 07:55:49 crc kubenswrapper[4971]: I0320 07:55:49.433365 4971 generic.go:334] "Generic (PLEG): container finished" podID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerID="eb02e224a9f4d5d661c0d5aab136b93c270f5fbeab8e84c6858f11a8ab53d91e" exitCode=0
Mar 20 07:55:49 crc kubenswrapper[4971]: I0320 07:55:49.433447 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhq9" event={"ID":"65e9cb3c-4705-4e93-bb16-a0a56de56d62","Type":"ContainerDied","Data":"eb02e224a9f4d5d661c0d5aab136b93c270f5fbeab8e84c6858f11a8ab53d91e"}
Mar 20 07:55:50 crc kubenswrapper[4971]: I0320 07:55:50.448584 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhq9" event={"ID":"65e9cb3c-4705-4e93-bb16-a0a56de56d62","Type":"ContainerStarted","Data":"bd4be7968f59c8940ce1d937903dcd6e3f911573a4a719b287e095fa48b8ad24"}
Mar 20 07:55:50 crc kubenswrapper[4971]: I0320 07:55:50.487175 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dfhq9" podStartSLOduration=3.990177247 podStartE2EDuration="10.487142493s" podCreationTimestamp="2026-03-20 07:55:40 +0000 UTC" firstStartedPulling="2026-03-20 07:55:43.379561702 +0000 UTC m=+3965.359435880" lastFinishedPulling="2026-03-20 07:55:49.876526968 +0000 UTC m=+3971.856401126" observedRunningTime="2026-03-20 07:55:50.475693126 +0000 UTC m=+3972.455567294" watchObservedRunningTime="2026-03-20 07:55:50.487142493 +0000 UTC m=+3972.467016661"
Mar 20 07:55:50 crc kubenswrapper[4971]: I0320 07:55:50.885755 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:50 crc kubenswrapper[4971]: I0320 07:55:50.885803 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:55:51 crc kubenswrapper[4971]: I0320 07:55:51.929761 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dfhq9" podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="registry-server" probeResult="failure" output=<
Mar 20 07:55:51 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 20 07:55:51 crc kubenswrapper[4971]: >
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.150900 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566556-86xkz"]
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.152035 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566556-86xkz"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.156140 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.158096 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.163560 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.171935 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566556-86xkz"]
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.324405 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg8rl\" (UniqueName: \"kubernetes.io/projected/55e6a816-98cf-45fa-9e19-f0d440f373df-kube-api-access-cg8rl\") pod \"auto-csr-approver-29566556-86xkz\" (UID: \"55e6a816-98cf-45fa-9e19-f0d440f373df\") " pod="openshift-infra/auto-csr-approver-29566556-86xkz"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.426898 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg8rl\" (UniqueName: \"kubernetes.io/projected/55e6a816-98cf-45fa-9e19-f0d440f373df-kube-api-access-cg8rl\") pod \"auto-csr-approver-29566556-86xkz\" (UID: \"55e6a816-98cf-45fa-9e19-f0d440f373df\") " pod="openshift-infra/auto-csr-approver-29566556-86xkz"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.448321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg8rl\" (UniqueName: \"kubernetes.io/projected/55e6a816-98cf-45fa-9e19-f0d440f373df-kube-api-access-cg8rl\") pod \"auto-csr-approver-29566556-86xkz\" (UID: \"55e6a816-98cf-45fa-9e19-f0d440f373df\") " pod="openshift-infra/auto-csr-approver-29566556-86xkz"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.483185 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566556-86xkz"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.938645 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:56:00 crc kubenswrapper[4971]: I0320 07:56:00.991516 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dfhq9"
Mar 20 07:56:01 crc kubenswrapper[4971]: I0320 07:56:01.015216 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566556-86xkz"]
Mar 20 07:56:01 crc kubenswrapper[4971]: I0320 07:56:01.076852 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfhq9"]
Mar 20 07:56:01 crc kubenswrapper[4971]: I0320 07:56:01.178153 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktk7v"]
Mar 20 07:56:01 crc kubenswrapper[4971]: I0320 07:56:01.178526 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ktk7v" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="registry-server" containerID="cri-o://b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a" gracePeriod=2
Mar 20 07:56:01 crc kubenswrapper[4971]: I0320 07:56:01.540066 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566556-86xkz" event={"ID":"55e6a816-98cf-45fa-9e19-f0d440f373df","Type":"ContainerStarted","Data":"a51faca23f47f86549a1f2bdefdc7e413d3640cd8ac2c16060c51aa4d4d847b6"}
Mar 20 07:56:01 crc kubenswrapper[4971]: I0320 07:56:01.542263 4971 generic.go:334] "Generic (PLEG): container finished" podID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerID="b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a" exitCode=0
Mar 20 07:56:01 crc kubenswrapper[4971]: I0320 07:56:01.542323 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktk7v" event={"ID":"03cfe007-db9f-4403-86b5-cd107e145c1a","Type":"ContainerDied","Data":"b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a"}
Mar 20 07:56:02 crc kubenswrapper[4971]: E0320 07:56:02.028985 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a is running failed: container process not found" containerID="b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 07:56:02 crc kubenswrapper[4971]: E0320 07:56:02.029584 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a is running failed: container process not found" containerID="b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 07:56:02 crc kubenswrapper[4971]: E0320 07:56:02.029981 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a is running failed: container process not found" containerID="b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 07:56:02 crc kubenswrapper[4971]: E0320 07:56:02.030077 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-ktk7v" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="registry-server"
Mar 20 07:56:02 crc kubenswrapper[4971]: I0320 07:56:02.888059 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktk7v"
Mar 20 07:56:02 crc kubenswrapper[4971]: I0320 07:56:02.966362 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-utilities\") pod \"03cfe007-db9f-4403-86b5-cd107e145c1a\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") "
Mar 20 07:56:02 crc kubenswrapper[4971]: I0320 07:56:02.966465 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghrhr\" (UniqueName: \"kubernetes.io/projected/03cfe007-db9f-4403-86b5-cd107e145c1a-kube-api-access-ghrhr\") pod \"03cfe007-db9f-4403-86b5-cd107e145c1a\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") "
Mar 20 07:56:02 crc kubenswrapper[4971]: I0320 07:56:02.966564 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-catalog-content\") pod \"03cfe007-db9f-4403-86b5-cd107e145c1a\" (UID: \"03cfe007-db9f-4403-86b5-cd107e145c1a\") "
Mar 20 07:56:02 crc kubenswrapper[4971]: I0320 07:56:02.967409 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-utilities" (OuterVolumeSpecName: "utilities") pod "03cfe007-db9f-4403-86b5-cd107e145c1a" (UID: "03cfe007-db9f-4403-86b5-cd107e145c1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:56:02 crc kubenswrapper[4971]: I0320 07:56:02.974398 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03cfe007-db9f-4403-86b5-cd107e145c1a-kube-api-access-ghrhr" (OuterVolumeSpecName: "kube-api-access-ghrhr") pod "03cfe007-db9f-4403-86b5-cd107e145c1a" (UID: "03cfe007-db9f-4403-86b5-cd107e145c1a"). InnerVolumeSpecName "kube-api-access-ghrhr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.016828 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03cfe007-db9f-4403-86b5-cd107e145c1a" (UID: "03cfe007-db9f-4403-86b5-cd107e145c1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.068540 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.068582 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghrhr\" (UniqueName: \"kubernetes.io/projected/03cfe007-db9f-4403-86b5-cd107e145c1a-kube-api-access-ghrhr\") on node \"crc\" DevicePath \"\""
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.068596 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03cfe007-db9f-4403-86b5-cd107e145c1a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.566830 4971 generic.go:334] "Generic (PLEG): container finished" podID="55e6a816-98cf-45fa-9e19-f0d440f373df" containerID="ef584511ec2bbb26d04d5fd61cea156875f31577dab89aa46b8bf386cf5605e3" exitCode=0
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.566940 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566556-86xkz" event={"ID":"55e6a816-98cf-45fa-9e19-f0d440f373df","Type":"ContainerDied","Data":"ef584511ec2bbb26d04d5fd61cea156875f31577dab89aa46b8bf386cf5605e3"}
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.571443 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ktk7v" event={"ID":"03cfe007-db9f-4403-86b5-cd107e145c1a","Type":"ContainerDied","Data":"979e0ace61d21c228304ee91272b3f5334ca82724b07bd2f68247f7690012a50"}
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.571528 4971 scope.go:117] "RemoveContainer" containerID="b71909be1e7a548f33c8b7e149d925ff0688997aa8fca17f82a05c823a8e392a"
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.571563 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ktk7v"
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.599587 4971 scope.go:117] "RemoveContainer" containerID="7190523d839dd731cc4a99d3805728490092af7e318af62fc5597246e1201a8c"
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.643219 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ktk7v"]
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.647543 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ktk7v"]
Mar 20 07:56:03 crc kubenswrapper[4971]: I0320 07:56:03.648264 4971 scope.go:117] "RemoveContainer" containerID="4d4ff9df969b72ff953b8f5446293968bf1e6b252bcf392fb26864f556fecaeb"
Mar 20 07:56:04 crc kubenswrapper[4971]: I0320 07:56:04.742193 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" path="/var/lib/kubelet/pods/03cfe007-db9f-4403-86b5-cd107e145c1a/volumes"
Mar 20 07:56:05 crc kubenswrapper[4971]: I0320 07:56:05.344898 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566556-86xkz"
Mar 20 07:56:05 crc kubenswrapper[4971]: I0320 07:56:05.510125 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg8rl\" (UniqueName: \"kubernetes.io/projected/55e6a816-98cf-45fa-9e19-f0d440f373df-kube-api-access-cg8rl\") pod \"55e6a816-98cf-45fa-9e19-f0d440f373df\" (UID: \"55e6a816-98cf-45fa-9e19-f0d440f373df\") "
Mar 20 07:56:05 crc kubenswrapper[4971]: I0320 07:56:05.515197 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e6a816-98cf-45fa-9e19-f0d440f373df-kube-api-access-cg8rl" (OuterVolumeSpecName: "kube-api-access-cg8rl") pod "55e6a816-98cf-45fa-9e19-f0d440f373df" (UID: "55e6a816-98cf-45fa-9e19-f0d440f373df"). InnerVolumeSpecName "kube-api-access-cg8rl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:56:05 crc kubenswrapper[4971]: I0320 07:56:05.590823 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566556-86xkz" event={"ID":"55e6a816-98cf-45fa-9e19-f0d440f373df","Type":"ContainerDied","Data":"a51faca23f47f86549a1f2bdefdc7e413d3640cd8ac2c16060c51aa4d4d847b6"}
Mar 20 07:56:05 crc kubenswrapper[4971]: I0320 07:56:05.590861 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a51faca23f47f86549a1f2bdefdc7e413d3640cd8ac2c16060c51aa4d4d847b6"
Mar 20 07:56:05 crc kubenswrapper[4971]: I0320 07:56:05.590942 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566556-86xkz"
Mar 20 07:56:05 crc kubenswrapper[4971]: I0320 07:56:05.614295 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg8rl\" (UniqueName: \"kubernetes.io/projected/55e6a816-98cf-45fa-9e19-f0d440f373df-kube-api-access-cg8rl\") on node \"crc\" DevicePath \"\""
Mar 20 07:56:05 crc kubenswrapper[4971]: E0320 07:56:05.656694 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55e6a816_98cf_45fa_9e19_f0d440f373df.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 07:56:06 crc kubenswrapper[4971]: I0320 07:56:06.513156 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566550-c7h6d"]
Mar 20 07:56:06 crc kubenswrapper[4971]: I0320 07:56:06.518389 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566550-c7h6d"]
Mar 20 07:56:06 crc kubenswrapper[4971]: I0320 07:56:06.740905 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac972304-92a0-4afb-b1bf-2c859becff62" path="/var/lib/kubelet/pods/ac972304-92a0-4afb-b1bf-2c859becff62/volumes"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.182140 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dzk4z"]
Mar 20 07:56:49 crc kubenswrapper[4971]: E0320 07:56:49.183062 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e6a816-98cf-45fa-9e19-f0d440f373df" containerName="oc"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.183080 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e6a816-98cf-45fa-9e19-f0d440f373df" containerName="oc"
Mar 20 07:56:49 crc kubenswrapper[4971]: E0320 07:56:49.183098 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="extract-utilities"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.183105 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="extract-utilities"
Mar 20 07:56:49 crc kubenswrapper[4971]: E0320 07:56:49.183129 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="registry-server"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.183139 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="registry-server"
Mar 20 07:56:49 crc kubenswrapper[4971]: E0320 07:56:49.183161 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="extract-content"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.183167 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="extract-content"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.183360 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="03cfe007-db9f-4403-86b5-cd107e145c1a" containerName="registry-server"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.183385 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e6a816-98cf-45fa-9e19-f0d440f373df" containerName="oc"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.184631 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzk4z"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.202052 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzk4z"]
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.264853 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-catalog-content\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.264922 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhzq\" (UniqueName: \"kubernetes.io/projected/ebef1b55-a768-4f74-aec4-04a5babf37aa-kube-api-access-8lhzq\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.265039 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-utilities\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.366447 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-utilities\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z"
Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.366592 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-catalog-content\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.366668 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhzq\" (UniqueName: \"kubernetes.io/projected/ebef1b55-a768-4f74-aec4-04a5babf37aa-kube-api-access-8lhzq\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.367126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-utilities\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.367137 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-catalog-content\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.396454 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhzq\" (UniqueName: \"kubernetes.io/projected/ebef1b55-a768-4f74-aec4-04a5babf37aa-kube-api-access-8lhzq\") pod \"redhat-operators-dzk4z\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.505922 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.933410 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzk4z"] Mar 20 07:56:49 crc kubenswrapper[4971]: I0320 07:56:49.993479 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzk4z" event={"ID":"ebef1b55-a768-4f74-aec4-04a5babf37aa","Type":"ContainerStarted","Data":"bde0403e97bcacdd3f0cb422850eed8c8fff39195cd5bcd5342e6dee19d33558"} Mar 20 07:56:50 crc kubenswrapper[4971]: I0320 07:56:50.162764 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:56:50 crc kubenswrapper[4971]: I0320 07:56:50.162820 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:56:51 crc kubenswrapper[4971]: I0320 07:56:51.000901 4971 generic.go:334] "Generic (PLEG): container finished" podID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerID="865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704" exitCode=0 Mar 20 07:56:51 crc kubenswrapper[4971]: I0320 07:56:51.000950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzk4z" event={"ID":"ebef1b55-a768-4f74-aec4-04a5babf37aa","Type":"ContainerDied","Data":"865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704"} Mar 20 07:56:52 crc kubenswrapper[4971]: I0320 07:56:52.012083 4971 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-dzk4z" event={"ID":"ebef1b55-a768-4f74-aec4-04a5babf37aa","Type":"ContainerStarted","Data":"0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b"} Mar 20 07:56:53 crc kubenswrapper[4971]: I0320 07:56:53.024104 4971 generic.go:334] "Generic (PLEG): container finished" podID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerID="0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b" exitCode=0 Mar 20 07:56:53 crc kubenswrapper[4971]: I0320 07:56:53.024162 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzk4z" event={"ID":"ebef1b55-a768-4f74-aec4-04a5babf37aa","Type":"ContainerDied","Data":"0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b"} Mar 20 07:56:54 crc kubenswrapper[4971]: I0320 07:56:54.032636 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzk4z" event={"ID":"ebef1b55-a768-4f74-aec4-04a5babf37aa","Type":"ContainerStarted","Data":"8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7"} Mar 20 07:56:54 crc kubenswrapper[4971]: I0320 07:56:54.068025 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzk4z" podStartSLOduration=2.609682366 podStartE2EDuration="5.068006978s" podCreationTimestamp="2026-03-20 07:56:49 +0000 UTC" firstStartedPulling="2026-03-20 07:56:51.002474199 +0000 UTC m=+4032.982348337" lastFinishedPulling="2026-03-20 07:56:53.460798811 +0000 UTC m=+4035.440672949" observedRunningTime="2026-03-20 07:56:54.063034829 +0000 UTC m=+4036.042908957" watchObservedRunningTime="2026-03-20 07:56:54.068006978 +0000 UTC m=+4036.047881116" Mar 20 07:56:56 crc kubenswrapper[4971]: I0320 07:56:56.221324 4971 scope.go:117] "RemoveContainer" containerID="a73a29342b7da3702d3ab797f62e043ae6e127163d853f13f0d354296b6fcff9" Mar 20 07:56:59 crc kubenswrapper[4971]: I0320 07:56:59.506650 4971 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:56:59 crc kubenswrapper[4971]: I0320 07:56:59.506931 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:57:00 crc kubenswrapper[4971]: I0320 07:57:00.568903 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzk4z" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="registry-server" probeResult="failure" output=< Mar 20 07:57:00 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 07:57:00 crc kubenswrapper[4971]: > Mar 20 07:57:09 crc kubenswrapper[4971]: I0320 07:57:09.564757 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:57:09 crc kubenswrapper[4971]: I0320 07:57:09.613757 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:57:09 crc kubenswrapper[4971]: I0320 07:57:09.803785 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzk4z"] Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.180792 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzk4z" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="registry-server" containerID="cri-o://8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7" gracePeriod=2 Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.574816 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.610133 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-utilities\") pod \"ebef1b55-a768-4f74-aec4-04a5babf37aa\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.610298 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lhzq\" (UniqueName: \"kubernetes.io/projected/ebef1b55-a768-4f74-aec4-04a5babf37aa-kube-api-access-8lhzq\") pod \"ebef1b55-a768-4f74-aec4-04a5babf37aa\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.610524 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-catalog-content\") pod \"ebef1b55-a768-4f74-aec4-04a5babf37aa\" (UID: \"ebef1b55-a768-4f74-aec4-04a5babf37aa\") " Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.614660 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-utilities" (OuterVolumeSpecName: "utilities") pod "ebef1b55-a768-4f74-aec4-04a5babf37aa" (UID: "ebef1b55-a768-4f74-aec4-04a5babf37aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.621929 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebef1b55-a768-4f74-aec4-04a5babf37aa-kube-api-access-8lhzq" (OuterVolumeSpecName: "kube-api-access-8lhzq") pod "ebef1b55-a768-4f74-aec4-04a5babf37aa" (UID: "ebef1b55-a768-4f74-aec4-04a5babf37aa"). InnerVolumeSpecName "kube-api-access-8lhzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.712225 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lhzq\" (UniqueName: \"kubernetes.io/projected/ebef1b55-a768-4f74-aec4-04a5babf37aa-kube-api-access-8lhzq\") on node \"crc\" DevicePath \"\"" Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.712277 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.762374 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebef1b55-a768-4f74-aec4-04a5babf37aa" (UID: "ebef1b55-a768-4f74-aec4-04a5babf37aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:57:11 crc kubenswrapper[4971]: I0320 07:57:11.813356 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebef1b55-a768-4f74-aec4-04a5babf37aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.194367 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzk4z" event={"ID":"ebef1b55-a768-4f74-aec4-04a5babf37aa","Type":"ContainerDied","Data":"8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7"} Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.194382 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzk4z" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.194450 4971 scope.go:117] "RemoveContainer" containerID="8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.194247 4971 generic.go:334] "Generic (PLEG): container finished" podID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerID="8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7" exitCode=0 Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.197774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzk4z" event={"ID":"ebef1b55-a768-4f74-aec4-04a5babf37aa","Type":"ContainerDied","Data":"bde0403e97bcacdd3f0cb422850eed8c8fff39195cd5bcd5342e6dee19d33558"} Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.235388 4971 scope.go:117] "RemoveContainer" containerID="0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.255690 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzk4z"] Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.269403 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzk4z"] Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.274303 4971 scope.go:117] "RemoveContainer" containerID="865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.305210 4971 scope.go:117] "RemoveContainer" containerID="8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7" Mar 20 07:57:12 crc kubenswrapper[4971]: E0320 07:57:12.305845 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7\": container with ID starting with 
8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7 not found: ID does not exist" containerID="8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.305969 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7"} err="failed to get container status \"8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7\": rpc error: code = NotFound desc = could not find container \"8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7\": container with ID starting with 8a0a6d8dcda767509c9eb53789742240cbaeec2c59f1c7e0c3cd54b05738d5b7 not found: ID does not exist" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.306065 4971 scope.go:117] "RemoveContainer" containerID="0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b" Mar 20 07:57:12 crc kubenswrapper[4971]: E0320 07:57:12.306651 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b\": container with ID starting with 0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b not found: ID does not exist" containerID="0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.306758 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b"} err="failed to get container status \"0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b\": rpc error: code = NotFound desc = could not find container \"0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b\": container with ID starting with 0f302d7d39bd991e58e360bce3ce1c5e22993c1c4eb4813ee273e23b0c5d370b not found: ID does not 
exist" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.306882 4971 scope.go:117] "RemoveContainer" containerID="865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704" Mar 20 07:57:12 crc kubenswrapper[4971]: E0320 07:57:12.307405 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704\": container with ID starting with 865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704 not found: ID does not exist" containerID="865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.307503 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704"} err="failed to get container status \"865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704\": rpc error: code = NotFound desc = could not find container \"865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704\": container with ID starting with 865bb0719b3ddbc83b14b5df7cfbcb4dbd629f4c87fa4b27ec4e009786226704 not found: ID does not exist" Mar 20 07:57:12 crc kubenswrapper[4971]: I0320 07:57:12.747653 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" path="/var/lib/kubelet/pods/ebef1b55-a768-4f74-aec4-04a5babf37aa/volumes" Mar 20 07:57:20 crc kubenswrapper[4971]: I0320 07:57:20.162885 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:57:20 crc kubenswrapper[4971]: I0320 07:57:20.163657 4971 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.163371 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.164563 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.164741 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.166036 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5757dab665bec4f1034edb8e2c85da8a22c711cf9230d598981b106f2238f62"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.166147 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" 
containerID="cri-o://f5757dab665bec4f1034edb8e2c85da8a22c711cf9230d598981b106f2238f62" gracePeriod=600 Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.569773 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="f5757dab665bec4f1034edb8e2c85da8a22c711cf9230d598981b106f2238f62" exitCode=0 Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.569872 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"f5757dab665bec4f1034edb8e2c85da8a22c711cf9230d598981b106f2238f62"} Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.570324 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df"} Mar 20 07:57:50 crc kubenswrapper[4971]: I0320 07:57:50.570362 4971 scope.go:117] "RemoveContainer" containerID="9f34439c5fc9bd9c8be3ed8b25bf8407df6d56135128f84724d3d2afe1cca94e" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.162028 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566558-26qdx"] Mar 20 07:58:00 crc kubenswrapper[4971]: E0320 07:58:00.163468 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="extract-content" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.163485 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="extract-content" Mar 20 07:58:00 crc kubenswrapper[4971]: E0320 07:58:00.163514 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="registry-server" Mar 20 07:58:00 
crc kubenswrapper[4971]: I0320 07:58:00.163522 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="registry-server" Mar 20 07:58:00 crc kubenswrapper[4971]: E0320 07:58:00.163548 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="extract-utilities" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.163558 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="extract-utilities" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.163758 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebef1b55-a768-4f74-aec4-04a5babf37aa" containerName="registry-server" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.164300 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566558-26qdx" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.167306 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.167854 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.168180 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.183044 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566558-26qdx"] Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.242065 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtm4n\" (UniqueName: \"kubernetes.io/projected/a55658af-2656-49f8-9a8f-b3de1a23d3c5-kube-api-access-qtm4n\") pod 
\"auto-csr-approver-29566558-26qdx\" (UID: \"a55658af-2656-49f8-9a8f-b3de1a23d3c5\") " pod="openshift-infra/auto-csr-approver-29566558-26qdx" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.343216 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtm4n\" (UniqueName: \"kubernetes.io/projected/a55658af-2656-49f8-9a8f-b3de1a23d3c5-kube-api-access-qtm4n\") pod \"auto-csr-approver-29566558-26qdx\" (UID: \"a55658af-2656-49f8-9a8f-b3de1a23d3c5\") " pod="openshift-infra/auto-csr-approver-29566558-26qdx" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.363515 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtm4n\" (UniqueName: \"kubernetes.io/projected/a55658af-2656-49f8-9a8f-b3de1a23d3c5-kube-api-access-qtm4n\") pod \"auto-csr-approver-29566558-26qdx\" (UID: \"a55658af-2656-49f8-9a8f-b3de1a23d3c5\") " pod="openshift-infra/auto-csr-approver-29566558-26qdx" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.496476 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566558-26qdx" Mar 20 07:58:00 crc kubenswrapper[4971]: I0320 07:58:00.995382 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566558-26qdx"] Mar 20 07:58:01 crc kubenswrapper[4971]: I0320 07:58:01.006770 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:58:01 crc kubenswrapper[4971]: I0320 07:58:01.687590 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566558-26qdx" event={"ID":"a55658af-2656-49f8-9a8f-b3de1a23d3c5","Type":"ContainerStarted","Data":"d33a43c2c5186080c72557b9827db3a4bd18b287faf2c5da74085eddbc08aed4"} Mar 20 07:58:02 crc kubenswrapper[4971]: I0320 07:58:02.700475 4971 generic.go:334] "Generic (PLEG): container finished" podID="a55658af-2656-49f8-9a8f-b3de1a23d3c5" containerID="29444ac3093ce735c03486014240a4542acca8aeebf1b6cccc7fda03ff4e528f" exitCode=0 Mar 20 07:58:02 crc kubenswrapper[4971]: I0320 07:58:02.700568 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566558-26qdx" event={"ID":"a55658af-2656-49f8-9a8f-b3de1a23d3c5","Type":"ContainerDied","Data":"29444ac3093ce735c03486014240a4542acca8aeebf1b6cccc7fda03ff4e528f"} Mar 20 07:58:04 crc kubenswrapper[4971]: I0320 07:58:04.091804 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566558-26qdx" Mar 20 07:58:04 crc kubenswrapper[4971]: I0320 07:58:04.114860 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtm4n\" (UniqueName: \"kubernetes.io/projected/a55658af-2656-49f8-9a8f-b3de1a23d3c5-kube-api-access-qtm4n\") pod \"a55658af-2656-49f8-9a8f-b3de1a23d3c5\" (UID: \"a55658af-2656-49f8-9a8f-b3de1a23d3c5\") " Mar 20 07:58:04 crc kubenswrapper[4971]: I0320 07:58:04.147222 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55658af-2656-49f8-9a8f-b3de1a23d3c5-kube-api-access-qtm4n" (OuterVolumeSpecName: "kube-api-access-qtm4n") pod "a55658af-2656-49f8-9a8f-b3de1a23d3c5" (UID: "a55658af-2656-49f8-9a8f-b3de1a23d3c5"). InnerVolumeSpecName "kube-api-access-qtm4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:58:04 crc kubenswrapper[4971]: I0320 07:58:04.216039 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtm4n\" (UniqueName: \"kubernetes.io/projected/a55658af-2656-49f8-9a8f-b3de1a23d3c5-kube-api-access-qtm4n\") on node \"crc\" DevicePath \"\"" Mar 20 07:58:04 crc kubenswrapper[4971]: I0320 07:58:04.722695 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566558-26qdx" event={"ID":"a55658af-2656-49f8-9a8f-b3de1a23d3c5","Type":"ContainerDied","Data":"d33a43c2c5186080c72557b9827db3a4bd18b287faf2c5da74085eddbc08aed4"} Mar 20 07:58:04 crc kubenswrapper[4971]: I0320 07:58:04.723151 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d33a43c2c5186080c72557b9827db3a4bd18b287faf2c5da74085eddbc08aed4" Mar 20 07:58:04 crc kubenswrapper[4971]: I0320 07:58:04.722790 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566558-26qdx" Mar 20 07:58:05 crc kubenswrapper[4971]: I0320 07:58:05.181286 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566552-4sq4p"] Mar 20 07:58:05 crc kubenswrapper[4971]: I0320 07:58:05.188534 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566552-4sq4p"] Mar 20 07:58:06 crc kubenswrapper[4971]: I0320 07:58:06.750722 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d9d3837-87c1-4349-96d8-503d340a566a" path="/var/lib/kubelet/pods/1d9d3837-87c1-4349-96d8-503d340a566a/volumes" Mar 20 07:58:56 crc kubenswrapper[4971]: I0320 07:58:56.387555 4971 scope.go:117] "RemoveContainer" containerID="6c0d33e4b6a80a8877b32eaebedd068385f8ccc8d3c63ac58ebb4b2c074e02f9" Mar 20 07:59:50 crc kubenswrapper[4971]: I0320 07:59:50.162403 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:59:50 crc kubenswrapper[4971]: I0320 07:59:50.163250 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.055742 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wcfsz"] Mar 20 07:59:56 crc kubenswrapper[4971]: E0320 07:59:56.056544 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55658af-2656-49f8-9a8f-b3de1a23d3c5" containerName="oc" Mar 20 07:59:56 crc 
kubenswrapper[4971]: I0320 07:59:56.056565 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55658af-2656-49f8-9a8f-b3de1a23d3c5" containerName="oc" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.057808 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55658af-2656-49f8-9a8f-b3de1a23d3c5" containerName="oc" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.059579 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.080913 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcfsz"] Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.231676 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-utilities\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.231743 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-catalog-content\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.231807 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtvv2\" (UniqueName: \"kubernetes.io/projected/c78be730-b987-43d1-b83a-070681fce468-kube-api-access-jtvv2\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: 
I0320 07:59:56.333620 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-catalog-content\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.333716 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtvv2\" (UniqueName: \"kubernetes.io/projected/c78be730-b987-43d1-b83a-070681fce468-kube-api-access-jtvv2\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.333819 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-utilities\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.334175 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-catalog-content\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.334359 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-utilities\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.355926 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtvv2\" (UniqueName: \"kubernetes.io/projected/c78be730-b987-43d1-b83a-070681fce468-kube-api-access-jtvv2\") pod \"redhat-marketplace-wcfsz\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.434695 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 07:59:56 crc kubenswrapper[4971]: I0320 07:59:56.878715 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcfsz"] Mar 20 07:59:57 crc kubenswrapper[4971]: I0320 07:59:57.430183 4971 generic.go:334] "Generic (PLEG): container finished" podID="c78be730-b987-43d1-b83a-070681fce468" containerID="65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068" exitCode=0 Mar 20 07:59:57 crc kubenswrapper[4971]: I0320 07:59:57.430280 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcfsz" event={"ID":"c78be730-b987-43d1-b83a-070681fce468","Type":"ContainerDied","Data":"65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068"} Mar 20 07:59:57 crc kubenswrapper[4971]: I0320 07:59:57.430717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcfsz" event={"ID":"c78be730-b987-43d1-b83a-070681fce468","Type":"ContainerStarted","Data":"5f66a90b09ca2dcd49c19f82c76ffd5e8b31e3775e821d2aabee9558dce02635"} Mar 20 07:59:58 crc kubenswrapper[4971]: I0320 07:59:58.441539 4971 generic.go:334] "Generic (PLEG): container finished" podID="c78be730-b987-43d1-b83a-070681fce468" containerID="340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68" exitCode=0 Mar 20 07:59:58 crc kubenswrapper[4971]: I0320 07:59:58.441686 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-wcfsz" event={"ID":"c78be730-b987-43d1-b83a-070681fce468","Type":"ContainerDied","Data":"340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68"} Mar 20 07:59:59 crc kubenswrapper[4971]: I0320 07:59:59.451079 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcfsz" event={"ID":"c78be730-b987-43d1-b83a-070681fce468","Type":"ContainerStarted","Data":"01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7"} Mar 20 07:59:59 crc kubenswrapper[4971]: I0320 07:59:59.472423 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wcfsz" podStartSLOduration=2.080930702 podStartE2EDuration="3.472403491s" podCreationTimestamp="2026-03-20 07:59:56 +0000 UTC" firstStartedPulling="2026-03-20 07:59:57.433755928 +0000 UTC m=+4219.413630106" lastFinishedPulling="2026-03-20 07:59:58.825228717 +0000 UTC m=+4220.805102895" observedRunningTime="2026-03-20 07:59:59.471720123 +0000 UTC m=+4221.451594281" watchObservedRunningTime="2026-03-20 07:59:59.472403491 +0000 UTC m=+4221.452277629" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.155754 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566560-jqw9s"] Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.157578 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566560-jqw9s" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.160521 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.161237 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.161974 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.163329 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b"] Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.165738 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.168037 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.168254 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.172386 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566560-jqw9s"] Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.178164 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b"] Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.300550 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnfzb\" (UniqueName: 
\"kubernetes.io/projected/b9d47e8b-b290-4df9-9976-8088a83a8d3a-kube-api-access-tnfzb\") pod \"auto-csr-approver-29566560-jqw9s\" (UID: \"b9d47e8b-b290-4df9-9976-8088a83a8d3a\") " pod="openshift-infra/auto-csr-approver-29566560-jqw9s" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.300711 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7121fb5-1631-4257-8aa2-4e732578fc6f-config-volume\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.300789 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hdll\" (UniqueName: \"kubernetes.io/projected/a7121fb5-1631-4257-8aa2-4e732578fc6f-kube-api-access-8hdll\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.300820 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7121fb5-1631-4257-8aa2-4e732578fc6f-secret-volume\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.401897 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hdll\" (UniqueName: \"kubernetes.io/projected/a7121fb5-1631-4257-8aa2-4e732578fc6f-kube-api-access-8hdll\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" 
Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.401943 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7121fb5-1631-4257-8aa2-4e732578fc6f-secret-volume\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.402001 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnfzb\" (UniqueName: \"kubernetes.io/projected/b9d47e8b-b290-4df9-9976-8088a83a8d3a-kube-api-access-tnfzb\") pod \"auto-csr-approver-29566560-jqw9s\" (UID: \"b9d47e8b-b290-4df9-9976-8088a83a8d3a\") " pod="openshift-infra/auto-csr-approver-29566560-jqw9s" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.402039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7121fb5-1631-4257-8aa2-4e732578fc6f-config-volume\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.402914 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7121fb5-1631-4257-8aa2-4e732578fc6f-config-volume\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.415419 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7121fb5-1631-4257-8aa2-4e732578fc6f-secret-volume\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.421885 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hdll\" (UniqueName: \"kubernetes.io/projected/a7121fb5-1631-4257-8aa2-4e732578fc6f-kube-api-access-8hdll\") pod \"collect-profiles-29566560-qzp8b\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.424315 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnfzb\" (UniqueName: \"kubernetes.io/projected/b9d47e8b-b290-4df9-9976-8088a83a8d3a-kube-api-access-tnfzb\") pod \"auto-csr-approver-29566560-jqw9s\" (UID: \"b9d47e8b-b290-4df9-9976-8088a83a8d3a\") " pod="openshift-infra/auto-csr-approver-29566560-jqw9s" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.481254 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566560-jqw9s" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.491841 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:00 crc kubenswrapper[4971]: I0320 08:00:00.959626 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566560-jqw9s"] Mar 20 08:00:00 crc kubenswrapper[4971]: W0320 08:00:00.963142 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9d47e8b_b290_4df9_9976_8088a83a8d3a.slice/crio-0ea0afaa0a3ae3989e3cc413b6c97522c103ca190a73b0500c33c3555f837774 WatchSource:0}: Error finding container 0ea0afaa0a3ae3989e3cc413b6c97522c103ca190a73b0500c33c3555f837774: Status 404 returned error can't find the container with id 0ea0afaa0a3ae3989e3cc413b6c97522c103ca190a73b0500c33c3555f837774 Mar 20 08:00:01 crc kubenswrapper[4971]: W0320 08:00:01.028066 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7121fb5_1631_4257_8aa2_4e732578fc6f.slice/crio-82b5558b74f0cdf7b8cee9b60d35a2c0347bef1608a85298c143869f0d1f6fe8 WatchSource:0}: Error finding container 82b5558b74f0cdf7b8cee9b60d35a2c0347bef1608a85298c143869f0d1f6fe8: Status 404 returned error can't find the container with id 82b5558b74f0cdf7b8cee9b60d35a2c0347bef1608a85298c143869f0d1f6fe8 Mar 20 08:00:01 crc kubenswrapper[4971]: I0320 08:00:01.031221 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b"] Mar 20 08:00:01 crc kubenswrapper[4971]: I0320 08:00:01.465665 4971 generic.go:334] "Generic (PLEG): container finished" podID="a7121fb5-1631-4257-8aa2-4e732578fc6f" containerID="44cb92331bdc2482cf210d940813ee155283805d84c34d9758ce7de0b24e6a8c" exitCode=0 Mar 20 08:00:01 crc kubenswrapper[4971]: I0320 08:00:01.465720 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" 
event={"ID":"a7121fb5-1631-4257-8aa2-4e732578fc6f","Type":"ContainerDied","Data":"44cb92331bdc2482cf210d940813ee155283805d84c34d9758ce7de0b24e6a8c"} Mar 20 08:00:01 crc kubenswrapper[4971]: I0320 08:00:01.465957 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" event={"ID":"a7121fb5-1631-4257-8aa2-4e732578fc6f","Type":"ContainerStarted","Data":"82b5558b74f0cdf7b8cee9b60d35a2c0347bef1608a85298c143869f0d1f6fe8"} Mar 20 08:00:01 crc kubenswrapper[4971]: I0320 08:00:01.467141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566560-jqw9s" event={"ID":"b9d47e8b-b290-4df9-9976-8088a83a8d3a","Type":"ContainerStarted","Data":"0ea0afaa0a3ae3989e3cc413b6c97522c103ca190a73b0500c33c3555f837774"} Mar 20 08:00:02 crc kubenswrapper[4971]: I0320 08:00:02.834941 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:02 crc kubenswrapper[4971]: I0320 08:00:02.937738 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7121fb5-1631-4257-8aa2-4e732578fc6f-secret-volume\") pod \"a7121fb5-1631-4257-8aa2-4e732578fc6f\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " Mar 20 08:00:02 crc kubenswrapper[4971]: I0320 08:00:02.937788 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hdll\" (UniqueName: \"kubernetes.io/projected/a7121fb5-1631-4257-8aa2-4e732578fc6f-kube-api-access-8hdll\") pod \"a7121fb5-1631-4257-8aa2-4e732578fc6f\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " Mar 20 08:00:02 crc kubenswrapper[4971]: I0320 08:00:02.937880 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a7121fb5-1631-4257-8aa2-4e732578fc6f-config-volume\") pod \"a7121fb5-1631-4257-8aa2-4e732578fc6f\" (UID: \"a7121fb5-1631-4257-8aa2-4e732578fc6f\") " Mar 20 08:00:02 crc kubenswrapper[4971]: I0320 08:00:02.938476 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7121fb5-1631-4257-8aa2-4e732578fc6f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a7121fb5-1631-4257-8aa2-4e732578fc6f" (UID: "a7121fb5-1631-4257-8aa2-4e732578fc6f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:00:02 crc kubenswrapper[4971]: I0320 08:00:02.943286 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7121fb5-1631-4257-8aa2-4e732578fc6f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a7121fb5-1631-4257-8aa2-4e732578fc6f" (UID: "a7121fb5-1631-4257-8aa2-4e732578fc6f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:00:02 crc kubenswrapper[4971]: I0320 08:00:02.943359 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7121fb5-1631-4257-8aa2-4e732578fc6f-kube-api-access-8hdll" (OuterVolumeSpecName: "kube-api-access-8hdll") pod "a7121fb5-1631-4257-8aa2-4e732578fc6f" (UID: "a7121fb5-1631-4257-8aa2-4e732578fc6f"). InnerVolumeSpecName "kube-api-access-8hdll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:00:03 crc kubenswrapper[4971]: I0320 08:00:03.039982 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hdll\" (UniqueName: \"kubernetes.io/projected/a7121fb5-1631-4257-8aa2-4e732578fc6f-kube-api-access-8hdll\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:03 crc kubenswrapper[4971]: I0320 08:00:03.040036 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7121fb5-1631-4257-8aa2-4e732578fc6f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:03 crc kubenswrapper[4971]: I0320 08:00:03.040058 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7121fb5-1631-4257-8aa2-4e732578fc6f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:03 crc kubenswrapper[4971]: I0320 08:00:03.486335 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" event={"ID":"a7121fb5-1631-4257-8aa2-4e732578fc6f","Type":"ContainerDied","Data":"82b5558b74f0cdf7b8cee9b60d35a2c0347bef1608a85298c143869f0d1f6fe8"} Mar 20 08:00:03 crc kubenswrapper[4971]: I0320 08:00:03.486726 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82b5558b74f0cdf7b8cee9b60d35a2c0347bef1608a85298c143869f0d1f6fe8" Mar 20 08:00:03 crc kubenswrapper[4971]: I0320 08:00:03.486431 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b" Mar 20 08:00:03 crc kubenswrapper[4971]: I0320 08:00:03.913645 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"] Mar 20 08:00:03 crc kubenswrapper[4971]: I0320 08:00:03.920862 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-qz9wn"] Mar 20 08:00:04 crc kubenswrapper[4971]: I0320 08:00:04.495497 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9d47e8b-b290-4df9-9976-8088a83a8d3a" containerID="cd951b502f8136b313fb329c92de668c29a003209ecd2b89289b5172729fa207" exitCode=0 Mar 20 08:00:04 crc kubenswrapper[4971]: I0320 08:00:04.495698 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566560-jqw9s" event={"ID":"b9d47e8b-b290-4df9-9976-8088a83a8d3a","Type":"ContainerDied","Data":"cd951b502f8136b313fb329c92de668c29a003209ecd2b89289b5172729fa207"} Mar 20 08:00:04 crc kubenswrapper[4971]: I0320 08:00:04.747227 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4516e191-9ff2-4b88-a20e-66edfa04a8d7" path="/var/lib/kubelet/pods/4516e191-9ff2-4b88-a20e-66edfa04a8d7/volumes" Mar 20 08:00:05 crc kubenswrapper[4971]: I0320 08:00:05.840653 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566560-jqw9s" Mar 20 08:00:05 crc kubenswrapper[4971]: I0320 08:00:05.991274 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnfzb\" (UniqueName: \"kubernetes.io/projected/b9d47e8b-b290-4df9-9976-8088a83a8d3a-kube-api-access-tnfzb\") pod \"b9d47e8b-b290-4df9-9976-8088a83a8d3a\" (UID: \"b9d47e8b-b290-4df9-9976-8088a83a8d3a\") " Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.004321 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d47e8b-b290-4df9-9976-8088a83a8d3a-kube-api-access-tnfzb" (OuterVolumeSpecName: "kube-api-access-tnfzb") pod "b9d47e8b-b290-4df9-9976-8088a83a8d3a" (UID: "b9d47e8b-b290-4df9-9976-8088a83a8d3a"). InnerVolumeSpecName "kube-api-access-tnfzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.093415 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnfzb\" (UniqueName: \"kubernetes.io/projected/b9d47e8b-b290-4df9-9976-8088a83a8d3a-kube-api-access-tnfzb\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.435126 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.435177 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.478117 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.509279 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566560-jqw9s" 
event={"ID":"b9d47e8b-b290-4df9-9976-8088a83a8d3a","Type":"ContainerDied","Data":"0ea0afaa0a3ae3989e3cc413b6c97522c103ca190a73b0500c33c3555f837774"} Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.509523 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ea0afaa0a3ae3989e3cc413b6c97522c103ca190a73b0500c33c3555f837774" Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.509304 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566560-jqw9s" Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.558945 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.730846 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcfsz"] Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.905841 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566554-dcp9x"] Mar 20 08:00:06 crc kubenswrapper[4971]: I0320 08:00:06.912459 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566554-dcp9x"] Mar 20 08:00:08 crc kubenswrapper[4971]: I0320 08:00:08.530229 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wcfsz" podUID="c78be730-b987-43d1-b83a-070681fce468" containerName="registry-server" containerID="cri-o://01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7" gracePeriod=2 Mar 20 08:00:08 crc kubenswrapper[4971]: I0320 08:00:08.748243 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06aea70-0ef4-4194-887f-65c565e47798" path="/var/lib/kubelet/pods/d06aea70-0ef4-4194-887f-65c565e47798/volumes" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.091393 4971 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.247084 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-catalog-content\") pod \"c78be730-b987-43d1-b83a-070681fce468\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.247507 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtvv2\" (UniqueName: \"kubernetes.io/projected/c78be730-b987-43d1-b83a-070681fce468-kube-api-access-jtvv2\") pod \"c78be730-b987-43d1-b83a-070681fce468\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.247573 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-utilities\") pod \"c78be730-b987-43d1-b83a-070681fce468\" (UID: \"c78be730-b987-43d1-b83a-070681fce468\") " Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.249813 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-utilities" (OuterVolumeSpecName: "utilities") pod "c78be730-b987-43d1-b83a-070681fce468" (UID: "c78be730-b987-43d1-b83a-070681fce468"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.256908 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c78be730-b987-43d1-b83a-070681fce468-kube-api-access-jtvv2" (OuterVolumeSpecName: "kube-api-access-jtvv2") pod "c78be730-b987-43d1-b83a-070681fce468" (UID: "c78be730-b987-43d1-b83a-070681fce468"). InnerVolumeSpecName "kube-api-access-jtvv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.294668 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c78be730-b987-43d1-b83a-070681fce468" (UID: "c78be730-b987-43d1-b83a-070681fce468"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.350065 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtvv2\" (UniqueName: \"kubernetes.io/projected/c78be730-b987-43d1-b83a-070681fce468-kube-api-access-jtvv2\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.350122 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.350143 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c78be730-b987-43d1-b83a-070681fce468-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.544043 4971 generic.go:334] "Generic (PLEG): container finished" podID="c78be730-b987-43d1-b83a-070681fce468" containerID="01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7" exitCode=0 Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.544106 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcfsz" event={"ID":"c78be730-b987-43d1-b83a-070681fce468","Type":"ContainerDied","Data":"01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7"} Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.544176 4971 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcfsz" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.544214 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcfsz" event={"ID":"c78be730-b987-43d1-b83a-070681fce468","Type":"ContainerDied","Data":"5f66a90b09ca2dcd49c19f82c76ffd5e8b31e3775e821d2aabee9558dce02635"} Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.544249 4971 scope.go:117] "RemoveContainer" containerID="01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.585482 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcfsz"] Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.585969 4971 scope.go:117] "RemoveContainer" containerID="340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68" Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.595571 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcfsz"] Mar 20 08:00:09 crc kubenswrapper[4971]: I0320 08:00:09.916809 4971 scope.go:117] "RemoveContainer" containerID="65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068" Mar 20 08:00:10 crc kubenswrapper[4971]: I0320 08:00:10.056319 4971 scope.go:117] "RemoveContainer" containerID="01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7" Mar 20 08:00:10 crc kubenswrapper[4971]: E0320 08:00:10.056911 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7\": container with ID starting with 01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7 not found: ID does not exist" containerID="01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7" Mar 20 08:00:10 crc kubenswrapper[4971]: I0320 08:00:10.056974 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7"} err="failed to get container status \"01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7\": rpc error: code = NotFound desc = could not find container \"01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7\": container with ID starting with 01fc8d9203603d7a72442f8974e2201e6057828cef0a6162d850d0997bfae6a7 not found: ID does not exist" Mar 20 08:00:10 crc kubenswrapper[4971]: I0320 08:00:10.056994 4971 scope.go:117] "RemoveContainer" containerID="340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68" Mar 20 08:00:10 crc kubenswrapper[4971]: E0320 08:00:10.057386 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68\": container with ID starting with 340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68 not found: ID does not exist" containerID="340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68" Mar 20 08:00:10 crc kubenswrapper[4971]: I0320 08:00:10.057435 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68"} err="failed to get container status \"340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68\": rpc error: code = NotFound desc = could not find container \"340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68\": container with ID starting with 340edcfd8cade1f5629d2bc21acf4276188298e960b8b50167efe489dbd2bb68 not found: ID does not exist" Mar 20 08:00:10 crc kubenswrapper[4971]: I0320 08:00:10.057468 4971 scope.go:117] "RemoveContainer" containerID="65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068" Mar 20 08:00:10 crc kubenswrapper[4971]: E0320 
08:00:10.057900 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068\": container with ID starting with 65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068 not found: ID does not exist" containerID="65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068" Mar 20 08:00:10 crc kubenswrapper[4971]: I0320 08:00:10.057925 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068"} err="failed to get container status \"65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068\": rpc error: code = NotFound desc = could not find container \"65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068\": container with ID starting with 65bf921e34838252265e485da39456f6f69415aa027656efd1584365ac503068 not found: ID does not exist" Mar 20 08:00:10 crc kubenswrapper[4971]: I0320 08:00:10.744670 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c78be730-b987-43d1-b83a-070681fce468" path="/var/lib/kubelet/pods/c78be730-b987-43d1-b83a-070681fce468/volumes" Mar 20 08:00:20 crc kubenswrapper[4971]: I0320 08:00:20.162663 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:00:20 crc kubenswrapper[4971]: I0320 08:00:20.163088 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.163141 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.164327 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.164416 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.165490 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.165653 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" gracePeriod=600 Mar 20 08:00:50 crc kubenswrapper[4971]: E0320 08:00:50.301068 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.909393 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" exitCode=0 Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.909474 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df"} Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.909566 4971 scope.go:117] "RemoveContainer" containerID="f5757dab665bec4f1034edb8e2c85da8a22c711cf9230d598981b106f2238f62" Mar 20 08:00:50 crc kubenswrapper[4971]: I0320 08:00:50.910250 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:00:50 crc kubenswrapper[4971]: E0320 08:00:50.910731 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:00:56 crc kubenswrapper[4971]: I0320 08:00:56.473080 4971 scope.go:117] "RemoveContainer" containerID="ea87d8d39a379069f57550ef31343612a37996e2f76e0e2acef4d11d633aabe4" Mar 20 08:00:56 crc kubenswrapper[4971]: I0320 08:00:56.614400 4971 scope.go:117] 
"RemoveContainer" containerID="fa3f08d261fa3e16e592eed0b5d13d958e5ad9941a0d19e31fc58faafd2266fb" Mar 20 08:01:04 crc kubenswrapper[4971]: I0320 08:01:04.732698 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:01:04 crc kubenswrapper[4971]: E0320 08:01:04.733820 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:01:15 crc kubenswrapper[4971]: I0320 08:01:15.732905 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:01:15 crc kubenswrapper[4971]: E0320 08:01:15.733718 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:01:30 crc kubenswrapper[4971]: I0320 08:01:30.732087 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:01:30 crc kubenswrapper[4971]: E0320 08:01:30.733188 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:01:45 crc kubenswrapper[4971]: I0320 08:01:45.732529 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:01:45 crc kubenswrapper[4971]: E0320 08:01:45.733559 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:01:59 crc kubenswrapper[4971]: I0320 08:01:59.733304 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:01:59 crc kubenswrapper[4971]: E0320 08:01:59.734107 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.148794 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566562-897kl"] Mar 20 08:02:00 crc kubenswrapper[4971]: E0320 08:02:00.149125 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78be730-b987-43d1-b83a-070681fce468" containerName="extract-utilities" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.149140 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78be730-b987-43d1-b83a-070681fce468" 
containerName="extract-utilities" Mar 20 08:02:00 crc kubenswrapper[4971]: E0320 08:02:00.149162 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7121fb5-1631-4257-8aa2-4e732578fc6f" containerName="collect-profiles" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.149171 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7121fb5-1631-4257-8aa2-4e732578fc6f" containerName="collect-profiles" Mar 20 08:02:00 crc kubenswrapper[4971]: E0320 08:02:00.149195 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d47e8b-b290-4df9-9976-8088a83a8d3a" containerName="oc" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.149209 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d47e8b-b290-4df9-9976-8088a83a8d3a" containerName="oc" Mar 20 08:02:00 crc kubenswrapper[4971]: E0320 08:02:00.149225 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78be730-b987-43d1-b83a-070681fce468" containerName="extract-content" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.149233 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78be730-b987-43d1-b83a-070681fce468" containerName="extract-content" Mar 20 08:02:00 crc kubenswrapper[4971]: E0320 08:02:00.149243 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78be730-b987-43d1-b83a-070681fce468" containerName="registry-server" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.149251 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78be730-b987-43d1-b83a-070681fce468" containerName="registry-server" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.149434 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c78be730-b987-43d1-b83a-070681fce468" containerName="registry-server" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.149459 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7121fb5-1631-4257-8aa2-4e732578fc6f" 
containerName="collect-profiles" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.149480 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9d47e8b-b290-4df9-9976-8088a83a8d3a" containerName="oc" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.150093 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566562-897kl" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.155116 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.155580 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.161725 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566562-897kl"] Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.164353 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.180128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58pb\" (UniqueName: \"kubernetes.io/projected/bf9b4c24-f51a-4b84-ade2-1aee21f0fb99-kube-api-access-r58pb\") pod \"auto-csr-approver-29566562-897kl\" (UID: \"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99\") " pod="openshift-infra/auto-csr-approver-29566562-897kl" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.281539 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58pb\" (UniqueName: \"kubernetes.io/projected/bf9b4c24-f51a-4b84-ade2-1aee21f0fb99-kube-api-access-r58pb\") pod \"auto-csr-approver-29566562-897kl\" (UID: \"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99\") " pod="openshift-infra/auto-csr-approver-29566562-897kl" Mar 20 08:02:00 crc 
kubenswrapper[4971]: I0320 08:02:00.698441 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58pb\" (UniqueName: \"kubernetes.io/projected/bf9b4c24-f51a-4b84-ade2-1aee21f0fb99-kube-api-access-r58pb\") pod \"auto-csr-approver-29566562-897kl\" (UID: \"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99\") " pod="openshift-infra/auto-csr-approver-29566562-897kl" Mar 20 08:02:00 crc kubenswrapper[4971]: I0320 08:02:00.772041 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566562-897kl" Mar 20 08:02:01 crc kubenswrapper[4971]: I0320 08:02:01.174476 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566562-897kl"] Mar 20 08:02:01 crc kubenswrapper[4971]: I0320 08:02:01.513532 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566562-897kl" event={"ID":"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99","Type":"ContainerStarted","Data":"b2a54611e2b059230037453f1a950d1338772e57ea650fef46a374552bf45b1d"} Mar 20 08:02:02 crc kubenswrapper[4971]: I0320 08:02:02.520471 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566562-897kl" event={"ID":"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99","Type":"ContainerStarted","Data":"712e391ef54a2c9105a9ac6e9655cd6befae581436a7056985c03b934a768bee"} Mar 20 08:02:02 crc kubenswrapper[4971]: I0320 08:02:02.541336 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566562-897kl" podStartSLOduration=1.759139773 podStartE2EDuration="2.541316953s" podCreationTimestamp="2026-03-20 08:02:00 +0000 UTC" firstStartedPulling="2026-03-20 08:02:01.182047376 +0000 UTC m=+4343.161921514" lastFinishedPulling="2026-03-20 08:02:01.964224546 +0000 UTC m=+4343.944098694" observedRunningTime="2026-03-20 08:02:02.536088237 +0000 UTC m=+4344.515962375" watchObservedRunningTime="2026-03-20 
08:02:02.541316953 +0000 UTC m=+4344.521191091" Mar 20 08:02:03 crc kubenswrapper[4971]: I0320 08:02:03.533239 4971 generic.go:334] "Generic (PLEG): container finished" podID="bf9b4c24-f51a-4b84-ade2-1aee21f0fb99" containerID="712e391ef54a2c9105a9ac6e9655cd6befae581436a7056985c03b934a768bee" exitCode=0 Mar 20 08:02:03 crc kubenswrapper[4971]: I0320 08:02:03.533306 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566562-897kl" event={"ID":"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99","Type":"ContainerDied","Data":"712e391ef54a2c9105a9ac6e9655cd6befae581436a7056985c03b934a768bee"} Mar 20 08:02:04 crc kubenswrapper[4971]: I0320 08:02:04.876243 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566562-897kl" Mar 20 08:02:04 crc kubenswrapper[4971]: I0320 08:02:04.958185 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r58pb\" (UniqueName: \"kubernetes.io/projected/bf9b4c24-f51a-4b84-ade2-1aee21f0fb99-kube-api-access-r58pb\") pod \"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99\" (UID: \"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99\") " Mar 20 08:02:04 crc kubenswrapper[4971]: I0320 08:02:04.968767 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9b4c24-f51a-4b84-ade2-1aee21f0fb99-kube-api-access-r58pb" (OuterVolumeSpecName: "kube-api-access-r58pb") pod "bf9b4c24-f51a-4b84-ade2-1aee21f0fb99" (UID: "bf9b4c24-f51a-4b84-ade2-1aee21f0fb99"). InnerVolumeSpecName "kube-api-access-r58pb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:02:05 crc kubenswrapper[4971]: I0320 08:02:05.059709 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r58pb\" (UniqueName: \"kubernetes.io/projected/bf9b4c24-f51a-4b84-ade2-1aee21f0fb99-kube-api-access-r58pb\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:05 crc kubenswrapper[4971]: I0320 08:02:05.551786 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566562-897kl" event={"ID":"bf9b4c24-f51a-4b84-ade2-1aee21f0fb99","Type":"ContainerDied","Data":"b2a54611e2b059230037453f1a950d1338772e57ea650fef46a374552bf45b1d"} Mar 20 08:02:05 crc kubenswrapper[4971]: I0320 08:02:05.551826 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2a54611e2b059230037453f1a950d1338772e57ea650fef46a374552bf45b1d" Mar 20 08:02:05 crc kubenswrapper[4971]: I0320 08:02:05.551960 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566562-897kl" Mar 20 08:02:05 crc kubenswrapper[4971]: I0320 08:02:05.626415 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566556-86xkz"] Mar 20 08:02:05 crc kubenswrapper[4971]: I0320 08:02:05.635375 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566556-86xkz"] Mar 20 08:02:06 crc kubenswrapper[4971]: I0320 08:02:06.744866 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e6a816-98cf-45fa-9e19-f0d440f373df" path="/var/lib/kubelet/pods/55e6a816-98cf-45fa-9e19-f0d440f373df/volumes" Mar 20 08:02:13 crc kubenswrapper[4971]: I0320 08:02:13.732469 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:02:13 crc kubenswrapper[4971]: E0320 08:02:13.733222 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:02:24 crc kubenswrapper[4971]: I0320 08:02:24.732879 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:02:24 crc kubenswrapper[4971]: E0320 08:02:24.733859 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.091848 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qg7w"] Mar 20 08:02:28 crc kubenswrapper[4971]: E0320 08:02:28.092977 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9b4c24-f51a-4b84-ade2-1aee21f0fb99" containerName="oc" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.093007 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9b4c24-f51a-4b84-ade2-1aee21f0fb99" containerName="oc" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.093372 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9b4c24-f51a-4b84-ade2-1aee21f0fb99" containerName="oc" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.095777 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.116657 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qg7w"] Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.210357 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-utilities\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.210554 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-catalog-content\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.210668 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6dk\" (UniqueName: \"kubernetes.io/projected/af993f9e-5348-4fae-9c77-8c3651b5c306-kube-api-access-ss6dk\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.312631 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-utilities\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.312708 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-catalog-content\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.312737 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6dk\" (UniqueName: \"kubernetes.io/projected/af993f9e-5348-4fae-9c77-8c3651b5c306-kube-api-access-ss6dk\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.313521 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-utilities\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.313581 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-catalog-content\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.344016 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6dk\" (UniqueName: \"kubernetes.io/projected/af993f9e-5348-4fae-9c77-8c3651b5c306-kube-api-access-ss6dk\") pod \"certified-operators-9qg7w\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.435261 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:28 crc kubenswrapper[4971]: I0320 08:02:28.916932 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qg7w"] Mar 20 08:02:29 crc kubenswrapper[4971]: I0320 08:02:29.781160 4971 generic.go:334] "Generic (PLEG): container finished" podID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerID="dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26" exitCode=0 Mar 20 08:02:29 crc kubenswrapper[4971]: I0320 08:02:29.781491 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qg7w" event={"ID":"af993f9e-5348-4fae-9c77-8c3651b5c306","Type":"ContainerDied","Data":"dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26"} Mar 20 08:02:29 crc kubenswrapper[4971]: I0320 08:02:29.781527 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qg7w" event={"ID":"af993f9e-5348-4fae-9c77-8c3651b5c306","Type":"ContainerStarted","Data":"748230686e1382288bdcdbc40feed4054e742c148c6dd66a97c53b6d53649756"} Mar 20 08:02:31 crc kubenswrapper[4971]: I0320 08:02:31.810995 4971 generic.go:334] "Generic (PLEG): container finished" podID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerID="2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd" exitCode=0 Mar 20 08:02:31 crc kubenswrapper[4971]: I0320 08:02:31.811064 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qg7w" event={"ID":"af993f9e-5348-4fae-9c77-8c3651b5c306","Type":"ContainerDied","Data":"2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd"} Mar 20 08:02:32 crc kubenswrapper[4971]: I0320 08:02:32.819273 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qg7w" 
event={"ID":"af993f9e-5348-4fae-9c77-8c3651b5c306","Type":"ContainerStarted","Data":"bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c"} Mar 20 08:02:32 crc kubenswrapper[4971]: I0320 08:02:32.839337 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qg7w" podStartSLOduration=2.371999297 podStartE2EDuration="4.839312413s" podCreationTimestamp="2026-03-20 08:02:28 +0000 UTC" firstStartedPulling="2026-03-20 08:02:29.78298113 +0000 UTC m=+4371.762855268" lastFinishedPulling="2026-03-20 08:02:32.250294236 +0000 UTC m=+4374.230168384" observedRunningTime="2026-03-20 08:02:32.83765607 +0000 UTC m=+4374.817530218" watchObservedRunningTime="2026-03-20 08:02:32.839312413 +0000 UTC m=+4374.819186571" Mar 20 08:02:35 crc kubenswrapper[4971]: I0320 08:02:35.732832 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:02:35 crc kubenswrapper[4971]: E0320 08:02:35.733505 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:02:38 crc kubenswrapper[4971]: I0320 08:02:38.437692 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:38 crc kubenswrapper[4971]: I0320 08:02:38.438040 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:38 crc kubenswrapper[4971]: I0320 08:02:38.516286 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:38 crc kubenswrapper[4971]: I0320 08:02:38.954345 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:39 crc kubenswrapper[4971]: I0320 08:02:39.282128 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qg7w"] Mar 20 08:02:40 crc kubenswrapper[4971]: I0320 08:02:40.897529 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qg7w" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerName="registry-server" containerID="cri-o://bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c" gracePeriod=2 Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.507833 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.544827 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-utilities\") pod \"af993f9e-5348-4fae-9c77-8c3651b5c306\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.544913 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-catalog-content\") pod \"af993f9e-5348-4fae-9c77-8c3651b5c306\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.544954 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss6dk\" (UniqueName: \"kubernetes.io/projected/af993f9e-5348-4fae-9c77-8c3651b5c306-kube-api-access-ss6dk\") pod 
\"af993f9e-5348-4fae-9c77-8c3651b5c306\" (UID: \"af993f9e-5348-4fae-9c77-8c3651b5c306\") " Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.552417 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af993f9e-5348-4fae-9c77-8c3651b5c306-kube-api-access-ss6dk" (OuterVolumeSpecName: "kube-api-access-ss6dk") pod "af993f9e-5348-4fae-9c77-8c3651b5c306" (UID: "af993f9e-5348-4fae-9c77-8c3651b5c306"). InnerVolumeSpecName "kube-api-access-ss6dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.555361 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-utilities" (OuterVolumeSpecName: "utilities") pod "af993f9e-5348-4fae-9c77-8c3651b5c306" (UID: "af993f9e-5348-4fae-9c77-8c3651b5c306"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.630666 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af993f9e-5348-4fae-9c77-8c3651b5c306" (UID: "af993f9e-5348-4fae-9c77-8c3651b5c306"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.648696 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss6dk\" (UniqueName: \"kubernetes.io/projected/af993f9e-5348-4fae-9c77-8c3651b5c306-kube-api-access-ss6dk\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.648757 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.648769 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af993f9e-5348-4fae-9c77-8c3651b5c306-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.918516 4971 generic.go:334] "Generic (PLEG): container finished" podID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerID="bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c" exitCode=0 Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.918651 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qg7w" event={"ID":"af993f9e-5348-4fae-9c77-8c3651b5c306","Type":"ContainerDied","Data":"bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c"} Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.919254 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qg7w" event={"ID":"af993f9e-5348-4fae-9c77-8c3651b5c306","Type":"ContainerDied","Data":"748230686e1382288bdcdbc40feed4054e742c148c6dd66a97c53b6d53649756"} Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.918671 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qg7w" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.919328 4971 scope.go:117] "RemoveContainer" containerID="bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.946184 4971 scope.go:117] "RemoveContainer" containerID="2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.949234 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qg7w"] Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.955506 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qg7w"] Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.964863 4971 scope.go:117] "RemoveContainer" containerID="dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.985755 4971 scope.go:117] "RemoveContainer" containerID="bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c" Mar 20 08:02:42 crc kubenswrapper[4971]: E0320 08:02:42.986265 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c\": container with ID starting with bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c not found: ID does not exist" containerID="bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.986306 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c"} err="failed to get container status \"bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c\": rpc error: code = NotFound desc = could not find 
container \"bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c\": container with ID starting with bab3e345bd152f2641ae68b7a1cae9560cb375249fd10b981b0cf2b763ec929c not found: ID does not exist" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.986331 4971 scope.go:117] "RemoveContainer" containerID="2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd" Mar 20 08:02:42 crc kubenswrapper[4971]: E0320 08:02:42.986648 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd\": container with ID starting with 2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd not found: ID does not exist" containerID="2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.986698 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd"} err="failed to get container status \"2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd\": rpc error: code = NotFound desc = could not find container \"2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd\": container with ID starting with 2b8f6fd421aed3ab35ae2a24a61a113f0e36ddca8f22c272bffda8850a6ffdcd not found: ID does not exist" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.986733 4971 scope.go:117] "RemoveContainer" containerID="dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26" Mar 20 08:02:42 crc kubenswrapper[4971]: E0320 08:02:42.987275 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26\": container with ID starting with dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26 not found: ID does 
not exist" containerID="dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26" Mar 20 08:02:42 crc kubenswrapper[4971]: I0320 08:02:42.987302 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26"} err="failed to get container status \"dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26\": rpc error: code = NotFound desc = could not find container \"dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26\": container with ID starting with dd5b5c87d1b8cbbc49a5bd99326a62482502d46df300523aed15422ce7154c26 not found: ID does not exist" Mar 20 08:02:44 crc kubenswrapper[4971]: I0320 08:02:44.749250 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" path="/var/lib/kubelet/pods/af993f9e-5348-4fae-9c77-8c3651b5c306/volumes" Mar 20 08:02:49 crc kubenswrapper[4971]: I0320 08:02:49.733728 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:02:49 crc kubenswrapper[4971]: E0320 08:02:49.734879 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:02:56 crc kubenswrapper[4971]: I0320 08:02:56.730430 4971 scope.go:117] "RemoveContainer" containerID="ef584511ec2bbb26d04d5fd61cea156875f31577dab89aa46b8bf386cf5605e3" Mar 20 08:03:01 crc kubenswrapper[4971]: I0320 08:03:01.733173 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:03:01 crc 
kubenswrapper[4971]: E0320 08:03:01.733829 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:03:13 crc kubenswrapper[4971]: I0320 08:03:13.732222 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:03:13 crc kubenswrapper[4971]: E0320 08:03:13.733402 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:03:25 crc kubenswrapper[4971]: I0320 08:03:25.732503 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:03:25 crc kubenswrapper[4971]: E0320 08:03:25.733655 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:03:40 crc kubenswrapper[4971]: I0320 08:03:40.732557 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 
20 08:03:40 crc kubenswrapper[4971]: E0320 08:03:40.733773 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:03:52 crc kubenswrapper[4971]: I0320 08:03:52.732951 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:03:52 crc kubenswrapper[4971]: E0320 08:03:52.734042 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.158860 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566564-kcmwv"] Mar 20 08:04:00 crc kubenswrapper[4971]: E0320 08:04:00.159697 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerName="extract-content" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.159713 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerName="extract-content" Mar 20 08:04:00 crc kubenswrapper[4971]: E0320 08:04:00.159735 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerName="registry-server" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.159742 
4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerName="registry-server" Mar 20 08:04:00 crc kubenswrapper[4971]: E0320 08:04:00.159757 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerName="extract-utilities" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.159765 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerName="extract-utilities" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.159934 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="af993f9e-5348-4fae-9c77-8c3651b5c306" containerName="registry-server" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.160542 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566564-kcmwv" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.163843 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.163963 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.164096 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.169388 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566564-kcmwv"] Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.266288 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/bda8c113-0844-4495-bdbb-9aad30b74950-kube-api-access-6pqqt\") pod \"auto-csr-approver-29566564-kcmwv\" (UID: 
\"bda8c113-0844-4495-bdbb-9aad30b74950\") " pod="openshift-infra/auto-csr-approver-29566564-kcmwv" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.367827 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/bda8c113-0844-4495-bdbb-9aad30b74950-kube-api-access-6pqqt\") pod \"auto-csr-approver-29566564-kcmwv\" (UID: \"bda8c113-0844-4495-bdbb-9aad30b74950\") " pod="openshift-infra/auto-csr-approver-29566564-kcmwv" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.407579 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/bda8c113-0844-4495-bdbb-9aad30b74950-kube-api-access-6pqqt\") pod \"auto-csr-approver-29566564-kcmwv\" (UID: \"bda8c113-0844-4495-bdbb-9aad30b74950\") " pod="openshift-infra/auto-csr-approver-29566564-kcmwv" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.485879 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566564-kcmwv" Mar 20 08:04:00 crc kubenswrapper[4971]: I0320 08:04:00.996734 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566564-kcmwv"] Mar 20 08:04:00 crc kubenswrapper[4971]: W0320 08:04:00.998104 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbda8c113_0844_4495_bdbb_9aad30b74950.slice/crio-1cd791daf2cfa371dc99b3a70bf1a775110d012fac335fae0dacf563f6ff1ddd WatchSource:0}: Error finding container 1cd791daf2cfa371dc99b3a70bf1a775110d012fac335fae0dacf563f6ff1ddd: Status 404 returned error can't find the container with id 1cd791daf2cfa371dc99b3a70bf1a775110d012fac335fae0dacf563f6ff1ddd Mar 20 08:04:01 crc kubenswrapper[4971]: I0320 08:04:01.003419 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:04:01 crc kubenswrapper[4971]: I0320 08:04:01.688971 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566564-kcmwv" event={"ID":"bda8c113-0844-4495-bdbb-9aad30b74950","Type":"ContainerStarted","Data":"1cd791daf2cfa371dc99b3a70bf1a775110d012fac335fae0dacf563f6ff1ddd"} Mar 20 08:04:02 crc kubenswrapper[4971]: I0320 08:04:02.701408 4971 generic.go:334] "Generic (PLEG): container finished" podID="bda8c113-0844-4495-bdbb-9aad30b74950" containerID="56f57d649a8da81224432f1e422f1a4e613ca8557ca9ee54b0dbe79d871ab1d3" exitCode=0 Mar 20 08:04:02 crc kubenswrapper[4971]: I0320 08:04:02.701487 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566564-kcmwv" event={"ID":"bda8c113-0844-4495-bdbb-9aad30b74950","Type":"ContainerDied","Data":"56f57d649a8da81224432f1e422f1a4e613ca8557ca9ee54b0dbe79d871ab1d3"} Mar 20 08:04:04 crc kubenswrapper[4971]: I0320 08:04:04.049886 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566564-kcmwv" Mar 20 08:04:04 crc kubenswrapper[4971]: I0320 08:04:04.136404 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/bda8c113-0844-4495-bdbb-9aad30b74950-kube-api-access-6pqqt\") pod \"bda8c113-0844-4495-bdbb-9aad30b74950\" (UID: \"bda8c113-0844-4495-bdbb-9aad30b74950\") " Mar 20 08:04:04 crc kubenswrapper[4971]: I0320 08:04:04.143853 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda8c113-0844-4495-bdbb-9aad30b74950-kube-api-access-6pqqt" (OuterVolumeSpecName: "kube-api-access-6pqqt") pod "bda8c113-0844-4495-bdbb-9aad30b74950" (UID: "bda8c113-0844-4495-bdbb-9aad30b74950"). InnerVolumeSpecName "kube-api-access-6pqqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:04:04 crc kubenswrapper[4971]: I0320 08:04:04.238215 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pqqt\" (UniqueName: \"kubernetes.io/projected/bda8c113-0844-4495-bdbb-9aad30b74950-kube-api-access-6pqqt\") on node \"crc\" DevicePath \"\"" Mar 20 08:04:04 crc kubenswrapper[4971]: I0320 08:04:04.721304 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566564-kcmwv" event={"ID":"bda8c113-0844-4495-bdbb-9aad30b74950","Type":"ContainerDied","Data":"1cd791daf2cfa371dc99b3a70bf1a775110d012fac335fae0dacf563f6ff1ddd"} Mar 20 08:04:04 crc kubenswrapper[4971]: I0320 08:04:04.721366 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cd791daf2cfa371dc99b3a70bf1a775110d012fac335fae0dacf563f6ff1ddd" Mar 20 08:04:04 crc kubenswrapper[4971]: I0320 08:04:04.721450 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566564-kcmwv" Mar 20 08:04:04 crc kubenswrapper[4971]: I0320 08:04:04.736273 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:04:04 crc kubenswrapper[4971]: E0320 08:04:04.736522 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:04:05 crc kubenswrapper[4971]: I0320 08:04:05.148286 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566558-26qdx"] Mar 20 08:04:05 crc kubenswrapper[4971]: I0320 08:04:05.156634 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566558-26qdx"] Mar 20 08:04:06 crc kubenswrapper[4971]: I0320 08:04:06.741628 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a55658af-2656-49f8-9a8f-b3de1a23d3c5" path="/var/lib/kubelet/pods/a55658af-2656-49f8-9a8f-b3de1a23d3c5/volumes" Mar 20 08:04:16 crc kubenswrapper[4971]: I0320 08:04:16.733413 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:04:16 crc kubenswrapper[4971]: E0320 08:04:16.734449 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:04:27 crc kubenswrapper[4971]: I0320 08:04:27.732863 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:04:27 crc kubenswrapper[4971]: E0320 08:04:27.733590 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:04:42 crc kubenswrapper[4971]: I0320 08:04:42.732819 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:04:42 crc kubenswrapper[4971]: E0320 08:04:42.733800 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:04:54 crc kubenswrapper[4971]: I0320 08:04:54.733196 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:04:54 crc kubenswrapper[4971]: E0320 08:04:54.734418 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:04:56 crc kubenswrapper[4971]: I0320 08:04:56.868632 4971 scope.go:117] "RemoveContainer" containerID="29444ac3093ce735c03486014240a4542acca8aeebf1b6cccc7fda03ff4e528f" Mar 20 08:05:06 crc kubenswrapper[4971]: I0320 08:05:06.732511 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:05:06 crc kubenswrapper[4971]: E0320 08:05:06.734793 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:05:21 crc kubenswrapper[4971]: I0320 08:05:21.732415 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:05:21 crc kubenswrapper[4971]: E0320 08:05:21.734710 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:05:32 crc kubenswrapper[4971]: I0320 08:05:32.733267 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:05:32 crc kubenswrapper[4971]: E0320 08:05:32.734582 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:05:45 crc kubenswrapper[4971]: I0320 08:05:45.732679 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:05:45 crc kubenswrapper[4971]: E0320 08:05:45.733445 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:05:56 crc kubenswrapper[4971]: I0320 08:05:56.733527 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:05:57 crc kubenswrapper[4971]: I0320 08:05:57.796532 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"9202ff95fa9a8e5fda31c183fc82833d745eca16c84e697454145e650520e248"} Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.140100 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566566-bfdf5"] Mar 20 08:06:00 crc kubenswrapper[4971]: E0320 08:06:00.141131 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda8c113-0844-4495-bdbb-9aad30b74950" containerName="oc" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.141160 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bda8c113-0844-4495-bdbb-9aad30b74950" containerName="oc" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.141390 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda8c113-0844-4495-bdbb-9aad30b74950" containerName="oc" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.141967 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566566-bfdf5" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.144597 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.144852 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.150430 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.153597 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566566-bfdf5"] Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.303403 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2ndw\" (UniqueName: \"kubernetes.io/projected/657f8543-eddf-4b34-949d-004f4389cb59-kube-api-access-v2ndw\") pod \"auto-csr-approver-29566566-bfdf5\" (UID: \"657f8543-eddf-4b34-949d-004f4389cb59\") " pod="openshift-infra/auto-csr-approver-29566566-bfdf5" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.404911 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2ndw\" (UniqueName: \"kubernetes.io/projected/657f8543-eddf-4b34-949d-004f4389cb59-kube-api-access-v2ndw\") pod \"auto-csr-approver-29566566-bfdf5\" (UID: \"657f8543-eddf-4b34-949d-004f4389cb59\") " 
pod="openshift-infra/auto-csr-approver-29566566-bfdf5" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.424697 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2ndw\" (UniqueName: \"kubernetes.io/projected/657f8543-eddf-4b34-949d-004f4389cb59-kube-api-access-v2ndw\") pod \"auto-csr-approver-29566566-bfdf5\" (UID: \"657f8543-eddf-4b34-949d-004f4389cb59\") " pod="openshift-infra/auto-csr-approver-29566566-bfdf5" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.460134 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566566-bfdf5" Mar 20 08:06:00 crc kubenswrapper[4971]: I0320 08:06:00.906851 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566566-bfdf5"] Mar 20 08:06:01 crc kubenswrapper[4971]: I0320 08:06:01.846155 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566566-bfdf5" event={"ID":"657f8543-eddf-4b34-949d-004f4389cb59","Type":"ContainerStarted","Data":"21023a035ace7889ddc9e0cc0d35beceeb153e6eed5d624e0e10861967be8049"} Mar 20 08:06:02 crc kubenswrapper[4971]: I0320 08:06:02.867858 4971 generic.go:334] "Generic (PLEG): container finished" podID="657f8543-eddf-4b34-949d-004f4389cb59" containerID="2c5aebd9c44ccedbf00cbcee1f139a25591f9affeba87963f84b82839babcf58" exitCode=0 Mar 20 08:06:02 crc kubenswrapper[4971]: I0320 08:06:02.868054 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566566-bfdf5" event={"ID":"657f8543-eddf-4b34-949d-004f4389cb59","Type":"ContainerDied","Data":"2c5aebd9c44ccedbf00cbcee1f139a25591f9affeba87963f84b82839babcf58"} Mar 20 08:06:04 crc kubenswrapper[4971]: I0320 08:06:04.258970 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566566-bfdf5" Mar 20 08:06:04 crc kubenswrapper[4971]: I0320 08:06:04.275659 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2ndw\" (UniqueName: \"kubernetes.io/projected/657f8543-eddf-4b34-949d-004f4389cb59-kube-api-access-v2ndw\") pod \"657f8543-eddf-4b34-949d-004f4389cb59\" (UID: \"657f8543-eddf-4b34-949d-004f4389cb59\") " Mar 20 08:06:04 crc kubenswrapper[4971]: I0320 08:06:04.282156 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657f8543-eddf-4b34-949d-004f4389cb59-kube-api-access-v2ndw" (OuterVolumeSpecName: "kube-api-access-v2ndw") pod "657f8543-eddf-4b34-949d-004f4389cb59" (UID: "657f8543-eddf-4b34-949d-004f4389cb59"). InnerVolumeSpecName "kube-api-access-v2ndw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:06:04 crc kubenswrapper[4971]: I0320 08:06:04.376801 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2ndw\" (UniqueName: \"kubernetes.io/projected/657f8543-eddf-4b34-949d-004f4389cb59-kube-api-access-v2ndw\") on node \"crc\" DevicePath \"\"" Mar 20 08:06:04 crc kubenswrapper[4971]: I0320 08:06:04.891869 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566566-bfdf5" event={"ID":"657f8543-eddf-4b34-949d-004f4389cb59","Type":"ContainerDied","Data":"21023a035ace7889ddc9e0cc0d35beceeb153e6eed5d624e0e10861967be8049"} Mar 20 08:06:04 crc kubenswrapper[4971]: I0320 08:06:04.892191 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21023a035ace7889ddc9e0cc0d35beceeb153e6eed5d624e0e10861967be8049" Mar 20 08:06:04 crc kubenswrapper[4971]: I0320 08:06:04.892137 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566566-bfdf5" Mar 20 08:06:05 crc kubenswrapper[4971]: I0320 08:06:05.363094 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566560-jqw9s"] Mar 20 08:06:05 crc kubenswrapper[4971]: I0320 08:06:05.376790 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566560-jqw9s"] Mar 20 08:06:06 crc kubenswrapper[4971]: I0320 08:06:06.748587 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9d47e8b-b290-4df9-9976-8088a83a8d3a" path="/var/lib/kubelet/pods/b9d47e8b-b290-4df9-9976-8088a83a8d3a/volumes" Mar 20 08:06:56 crc kubenswrapper[4971]: I0320 08:06:56.978520 4971 scope.go:117] "RemoveContainer" containerID="cd951b502f8136b313fb329c92de668c29a003209ecd2b89289b5172729fa207" Mar 20 08:07:29 crc kubenswrapper[4971]: I0320 08:07:29.780452 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pcjs8"] Mar 20 08:07:29 crc kubenswrapper[4971]: E0320 08:07:29.781531 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657f8543-eddf-4b34-949d-004f4389cb59" containerName="oc" Mar 20 08:07:29 crc kubenswrapper[4971]: I0320 08:07:29.781557 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="657f8543-eddf-4b34-949d-004f4389cb59" containerName="oc" Mar 20 08:07:29 crc kubenswrapper[4971]: I0320 08:07:29.782128 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="657f8543-eddf-4b34-949d-004f4389cb59" containerName="oc" Mar 20 08:07:29 crc kubenswrapper[4971]: I0320 08:07:29.785070 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:29 crc kubenswrapper[4971]: I0320 08:07:29.798538 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pcjs8"] Mar 20 08:07:29 crc kubenswrapper[4971]: I0320 08:07:29.899100 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-catalog-content\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:29 crc kubenswrapper[4971]: I0320 08:07:29.899145 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zhwk\" (UniqueName: \"kubernetes.io/projected/df4065d6-7b9a-4210-943f-43c6c49b1b96-kube-api-access-4zhwk\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:29 crc kubenswrapper[4971]: I0320 08:07:29.899261 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-utilities\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.000669 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-utilities\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.000728 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-catalog-content\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.000772 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zhwk\" (UniqueName: \"kubernetes.io/projected/df4065d6-7b9a-4210-943f-43c6c49b1b96-kube-api-access-4zhwk\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.001483 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-utilities\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.001631 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-catalog-content\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.023865 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zhwk\" (UniqueName: \"kubernetes.io/projected/df4065d6-7b9a-4210-943f-43c6c49b1b96-kube-api-access-4zhwk\") pod \"redhat-operators-pcjs8\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.111123 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.539585 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pcjs8"] Mar 20 08:07:30 crc kubenswrapper[4971]: I0320 08:07:30.592082 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcjs8" event={"ID":"df4065d6-7b9a-4210-943f-43c6c49b1b96","Type":"ContainerStarted","Data":"8b60513a512754d0cd9d2ecfd5a22c3752d3b0ca5089eb62c6655cc6461fa25f"} Mar 20 08:07:31 crc kubenswrapper[4971]: I0320 08:07:31.600373 4971 generic.go:334] "Generic (PLEG): container finished" podID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerID="7fd0e6d563ae05c6e5804d74419f0de94f02d404d2c82cc35c637f3d6b955113" exitCode=0 Mar 20 08:07:31 crc kubenswrapper[4971]: I0320 08:07:31.600429 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcjs8" event={"ID":"df4065d6-7b9a-4210-943f-43c6c49b1b96","Type":"ContainerDied","Data":"7fd0e6d563ae05c6e5804d74419f0de94f02d404d2c82cc35c637f3d6b955113"} Mar 20 08:07:32 crc kubenswrapper[4971]: I0320 08:07:32.615488 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcjs8" event={"ID":"df4065d6-7b9a-4210-943f-43c6c49b1b96","Type":"ContainerStarted","Data":"b12bb0c44634635197b3793b48adf2f8ae581ad6e7fe1692d2bdcff7c6789f4c"} Mar 20 08:07:33 crc kubenswrapper[4971]: I0320 08:07:33.628649 4971 generic.go:334] "Generic (PLEG): container finished" podID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerID="b12bb0c44634635197b3793b48adf2f8ae581ad6e7fe1692d2bdcff7c6789f4c" exitCode=0 Mar 20 08:07:33 crc kubenswrapper[4971]: I0320 08:07:33.628784 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcjs8" 
event={"ID":"df4065d6-7b9a-4210-943f-43c6c49b1b96","Type":"ContainerDied","Data":"b12bb0c44634635197b3793b48adf2f8ae581ad6e7fe1692d2bdcff7c6789f4c"} Mar 20 08:07:34 crc kubenswrapper[4971]: I0320 08:07:34.646197 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcjs8" event={"ID":"df4065d6-7b9a-4210-943f-43c6c49b1b96","Type":"ContainerStarted","Data":"413dff47da07833f718b6628e184531cada80396d484e6df01d556a2853c6110"} Mar 20 08:07:34 crc kubenswrapper[4971]: I0320 08:07:34.685947 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pcjs8" podStartSLOduration=3.122972715 podStartE2EDuration="5.685921783s" podCreationTimestamp="2026-03-20 08:07:29 +0000 UTC" firstStartedPulling="2026-03-20 08:07:31.60195189 +0000 UTC m=+4673.581826028" lastFinishedPulling="2026-03-20 08:07:34.164900958 +0000 UTC m=+4676.144775096" observedRunningTime="2026-03-20 08:07:34.67580978 +0000 UTC m=+4676.655683958" watchObservedRunningTime="2026-03-20 08:07:34.685921783 +0000 UTC m=+4676.665795951" Mar 20 08:07:40 crc kubenswrapper[4971]: I0320 08:07:40.111586 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:40 crc kubenswrapper[4971]: I0320 08:07:40.112134 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:41 crc kubenswrapper[4971]: I0320 08:07:41.152581 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pcjs8" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="registry-server" probeResult="failure" output=< Mar 20 08:07:41 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 08:07:41 crc kubenswrapper[4971]: > Mar 20 08:07:50 crc kubenswrapper[4971]: I0320 08:07:50.159709 4971 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:50 crc kubenswrapper[4971]: I0320 08:07:50.217903 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:50 crc kubenswrapper[4971]: I0320 08:07:50.407661 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pcjs8"] Mar 20 08:07:51 crc kubenswrapper[4971]: I0320 08:07:51.800947 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pcjs8" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="registry-server" containerID="cri-o://413dff47da07833f718b6628e184531cada80396d484e6df01d556a2853c6110" gracePeriod=2 Mar 20 08:07:52 crc kubenswrapper[4971]: I0320 08:07:52.814499 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcjs8" event={"ID":"df4065d6-7b9a-4210-943f-43c6c49b1b96","Type":"ContainerDied","Data":"413dff47da07833f718b6628e184531cada80396d484e6df01d556a2853c6110"} Mar 20 08:07:52 crc kubenswrapper[4971]: I0320 08:07:52.814500 4971 generic.go:334] "Generic (PLEG): container finished" podID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerID="413dff47da07833f718b6628e184531cada80396d484e6df01d556a2853c6110" exitCode=0 Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.244454 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.425601 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-utilities\") pod \"df4065d6-7b9a-4210-943f-43c6c49b1b96\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.425783 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zhwk\" (UniqueName: \"kubernetes.io/projected/df4065d6-7b9a-4210-943f-43c6c49b1b96-kube-api-access-4zhwk\") pod \"df4065d6-7b9a-4210-943f-43c6c49b1b96\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.425886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-catalog-content\") pod \"df4065d6-7b9a-4210-943f-43c6c49b1b96\" (UID: \"df4065d6-7b9a-4210-943f-43c6c49b1b96\") " Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.427701 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-utilities" (OuterVolumeSpecName: "utilities") pod "df4065d6-7b9a-4210-943f-43c6c49b1b96" (UID: "df4065d6-7b9a-4210-943f-43c6c49b1b96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.435069 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4065d6-7b9a-4210-943f-43c6c49b1b96-kube-api-access-4zhwk" (OuterVolumeSpecName: "kube-api-access-4zhwk") pod "df4065d6-7b9a-4210-943f-43c6c49b1b96" (UID: "df4065d6-7b9a-4210-943f-43c6c49b1b96"). InnerVolumeSpecName "kube-api-access-4zhwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.527199 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.527237 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zhwk\" (UniqueName: \"kubernetes.io/projected/df4065d6-7b9a-4210-943f-43c6c49b1b96-kube-api-access-4zhwk\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.655825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df4065d6-7b9a-4210-943f-43c6c49b1b96" (UID: "df4065d6-7b9a-4210-943f-43c6c49b1b96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.730178 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df4065d6-7b9a-4210-943f-43c6c49b1b96-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.826983 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcjs8" event={"ID":"df4065d6-7b9a-4210-943f-43c6c49b1b96","Type":"ContainerDied","Data":"8b60513a512754d0cd9d2ecfd5a22c3752d3b0ca5089eb62c6655cc6461fa25f"} Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.827093 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pcjs8" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.827269 4971 scope.go:117] "RemoveContainer" containerID="413dff47da07833f718b6628e184531cada80396d484e6df01d556a2853c6110" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.851957 4971 scope.go:117] "RemoveContainer" containerID="b12bb0c44634635197b3793b48adf2f8ae581ad6e7fe1692d2bdcff7c6789f4c" Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.871569 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pcjs8"] Mar 20 08:07:53 crc kubenswrapper[4971]: I0320 08:07:53.877721 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pcjs8"] Mar 20 08:07:54 crc kubenswrapper[4971]: I0320 08:07:54.020643 4971 scope.go:117] "RemoveContainer" containerID="7fd0e6d563ae05c6e5804d74419f0de94f02d404d2c82cc35c637f3d6b955113" Mar 20 08:07:54 crc kubenswrapper[4971]: I0320 08:07:54.748320 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" path="/var/lib/kubelet/pods/df4065d6-7b9a-4210-943f-43c6c49b1b96/volumes" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.152399 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566568-kc7rk"] Mar 20 08:08:00 crc kubenswrapper[4971]: E0320 08:08:00.153223 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="extract-utilities" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.153236 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="extract-utilities" Mar 20 08:08:00 crc kubenswrapper[4971]: E0320 08:08:00.153243 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="registry-server" Mar 20 08:08:00 crc 
kubenswrapper[4971]: I0320 08:08:00.153249 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="registry-server" Mar 20 08:08:00 crc kubenswrapper[4971]: E0320 08:08:00.153272 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="extract-content" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.153278 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="extract-content" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.153409 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4065d6-7b9a-4210-943f-43c6c49b1b96" containerName="registry-server" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.153876 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566568-kc7rk" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.155839 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.156109 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.156560 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.163076 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566568-kc7rk"] Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.342787 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqpnr\" (UniqueName: \"kubernetes.io/projected/70920290-7006-4bc5-bf90-c440861b4155-kube-api-access-wqpnr\") pod 
\"auto-csr-approver-29566568-kc7rk\" (UID: \"70920290-7006-4bc5-bf90-c440861b4155\") " pod="openshift-infra/auto-csr-approver-29566568-kc7rk" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.444050 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqpnr\" (UniqueName: \"kubernetes.io/projected/70920290-7006-4bc5-bf90-c440861b4155-kube-api-access-wqpnr\") pod \"auto-csr-approver-29566568-kc7rk\" (UID: \"70920290-7006-4bc5-bf90-c440861b4155\") " pod="openshift-infra/auto-csr-approver-29566568-kc7rk" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.473575 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqpnr\" (UniqueName: \"kubernetes.io/projected/70920290-7006-4bc5-bf90-c440861b4155-kube-api-access-wqpnr\") pod \"auto-csr-approver-29566568-kc7rk\" (UID: \"70920290-7006-4bc5-bf90-c440861b4155\") " pod="openshift-infra/auto-csr-approver-29566568-kc7rk" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.493496 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566568-kc7rk" Mar 20 08:08:00 crc kubenswrapper[4971]: I0320 08:08:00.940708 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566568-kc7rk"] Mar 20 08:08:00 crc kubenswrapper[4971]: W0320 08:08:00.950179 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70920290_7006_4bc5_bf90_c440861b4155.slice/crio-4d4f8870837d012cc877486f21c99c924aaf2df8ad381c7c560c3c8ce7e5906e WatchSource:0}: Error finding container 4d4f8870837d012cc877486f21c99c924aaf2df8ad381c7c560c3c8ce7e5906e: Status 404 returned error can't find the container with id 4d4f8870837d012cc877486f21c99c924aaf2df8ad381c7c560c3c8ce7e5906e Mar 20 08:08:01 crc kubenswrapper[4971]: I0320 08:08:01.925870 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566568-kc7rk" event={"ID":"70920290-7006-4bc5-bf90-c440861b4155","Type":"ContainerStarted","Data":"4d4f8870837d012cc877486f21c99c924aaf2df8ad381c7c560c3c8ce7e5906e"} Mar 20 08:08:02 crc kubenswrapper[4971]: I0320 08:08:02.939766 4971 generic.go:334] "Generic (PLEG): container finished" podID="70920290-7006-4bc5-bf90-c440861b4155" containerID="2e791ab94042326d81f115d12d410c88c93da2ceca499194bbdc0d023ba031cc" exitCode=0 Mar 20 08:08:02 crc kubenswrapper[4971]: I0320 08:08:02.940194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566568-kc7rk" event={"ID":"70920290-7006-4bc5-bf90-c440861b4155","Type":"ContainerDied","Data":"2e791ab94042326d81f115d12d410c88c93da2ceca499194bbdc0d023ba031cc"} Mar 20 08:08:04 crc kubenswrapper[4971]: I0320 08:08:04.266099 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566568-kc7rk" Mar 20 08:08:04 crc kubenswrapper[4971]: I0320 08:08:04.407089 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqpnr\" (UniqueName: \"kubernetes.io/projected/70920290-7006-4bc5-bf90-c440861b4155-kube-api-access-wqpnr\") pod \"70920290-7006-4bc5-bf90-c440861b4155\" (UID: \"70920290-7006-4bc5-bf90-c440861b4155\") " Mar 20 08:08:04 crc kubenswrapper[4971]: I0320 08:08:04.412648 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70920290-7006-4bc5-bf90-c440861b4155-kube-api-access-wqpnr" (OuterVolumeSpecName: "kube-api-access-wqpnr") pod "70920290-7006-4bc5-bf90-c440861b4155" (UID: "70920290-7006-4bc5-bf90-c440861b4155"). InnerVolumeSpecName "kube-api-access-wqpnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:08:04 crc kubenswrapper[4971]: I0320 08:08:04.509656 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqpnr\" (UniqueName: \"kubernetes.io/projected/70920290-7006-4bc5-bf90-c440861b4155-kube-api-access-wqpnr\") on node \"crc\" DevicePath \"\"" Mar 20 08:08:04 crc kubenswrapper[4971]: I0320 08:08:04.959840 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566568-kc7rk" event={"ID":"70920290-7006-4bc5-bf90-c440861b4155","Type":"ContainerDied","Data":"4d4f8870837d012cc877486f21c99c924aaf2df8ad381c7c560c3c8ce7e5906e"} Mar 20 08:08:04 crc kubenswrapper[4971]: I0320 08:08:04.959897 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d4f8870837d012cc877486f21c99c924aaf2df8ad381c7c560c3c8ce7e5906e" Mar 20 08:08:04 crc kubenswrapper[4971]: I0320 08:08:04.960353 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566568-kc7rk" Mar 20 08:08:05 crc kubenswrapper[4971]: I0320 08:08:05.365405 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566562-897kl"] Mar 20 08:08:05 crc kubenswrapper[4971]: I0320 08:08:05.377806 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566562-897kl"] Mar 20 08:08:06 crc kubenswrapper[4971]: I0320 08:08:06.746944 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf9b4c24-f51a-4b84-ade2-1aee21f0fb99" path="/var/lib/kubelet/pods/bf9b4c24-f51a-4b84-ade2-1aee21f0fb99/volumes" Mar 20 08:08:20 crc kubenswrapper[4971]: I0320 08:08:20.162063 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:08:20 crc kubenswrapper[4971]: I0320 08:08:20.162839 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:08:50 crc kubenswrapper[4971]: I0320 08:08:50.163056 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:08:50 crc kubenswrapper[4971]: I0320 08:08:50.163893 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.432109 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nmdwl"] Mar 20 08:08:54 crc kubenswrapper[4971]: E0320 08:08:54.433300 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70920290-7006-4bc5-bf90-c440861b4155" containerName="oc" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.433324 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="70920290-7006-4bc5-bf90-c440861b4155" containerName="oc" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.433651 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="70920290-7006-4bc5-bf90-c440861b4155" containerName="oc" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.442956 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.460712 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmdwl"] Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.531543 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-utilities\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.531623 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4zpl\" (UniqueName: \"kubernetes.io/projected/6edd1939-7a2c-40fc-b2e8-557e9f999310-kube-api-access-l4zpl\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.531681 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-catalog-content\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.632647 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-catalog-content\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.632775 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-utilities\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.632818 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4zpl\" (UniqueName: \"kubernetes.io/projected/6edd1939-7a2c-40fc-b2e8-557e9f999310-kube-api-access-l4zpl\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.633507 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-catalog-content\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.633676 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-utilities\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.659782 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4zpl\" (UniqueName: \"kubernetes.io/projected/6edd1939-7a2c-40fc-b2e8-557e9f999310-kube-api-access-l4zpl\") pod \"community-operators-nmdwl\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:54 crc kubenswrapper[4971]: I0320 08:08:54.793477 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:08:55 crc kubenswrapper[4971]: I0320 08:08:55.302489 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmdwl"] Mar 20 08:08:55 crc kubenswrapper[4971]: I0320 08:08:55.445263 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmdwl" event={"ID":"6edd1939-7a2c-40fc-b2e8-557e9f999310","Type":"ContainerStarted","Data":"4982f6124a5245057d10061447510babe022b99ee26f94963c9bb3ccc467cc53"} Mar 20 08:08:56 crc kubenswrapper[4971]: I0320 08:08:56.459193 4971 generic.go:334] "Generic (PLEG): container finished" podID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerID="da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d" exitCode=0 Mar 20 08:08:56 crc kubenswrapper[4971]: I0320 08:08:56.459257 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmdwl" event={"ID":"6edd1939-7a2c-40fc-b2e8-557e9f999310","Type":"ContainerDied","Data":"da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d"} Mar 20 08:08:57 crc kubenswrapper[4971]: I0320 08:08:57.098484 4971 scope.go:117] "RemoveContainer" containerID="712e391ef54a2c9105a9ac6e9655cd6befae581436a7056985c03b934a768bee" Mar 20 08:08:57 crc kubenswrapper[4971]: I0320 08:08:57.470180 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmdwl" event={"ID":"6edd1939-7a2c-40fc-b2e8-557e9f999310","Type":"ContainerStarted","Data":"6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542"} Mar 20 08:08:58 crc kubenswrapper[4971]: I0320 08:08:58.483356 4971 generic.go:334] "Generic (PLEG): container finished" podID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerID="6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542" exitCode=0 Mar 20 08:08:58 crc kubenswrapper[4971]: I0320 08:08:58.483480 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmdwl" event={"ID":"6edd1939-7a2c-40fc-b2e8-557e9f999310","Type":"ContainerDied","Data":"6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542"} Mar 20 08:08:59 crc kubenswrapper[4971]: I0320 08:08:59.504655 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmdwl" event={"ID":"6edd1939-7a2c-40fc-b2e8-557e9f999310","Type":"ContainerStarted","Data":"ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7"} Mar 20 08:08:59 crc kubenswrapper[4971]: I0320 08:08:59.534989 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nmdwl" podStartSLOduration=2.876868759 podStartE2EDuration="5.534967387s" podCreationTimestamp="2026-03-20 08:08:54 +0000 UTC" firstStartedPulling="2026-03-20 08:08:56.464436493 +0000 UTC m=+4758.444310671" lastFinishedPulling="2026-03-20 08:08:59.122535131 +0000 UTC m=+4761.102409299" observedRunningTime="2026-03-20 08:08:59.527345129 +0000 UTC m=+4761.507219297" watchObservedRunningTime="2026-03-20 08:08:59.534967387 +0000 UTC m=+4761.514841535" Mar 20 08:09:04 crc kubenswrapper[4971]: I0320 08:09:04.793826 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:09:04 crc kubenswrapper[4971]: I0320 08:09:04.795008 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:09:04 crc kubenswrapper[4971]: I0320 08:09:04.845936 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:09:05 crc kubenswrapper[4971]: I0320 08:09:05.621660 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:09:05 crc kubenswrapper[4971]: 
I0320 08:09:05.688982 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nmdwl"] Mar 20 08:09:07 crc kubenswrapper[4971]: I0320 08:09:07.573786 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nmdwl" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerName="registry-server" containerID="cri-o://ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7" gracePeriod=2 Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.035486 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.165050 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4zpl\" (UniqueName: \"kubernetes.io/projected/6edd1939-7a2c-40fc-b2e8-557e9f999310-kube-api-access-l4zpl\") pod \"6edd1939-7a2c-40fc-b2e8-557e9f999310\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.165197 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-utilities\") pod \"6edd1939-7a2c-40fc-b2e8-557e9f999310\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.165310 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-catalog-content\") pod \"6edd1939-7a2c-40fc-b2e8-557e9f999310\" (UID: \"6edd1939-7a2c-40fc-b2e8-557e9f999310\") " Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.166697 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-utilities" 
(OuterVolumeSpecName: "utilities") pod "6edd1939-7a2c-40fc-b2e8-557e9f999310" (UID: "6edd1939-7a2c-40fc-b2e8-557e9f999310"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.174927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edd1939-7a2c-40fc-b2e8-557e9f999310-kube-api-access-l4zpl" (OuterVolumeSpecName: "kube-api-access-l4zpl") pod "6edd1939-7a2c-40fc-b2e8-557e9f999310" (UID: "6edd1939-7a2c-40fc-b2e8-557e9f999310"). InnerVolumeSpecName "kube-api-access-l4zpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.248276 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6edd1939-7a2c-40fc-b2e8-557e9f999310" (UID: "6edd1939-7a2c-40fc-b2e8-557e9f999310"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.268297 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4zpl\" (UniqueName: \"kubernetes.io/projected/6edd1939-7a2c-40fc-b2e8-557e9f999310-kube-api-access-l4zpl\") on node \"crc\" DevicePath \"\"" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.268347 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.268366 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6edd1939-7a2c-40fc-b2e8-557e9f999310-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.588584 4971 generic.go:334] "Generic (PLEG): container finished" podID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerID="ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7" exitCode=0 Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.588722 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmdwl" event={"ID":"6edd1939-7a2c-40fc-b2e8-557e9f999310","Type":"ContainerDied","Data":"ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7"} Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.588729 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmdwl" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.588764 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmdwl" event={"ID":"6edd1939-7a2c-40fc-b2e8-557e9f999310","Type":"ContainerDied","Data":"4982f6124a5245057d10061447510babe022b99ee26f94963c9bb3ccc467cc53"} Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.588796 4971 scope.go:117] "RemoveContainer" containerID="ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.621216 4971 scope.go:117] "RemoveContainer" containerID="6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.657672 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nmdwl"] Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.664262 4971 scope.go:117] "RemoveContainer" containerID="da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.670988 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nmdwl"] Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.694546 4971 scope.go:117] "RemoveContainer" containerID="ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7" Mar 20 08:09:08 crc kubenswrapper[4971]: E0320 08:09:08.695549 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7\": container with ID starting with ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7 not found: ID does not exist" containerID="ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.695635 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7"} err="failed to get container status \"ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7\": rpc error: code = NotFound desc = could not find container \"ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7\": container with ID starting with ae5026f1a5e44da2e54652c488d507312fd592fcbb713898790af88c6460a3f7 not found: ID does not exist" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.695675 4971 scope.go:117] "RemoveContainer" containerID="6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542" Mar 20 08:09:08 crc kubenswrapper[4971]: E0320 08:09:08.696297 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542\": container with ID starting with 6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542 not found: ID does not exist" containerID="6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.696436 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542"} err="failed to get container status \"6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542\": rpc error: code = NotFound desc = could not find container \"6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542\": container with ID starting with 6986ea593bf8ef2d2aa64274415430fba4b91eb9019532cce27a0b3291702542 not found: ID does not exist" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.696531 4971 scope.go:117] "RemoveContainer" containerID="da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d" Mar 20 08:09:08 crc kubenswrapper[4971]: E0320 
08:09:08.697192 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d\": container with ID starting with da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d not found: ID does not exist" containerID="da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.697259 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d"} err="failed to get container status \"da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d\": rpc error: code = NotFound desc = could not find container \"da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d\": container with ID starting with da8a355e597d004ca8d3979b9342cc4fa2c7b3c99d4df2a813856d1afb21b51d not found: ID does not exist" Mar 20 08:09:08 crc kubenswrapper[4971]: I0320 08:09:08.748392 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" path="/var/lib/kubelet/pods/6edd1939-7a2c-40fc-b2e8-557e9f999310/volumes" Mar 20 08:09:20 crc kubenswrapper[4971]: I0320 08:09:20.162991 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:09:20 crc kubenswrapper[4971]: I0320 08:09:20.163542 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 08:09:20 crc kubenswrapper[4971]: I0320 08:09:20.163656 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:09:20 crc kubenswrapper[4971]: I0320 08:09:20.164488 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9202ff95fa9a8e5fda31c183fc82833d745eca16c84e697454145e650520e248"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:09:20 crc kubenswrapper[4971]: I0320 08:09:20.164586 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://9202ff95fa9a8e5fda31c183fc82833d745eca16c84e697454145e650520e248" gracePeriod=600 Mar 20 08:09:20 crc kubenswrapper[4971]: I0320 08:09:20.710186 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="9202ff95fa9a8e5fda31c183fc82833d745eca16c84e697454145e650520e248" exitCode=0 Mar 20 08:09:20 crc kubenswrapper[4971]: I0320 08:09:20.710241 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"9202ff95fa9a8e5fda31c183fc82833d745eca16c84e697454145e650520e248"} Mar 20 08:09:20 crc kubenswrapper[4971]: I0320 08:09:20.710547 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"} Mar 20 08:09:20 crc 
kubenswrapper[4971]: I0320 08:09:20.710577 4971 scope.go:117] "RemoveContainer" containerID="4a80cd36cbda07810e83e10d6f5e9bdd6015f9ed1688173c120cf0268aeac5df" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.136078 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566570-xfsqv"] Mar 20 08:10:00 crc kubenswrapper[4971]: E0320 08:10:00.137388 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerName="extract-utilities" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.137407 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerName="extract-utilities" Mar 20 08:10:00 crc kubenswrapper[4971]: E0320 08:10:00.137429 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerName="extract-content" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.137438 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerName="extract-content" Mar 20 08:10:00 crc kubenswrapper[4971]: E0320 08:10:00.137449 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerName="registry-server" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.137458 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerName="registry-server" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.137633 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6edd1939-7a2c-40fc-b2e8-557e9f999310" containerName="registry-server" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.138227 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566570-xfsqv" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.140513 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.140560 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.141050 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.142476 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566570-xfsqv"] Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.240997 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttn49\" (UniqueName: \"kubernetes.io/projected/21ac11e3-ae71-4a58-835c-33f6c6daea7d-kube-api-access-ttn49\") pod \"auto-csr-approver-29566570-xfsqv\" (UID: \"21ac11e3-ae71-4a58-835c-33f6c6daea7d\") " pod="openshift-infra/auto-csr-approver-29566570-xfsqv" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.341897 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttn49\" (UniqueName: \"kubernetes.io/projected/21ac11e3-ae71-4a58-835c-33f6c6daea7d-kube-api-access-ttn49\") pod \"auto-csr-approver-29566570-xfsqv\" (UID: \"21ac11e3-ae71-4a58-835c-33f6c6daea7d\") " pod="openshift-infra/auto-csr-approver-29566570-xfsqv" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.371328 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttn49\" (UniqueName: \"kubernetes.io/projected/21ac11e3-ae71-4a58-835c-33f6c6daea7d-kube-api-access-ttn49\") pod \"auto-csr-approver-29566570-xfsqv\" (UID: \"21ac11e3-ae71-4a58-835c-33f6c6daea7d\") " 
pod="openshift-infra/auto-csr-approver-29566570-xfsqv" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.458693 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566570-xfsqv" Mar 20 08:10:00 crc kubenswrapper[4971]: I0320 08:10:00.922906 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566570-xfsqv"] Mar 20 08:10:01 crc kubenswrapper[4971]: I0320 08:10:01.106056 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:10:02 crc kubenswrapper[4971]: I0320 08:10:02.091079 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566570-xfsqv" event={"ID":"21ac11e3-ae71-4a58-835c-33f6c6daea7d","Type":"ContainerStarted","Data":"3ac1b16c9aa1c6aaf8d455794d28654a8b0596e8ea9e5bef0636e882f74ea095"} Mar 20 08:10:03 crc kubenswrapper[4971]: I0320 08:10:03.105183 4971 generic.go:334] "Generic (PLEG): container finished" podID="21ac11e3-ae71-4a58-835c-33f6c6daea7d" containerID="c3f87ac67a2bbb23159e36340bc8f0b3f42303696c046155cd9fbcbf656f85dc" exitCode=0 Mar 20 08:10:03 crc kubenswrapper[4971]: I0320 08:10:03.105373 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566570-xfsqv" event={"ID":"21ac11e3-ae71-4a58-835c-33f6c6daea7d","Type":"ContainerDied","Data":"c3f87ac67a2bbb23159e36340bc8f0b3f42303696c046155cd9fbcbf656f85dc"} Mar 20 08:10:04 crc kubenswrapper[4971]: I0320 08:10:04.520582 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566570-xfsqv" Mar 20 08:10:04 crc kubenswrapper[4971]: I0320 08:10:04.639618 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttn49\" (UniqueName: \"kubernetes.io/projected/21ac11e3-ae71-4a58-835c-33f6c6daea7d-kube-api-access-ttn49\") pod \"21ac11e3-ae71-4a58-835c-33f6c6daea7d\" (UID: \"21ac11e3-ae71-4a58-835c-33f6c6daea7d\") " Mar 20 08:10:04 crc kubenswrapper[4971]: I0320 08:10:04.648897 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ac11e3-ae71-4a58-835c-33f6c6daea7d-kube-api-access-ttn49" (OuterVolumeSpecName: "kube-api-access-ttn49") pod "21ac11e3-ae71-4a58-835c-33f6c6daea7d" (UID: "21ac11e3-ae71-4a58-835c-33f6c6daea7d"). InnerVolumeSpecName "kube-api-access-ttn49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:10:04 crc kubenswrapper[4971]: I0320 08:10:04.741121 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttn49\" (UniqueName: \"kubernetes.io/projected/21ac11e3-ae71-4a58-835c-33f6c6daea7d-kube-api-access-ttn49\") on node \"crc\" DevicePath \"\"" Mar 20 08:10:05 crc kubenswrapper[4971]: I0320 08:10:05.127476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566570-xfsqv" event={"ID":"21ac11e3-ae71-4a58-835c-33f6c6daea7d","Type":"ContainerDied","Data":"3ac1b16c9aa1c6aaf8d455794d28654a8b0596e8ea9e5bef0636e882f74ea095"} Mar 20 08:10:05 crc kubenswrapper[4971]: I0320 08:10:05.127545 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ac1b16c9aa1c6aaf8d455794d28654a8b0596e8ea9e5bef0636e882f74ea095" Mar 20 08:10:05 crc kubenswrapper[4971]: I0320 08:10:05.127585 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566570-xfsqv" Mar 20 08:10:05 crc kubenswrapper[4971]: I0320 08:10:05.623191 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566564-kcmwv"] Mar 20 08:10:05 crc kubenswrapper[4971]: I0320 08:10:05.628636 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566564-kcmwv"] Mar 20 08:10:06 crc kubenswrapper[4971]: I0320 08:10:06.748658 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda8c113-0844-4495-bdbb-9aad30b74950" path="/var/lib/kubelet/pods/bda8c113-0844-4495-bdbb-9aad30b74950/volumes" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.426667 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pj8c8"] Mar 20 08:10:45 crc kubenswrapper[4971]: E0320 08:10:45.430311 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ac11e3-ae71-4a58-835c-33f6c6daea7d" containerName="oc" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.430579 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ac11e3-ae71-4a58-835c-33f6c6daea7d" containerName="oc" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.431350 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ac11e3-ae71-4a58-835c-33f6c6daea7d" containerName="oc" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.434038 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.452950 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj8c8"] Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.529130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-utilities\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.529219 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-catalog-content\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.529280 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss956\" (UniqueName: \"kubernetes.io/projected/4671b5ae-44e2-4252-8aa4-fcfed1133d86-kube-api-access-ss956\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.630075 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss956\" (UniqueName: \"kubernetes.io/projected/4671b5ae-44e2-4252-8aa4-fcfed1133d86-kube-api-access-ss956\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.630142 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-utilities\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.630191 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-catalog-content\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.630680 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-catalog-content\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.631925 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-utilities\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.664223 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss956\" (UniqueName: \"kubernetes.io/projected/4671b5ae-44e2-4252-8aa4-fcfed1133d86-kube-api-access-ss956\") pod \"redhat-marketplace-pj8c8\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:45 crc kubenswrapper[4971]: I0320 08:10:45.757626 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:46 crc kubenswrapper[4971]: I0320 08:10:46.240922 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj8c8"] Mar 20 08:10:46 crc kubenswrapper[4971]: I0320 08:10:46.501319 4971 generic.go:334] "Generic (PLEG): container finished" podID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerID="6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3" exitCode=0 Mar 20 08:10:46 crc kubenswrapper[4971]: I0320 08:10:46.501368 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj8c8" event={"ID":"4671b5ae-44e2-4252-8aa4-fcfed1133d86","Type":"ContainerDied","Data":"6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3"} Mar 20 08:10:46 crc kubenswrapper[4971]: I0320 08:10:46.501420 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj8c8" event={"ID":"4671b5ae-44e2-4252-8aa4-fcfed1133d86","Type":"ContainerStarted","Data":"eed6c4e9950f42bbc97888402f309b752d8178c10b7fe0ab898002fb7009d8e8"} Mar 20 08:10:48 crc kubenswrapper[4971]: I0320 08:10:48.519337 4971 generic.go:334] "Generic (PLEG): container finished" podID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerID="75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9" exitCode=0 Mar 20 08:10:48 crc kubenswrapper[4971]: I0320 08:10:48.519420 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj8c8" event={"ID":"4671b5ae-44e2-4252-8aa4-fcfed1133d86","Type":"ContainerDied","Data":"75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9"} Mar 20 08:10:49 crc kubenswrapper[4971]: I0320 08:10:49.532339 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj8c8" 
event={"ID":"4671b5ae-44e2-4252-8aa4-fcfed1133d86","Type":"ContainerStarted","Data":"cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd"} Mar 20 08:10:49 crc kubenswrapper[4971]: I0320 08:10:49.557191 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pj8c8" podStartSLOduration=2.099877446 podStartE2EDuration="4.557175222s" podCreationTimestamp="2026-03-20 08:10:45 +0000 UTC" firstStartedPulling="2026-03-20 08:10:46.502984054 +0000 UTC m=+4868.482858192" lastFinishedPulling="2026-03-20 08:10:48.96028182 +0000 UTC m=+4870.940155968" observedRunningTime="2026-03-20 08:10:49.554363448 +0000 UTC m=+4871.534237586" watchObservedRunningTime="2026-03-20 08:10:49.557175222 +0000 UTC m=+4871.537049360" Mar 20 08:10:55 crc kubenswrapper[4971]: I0320 08:10:55.758519 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:55 crc kubenswrapper[4971]: I0320 08:10:55.759103 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:55 crc kubenswrapper[4971]: I0320 08:10:55.836120 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:56 crc kubenswrapper[4971]: I0320 08:10:56.668477 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:56 crc kubenswrapper[4971]: I0320 08:10:56.726402 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj8c8"] Mar 20 08:10:57 crc kubenswrapper[4971]: I0320 08:10:57.248003 4971 scope.go:117] "RemoveContainer" containerID="56f57d649a8da81224432f1e422f1a4e613ca8557ca9ee54b0dbe79d871ab1d3" Mar 20 08:10:58 crc kubenswrapper[4971]: I0320 08:10:58.614160 4971 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-pj8c8" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerName="registry-server" containerID="cri-o://cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd" gracePeriod=2 Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.130294 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.190748 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss956\" (UniqueName: \"kubernetes.io/projected/4671b5ae-44e2-4252-8aa4-fcfed1133d86-kube-api-access-ss956\") pod \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.190903 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-catalog-content\") pod \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.190992 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-utilities\") pod \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\" (UID: \"4671b5ae-44e2-4252-8aa4-fcfed1133d86\") " Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.193482 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-utilities" (OuterVolumeSpecName: "utilities") pod "4671b5ae-44e2-4252-8aa4-fcfed1133d86" (UID: "4671b5ae-44e2-4252-8aa4-fcfed1133d86"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.200267 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4671b5ae-44e2-4252-8aa4-fcfed1133d86-kube-api-access-ss956" (OuterVolumeSpecName: "kube-api-access-ss956") pod "4671b5ae-44e2-4252-8aa4-fcfed1133d86" (UID: "4671b5ae-44e2-4252-8aa4-fcfed1133d86"). InnerVolumeSpecName "kube-api-access-ss956". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.241347 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4671b5ae-44e2-4252-8aa4-fcfed1133d86" (UID: "4671b5ae-44e2-4252-8aa4-fcfed1133d86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.294124 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.295948 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss956\" (UniqueName: \"kubernetes.io/projected/4671b5ae-44e2-4252-8aa4-fcfed1133d86-kube-api-access-ss956\") on node \"crc\" DevicePath \"\"" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.296092 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4671b5ae-44e2-4252-8aa4-fcfed1133d86-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.627904 4971 generic.go:334] "Generic (PLEG): container finished" podID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" 
containerID="cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd" exitCode=0 Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.627970 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj8c8" event={"ID":"4671b5ae-44e2-4252-8aa4-fcfed1133d86","Type":"ContainerDied","Data":"cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd"} Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.628025 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj8c8" event={"ID":"4671b5ae-44e2-4252-8aa4-fcfed1133d86","Type":"ContainerDied","Data":"eed6c4e9950f42bbc97888402f309b752d8178c10b7fe0ab898002fb7009d8e8"} Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.628060 4971 scope.go:117] "RemoveContainer" containerID="cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.628057 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj8c8" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.660690 4971 scope.go:117] "RemoveContainer" containerID="75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.682516 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj8c8"] Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.692246 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj8c8"] Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.714996 4971 scope.go:117] "RemoveContainer" containerID="6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.744331 4971 scope.go:117] "RemoveContainer" containerID="cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd" Mar 20 08:10:59 crc kubenswrapper[4971]: E0320 08:10:59.745332 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd\": container with ID starting with cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd not found: ID does not exist" containerID="cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.745389 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd"} err="failed to get container status \"cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd\": rpc error: code = NotFound desc = could not find container \"cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd\": container with ID starting with cb2532b286d14ec117eab17ed55ffa3646c08db0846e69ada9ad7c98055bafcd not found: 
ID does not exist" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.745481 4971 scope.go:117] "RemoveContainer" containerID="75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9" Mar 20 08:10:59 crc kubenswrapper[4971]: E0320 08:10:59.746033 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9\": container with ID starting with 75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9 not found: ID does not exist" containerID="75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.746114 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9"} err="failed to get container status \"75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9\": rpc error: code = NotFound desc = could not find container \"75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9\": container with ID starting with 75be08bf56bdb5b7931f1e28e47ca3b2413f27e8f20f3437c7479147371257a9 not found: ID does not exist" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.746192 4971 scope.go:117] "RemoveContainer" containerID="6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3" Mar 20 08:10:59 crc kubenswrapper[4971]: E0320 08:10:59.746670 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3\": container with ID starting with 6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3 not found: ID does not exist" containerID="6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3" Mar 20 08:10:59 crc kubenswrapper[4971]: I0320 08:10:59.746758 4971 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3"} err="failed to get container status \"6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3\": rpc error: code = NotFound desc = could not find container \"6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3\": container with ID starting with 6465f12dfedce66f57d957d62097e40b5b2347f81dd9049b17372a0e40d9d4a3 not found: ID does not exist" Mar 20 08:11:00 crc kubenswrapper[4971]: I0320 08:11:00.742119 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" path="/var/lib/kubelet/pods/4671b5ae-44e2-4252-8aa4-fcfed1133d86/volumes" Mar 20 08:11:20 crc kubenswrapper[4971]: I0320 08:11:20.162242 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:11:20 crc kubenswrapper[4971]: I0320 08:11:20.163873 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:11:50 crc kubenswrapper[4971]: I0320 08:11:50.163184 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:11:50 crc kubenswrapper[4971]: I0320 08:11:50.163996 4971 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.158515 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566572-vcfhv"] Mar 20 08:12:00 crc kubenswrapper[4971]: E0320 08:12:00.159701 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerName="registry-server" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.159720 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerName="registry-server" Mar 20 08:12:00 crc kubenswrapper[4971]: E0320 08:12:00.159745 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerName="extract-utilities" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.159755 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerName="extract-utilities" Mar 20 08:12:00 crc kubenswrapper[4971]: E0320 08:12:00.159806 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerName="extract-content" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.159816 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerName="extract-content" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.160174 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4671b5ae-44e2-4252-8aa4-fcfed1133d86" containerName="registry-server" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.162303 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566572-vcfhv" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.170510 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.170773 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.171044 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.182369 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566572-vcfhv"] Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.356364 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qczlp\" (UniqueName: \"kubernetes.io/projected/003ba1c3-2c55-4f11-bcdd-9a62917d1ce9-kube-api-access-qczlp\") pod \"auto-csr-approver-29566572-vcfhv\" (UID: \"003ba1c3-2c55-4f11-bcdd-9a62917d1ce9\") " pod="openshift-infra/auto-csr-approver-29566572-vcfhv" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.457600 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qczlp\" (UniqueName: \"kubernetes.io/projected/003ba1c3-2c55-4f11-bcdd-9a62917d1ce9-kube-api-access-qczlp\") pod \"auto-csr-approver-29566572-vcfhv\" (UID: \"003ba1c3-2c55-4f11-bcdd-9a62917d1ce9\") " pod="openshift-infra/auto-csr-approver-29566572-vcfhv" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.491489 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qczlp\" (UniqueName: \"kubernetes.io/projected/003ba1c3-2c55-4f11-bcdd-9a62917d1ce9-kube-api-access-qczlp\") pod \"auto-csr-approver-29566572-vcfhv\" (UID: \"003ba1c3-2c55-4f11-bcdd-9a62917d1ce9\") " 
pod="openshift-infra/auto-csr-approver-29566572-vcfhv" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.493585 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566572-vcfhv" Mar 20 08:12:00 crc kubenswrapper[4971]: I0320 08:12:00.780734 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566572-vcfhv"] Mar 20 08:12:01 crc kubenswrapper[4971]: I0320 08:12:01.215339 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566572-vcfhv" event={"ID":"003ba1c3-2c55-4f11-bcdd-9a62917d1ce9","Type":"ContainerStarted","Data":"fa15a5163488dfe19913b5a1d6324242ea0fec8e43625d318c53fdee843362e5"} Mar 20 08:12:04 crc kubenswrapper[4971]: I0320 08:12:04.245115 4971 generic.go:334] "Generic (PLEG): container finished" podID="003ba1c3-2c55-4f11-bcdd-9a62917d1ce9" containerID="40f3fa9b1bc0edba483b4c0e1df7ce42d78147494af3037981325c5b72a37673" exitCode=0 Mar 20 08:12:04 crc kubenswrapper[4971]: I0320 08:12:04.245185 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566572-vcfhv" event={"ID":"003ba1c3-2c55-4f11-bcdd-9a62917d1ce9","Type":"ContainerDied","Data":"40f3fa9b1bc0edba483b4c0e1df7ce42d78147494af3037981325c5b72a37673"} Mar 20 08:12:05 crc kubenswrapper[4971]: I0320 08:12:05.617690 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566572-vcfhv" Mar 20 08:12:05 crc kubenswrapper[4971]: I0320 08:12:05.731904 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qczlp\" (UniqueName: \"kubernetes.io/projected/003ba1c3-2c55-4f11-bcdd-9a62917d1ce9-kube-api-access-qczlp\") pod \"003ba1c3-2c55-4f11-bcdd-9a62917d1ce9\" (UID: \"003ba1c3-2c55-4f11-bcdd-9a62917d1ce9\") " Mar 20 08:12:05 crc kubenswrapper[4971]: I0320 08:12:05.738351 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003ba1c3-2c55-4f11-bcdd-9a62917d1ce9-kube-api-access-qczlp" (OuterVolumeSpecName: "kube-api-access-qczlp") pod "003ba1c3-2c55-4f11-bcdd-9a62917d1ce9" (UID: "003ba1c3-2c55-4f11-bcdd-9a62917d1ce9"). InnerVolumeSpecName "kube-api-access-qczlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:12:05 crc kubenswrapper[4971]: I0320 08:12:05.833495 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qczlp\" (UniqueName: \"kubernetes.io/projected/003ba1c3-2c55-4f11-bcdd-9a62917d1ce9-kube-api-access-qczlp\") on node \"crc\" DevicePath \"\"" Mar 20 08:12:06 crc kubenswrapper[4971]: I0320 08:12:06.263902 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566572-vcfhv" event={"ID":"003ba1c3-2c55-4f11-bcdd-9a62917d1ce9","Type":"ContainerDied","Data":"fa15a5163488dfe19913b5a1d6324242ea0fec8e43625d318c53fdee843362e5"} Mar 20 08:12:06 crc kubenswrapper[4971]: I0320 08:12:06.263944 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa15a5163488dfe19913b5a1d6324242ea0fec8e43625d318c53fdee843362e5" Mar 20 08:12:06 crc kubenswrapper[4971]: I0320 08:12:06.263995 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566572-vcfhv" Mar 20 08:12:06 crc kubenswrapper[4971]: I0320 08:12:06.712190 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566566-bfdf5"] Mar 20 08:12:06 crc kubenswrapper[4971]: I0320 08:12:06.723685 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566566-bfdf5"] Mar 20 08:12:06 crc kubenswrapper[4971]: I0320 08:12:06.750433 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657f8543-eddf-4b34-949d-004f4389cb59" path="/var/lib/kubelet/pods/657f8543-eddf-4b34-949d-004f4389cb59/volumes" Mar 20 08:12:20 crc kubenswrapper[4971]: I0320 08:12:20.162005 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:12:20 crc kubenswrapper[4971]: I0320 08:12:20.162880 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:12:20 crc kubenswrapper[4971]: I0320 08:12:20.162962 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:12:20 crc kubenswrapper[4971]: I0320 08:12:20.163843 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:12:20 crc kubenswrapper[4971]: I0320 08:12:20.163965 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" gracePeriod=600 Mar 20 08:12:21 crc kubenswrapper[4971]: E0320 08:12:21.222732 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:12:21 crc kubenswrapper[4971]: I0320 08:12:21.418808 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" exitCode=0 Mar 20 08:12:21 crc kubenswrapper[4971]: I0320 08:12:21.418861 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"} Mar 20 08:12:21 crc kubenswrapper[4971]: I0320 08:12:21.418898 4971 scope.go:117] "RemoveContainer" containerID="9202ff95fa9a8e5fda31c183fc82833d745eca16c84e697454145e650520e248" Mar 20 08:12:21 crc kubenswrapper[4971]: I0320 08:12:21.419479 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:12:21 crc kubenswrapper[4971]: E0320 08:12:21.419884 4971 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:12:32 crc kubenswrapper[4971]: I0320 08:12:32.733494 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:12:32 crc kubenswrapper[4971]: E0320 08:12:32.734329 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:12:45 crc kubenswrapper[4971]: I0320 08:12:45.732029 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:12:45 crc kubenswrapper[4971]: E0320 08:12:45.732843 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:12:57 crc kubenswrapper[4971]: I0320 08:12:57.397689 4971 scope.go:117] "RemoveContainer" containerID="2c5aebd9c44ccedbf00cbcee1f139a25591f9affeba87963f84b82839babcf58" Mar 20 08:12:59 crc kubenswrapper[4971]: I0320 
08:12:59.732447 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:12:59 crc kubenswrapper[4971]: E0320 08:12:59.733366 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.585810 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fx4v5"] Mar 20 08:13:10 crc kubenswrapper[4971]: E0320 08:13:10.588556 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003ba1c3-2c55-4f11-bcdd-9a62917d1ce9" containerName="oc" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.588636 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="003ba1c3-2c55-4f11-bcdd-9a62917d1ce9" containerName="oc" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.588931 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="003ba1c3-2c55-4f11-bcdd-9a62917d1ce9" containerName="oc" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.590578 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.605717 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fx4v5"] Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.732313 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:13:10 crc kubenswrapper[4971]: E0320 08:13:10.732802 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.775470 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-catalog-content\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.775537 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-utilities\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.775636 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzk8c\" (UniqueName: 
\"kubernetes.io/projected/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-kube-api-access-nzk8c\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.877148 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-catalog-content\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.877237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-utilities\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.877372 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzk8c\" (UniqueName: \"kubernetes.io/projected/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-kube-api-access-nzk8c\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.877788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-catalog-content\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.878100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-utilities\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.923743 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzk8c\" (UniqueName: \"kubernetes.io/projected/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-kube-api-access-nzk8c\") pod \"certified-operators-fx4v5\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") " pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:10 crc kubenswrapper[4971]: I0320 08:13:10.928461 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fx4v5" Mar 20 08:13:11 crc kubenswrapper[4971]: I0320 08:13:11.237708 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fx4v5"] Mar 20 08:13:11 crc kubenswrapper[4971]: I0320 08:13:11.870071 4971 generic.go:334] "Generic (PLEG): container finished" podID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerID="dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89" exitCode=0 Mar 20 08:13:11 crc kubenswrapper[4971]: I0320 08:13:11.870141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fx4v5" event={"ID":"38b03ff4-eef7-41ef-af1d-355a83d2e1f7","Type":"ContainerDied","Data":"dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89"} Mar 20 08:13:11 crc kubenswrapper[4971]: I0320 08:13:11.870475 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fx4v5" event={"ID":"38b03ff4-eef7-41ef-af1d-355a83d2e1f7","Type":"ContainerStarted","Data":"27fbbd936b0f6f222a2d8d9ec6c6efd9fbc2c62e6d425e6ce723889fc83c8ad4"} Mar 20 08:13:14 crc kubenswrapper[4971]: I0320 08:13:14.899374 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerID="5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400" exitCode=0
Mar 20 08:13:14 crc kubenswrapper[4971]: I0320 08:13:14.899468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fx4v5" event={"ID":"38b03ff4-eef7-41ef-af1d-355a83d2e1f7","Type":"ContainerDied","Data":"5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400"}
Mar 20 08:13:15 crc kubenswrapper[4971]: I0320 08:13:15.909381 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fx4v5" event={"ID":"38b03ff4-eef7-41ef-af1d-355a83d2e1f7","Type":"ContainerStarted","Data":"ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a"}
Mar 20 08:13:15 crc kubenswrapper[4971]: I0320 08:13:15.933591 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fx4v5" podStartSLOduration=2.364273542 podStartE2EDuration="5.933568251s" podCreationTimestamp="2026-03-20 08:13:10 +0000 UTC" firstStartedPulling="2026-03-20 08:13:11.873738052 +0000 UTC m=+5013.853612210" lastFinishedPulling="2026-03-20 08:13:15.443032741 +0000 UTC m=+5017.422906919" observedRunningTime="2026-03-20 08:13:15.928849108 +0000 UTC m=+5017.908723286" watchObservedRunningTime="2026-03-20 08:13:15.933568251 +0000 UTC m=+5017.913442399"
Mar 20 08:13:20 crc kubenswrapper[4971]: I0320 08:13:20.929594 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fx4v5"
Mar 20 08:13:20 crc kubenswrapper[4971]: I0320 08:13:20.930206 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fx4v5"
Mar 20 08:13:20 crc kubenswrapper[4971]: I0320 08:13:20.970002 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fx4v5"
Mar 20 08:13:21 crc kubenswrapper[4971]: I0320 08:13:21.009655 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fx4v5"
Mar 20 08:13:21 crc kubenswrapper[4971]: I0320 08:13:21.204055 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fx4v5"]
Mar 20 08:13:22 crc kubenswrapper[4971]: I0320 08:13:22.974655 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fx4v5" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerName="registry-server" containerID="cri-o://ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a" gracePeriod=2
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.360777 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fx4v5"
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.463908 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-utilities\") pod \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") "
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.463955 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzk8c\" (UniqueName: \"kubernetes.io/projected/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-kube-api-access-nzk8c\") pod \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") "
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.464017 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-catalog-content\") pod \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\" (UID: \"38b03ff4-eef7-41ef-af1d-355a83d2e1f7\") "
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.464766 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-utilities" (OuterVolumeSpecName: "utilities") pod "38b03ff4-eef7-41ef-af1d-355a83d2e1f7" (UID: "38b03ff4-eef7-41ef-af1d-355a83d2e1f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.520274 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38b03ff4-eef7-41ef-af1d-355a83d2e1f7" (UID: "38b03ff4-eef7-41ef-af1d-355a83d2e1f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.565309 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.565362 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.697943 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-kube-api-access-nzk8c" (OuterVolumeSpecName: "kube-api-access-nzk8c") pod "38b03ff4-eef7-41ef-af1d-355a83d2e1f7" (UID: "38b03ff4-eef7-41ef-af1d-355a83d2e1f7"). InnerVolumeSpecName "kube-api-access-nzk8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.767682 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzk8c\" (UniqueName: \"kubernetes.io/projected/38b03ff4-eef7-41ef-af1d-355a83d2e1f7-kube-api-access-nzk8c\") on node \"crc\" DevicePath \"\""
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.985265 4971 generic.go:334] "Generic (PLEG): container finished" podID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerID="ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a" exitCode=0
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.985338 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fx4v5"
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.985336 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fx4v5" event={"ID":"38b03ff4-eef7-41ef-af1d-355a83d2e1f7","Type":"ContainerDied","Data":"ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a"}
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.985410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fx4v5" event={"ID":"38b03ff4-eef7-41ef-af1d-355a83d2e1f7","Type":"ContainerDied","Data":"27fbbd936b0f6f222a2d8d9ec6c6efd9fbc2c62e6d425e6ce723889fc83c8ad4"}
Mar 20 08:13:23 crc kubenswrapper[4971]: I0320 08:13:23.985453 4971 scope.go:117] "RemoveContainer" containerID="ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.020540 4971 scope.go:117] "RemoveContainer" containerID="5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.022975 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fx4v5"]
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.033690 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fx4v5"]
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.044375 4971 scope.go:117] "RemoveContainer" containerID="dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.067447 4971 scope.go:117] "RemoveContainer" containerID="ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a"
Mar 20 08:13:24 crc kubenswrapper[4971]: E0320 08:13:24.067890 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a\": container with ID starting with ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a not found: ID does not exist" containerID="ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.067923 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a"} err="failed to get container status \"ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a\": rpc error: code = NotFound desc = could not find container \"ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a\": container with ID starting with ec4a5ce83b9dbd0547132e2ce553d74b4fa7aed829dde428c1ef2f334ada137a not found: ID does not exist"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.067969 4971 scope.go:117] "RemoveContainer" containerID="5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400"
Mar 20 08:13:24 crc kubenswrapper[4971]: E0320 08:13:24.068248 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400\": container with ID starting with 5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400 not found: ID does not exist" containerID="5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.068295 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400"} err="failed to get container status \"5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400\": rpc error: code = NotFound desc = could not find container \"5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400\": container with ID starting with 5f3c42f6cd72fb10898b2ad1afec5b856f4535afa332991ceb3f5bea445cd400 not found: ID does not exist"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.068314 4971 scope.go:117] "RemoveContainer" containerID="dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89"
Mar 20 08:13:24 crc kubenswrapper[4971]: E0320 08:13:24.068582 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89\": container with ID starting with dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89 not found: ID does not exist" containerID="dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.068637 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89"} err="failed to get container status \"dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89\": rpc error: code = NotFound desc = could not find container \"dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89\": container with ID starting with dbd2ac6b9db011bdaeccc24e1afdabc4f6867a1610a0a09b4a471bd2267ddc89 not found: ID does not exist"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.733268 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:13:24 crc kubenswrapper[4971]: E0320 08:13:24.734058 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:13:24 crc kubenswrapper[4971]: I0320 08:13:24.744114 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" path="/var/lib/kubelet/pods/38b03ff4-eef7-41ef-af1d-355a83d2e1f7/volumes"
Mar 20 08:13:37 crc kubenswrapper[4971]: I0320 08:13:37.732820 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:13:37 crc kubenswrapper[4971]: E0320 08:13:37.733756 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:13:52 crc kubenswrapper[4971]: I0320 08:13:52.732751 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:13:52 crc kubenswrapper[4971]: E0320 08:13:52.733475 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.166859 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566574-fhfpl"]
Mar 20 08:14:00 crc kubenswrapper[4971]: E0320 08:14:00.168285 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerName="registry-server"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.168311 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerName="registry-server"
Mar 20 08:14:00 crc kubenswrapper[4971]: E0320 08:14:00.168348 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerName="extract-content"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.168360 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerName="extract-content"
Mar 20 08:14:00 crc kubenswrapper[4971]: E0320 08:14:00.168384 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerName="extract-utilities"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.168397 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerName="extract-utilities"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.168728 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b03ff4-eef7-41ef-af1d-355a83d2e1f7" containerName="registry-server"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.169678 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566574-fhfpl"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.174009 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.174535 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.174942 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.187081 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566574-fhfpl"]
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.330831 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bln86\" (UniqueName: \"kubernetes.io/projected/265e7f55-5292-45cd-b48a-1260ea0579fd-kube-api-access-bln86\") pod \"auto-csr-approver-29566574-fhfpl\" (UID: \"265e7f55-5292-45cd-b48a-1260ea0579fd\") " pod="openshift-infra/auto-csr-approver-29566574-fhfpl"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.432306 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bln86\" (UniqueName: \"kubernetes.io/projected/265e7f55-5292-45cd-b48a-1260ea0579fd-kube-api-access-bln86\") pod \"auto-csr-approver-29566574-fhfpl\" (UID: \"265e7f55-5292-45cd-b48a-1260ea0579fd\") " pod="openshift-infra/auto-csr-approver-29566574-fhfpl"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.457566 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bln86\" (UniqueName: \"kubernetes.io/projected/265e7f55-5292-45cd-b48a-1260ea0579fd-kube-api-access-bln86\") pod \"auto-csr-approver-29566574-fhfpl\" (UID: \"265e7f55-5292-45cd-b48a-1260ea0579fd\") " pod="openshift-infra/auto-csr-approver-29566574-fhfpl"
Mar 20 08:14:00 crc kubenswrapper[4971]: I0320 08:14:00.512082 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566574-fhfpl"
Mar 20 08:14:01 crc kubenswrapper[4971]: I0320 08:14:01.013184 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566574-fhfpl"]
Mar 20 08:14:01 crc kubenswrapper[4971]: W0320 08:14:01.015111 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod265e7f55_5292_45cd_b48a_1260ea0579fd.slice/crio-62aacd879b7d7f71906f9f6b024ff3a364ffb936d0330b484614053f76ed15f4 WatchSource:0}: Error finding container 62aacd879b7d7f71906f9f6b024ff3a364ffb936d0330b484614053f76ed15f4: Status 404 returned error can't find the container with id 62aacd879b7d7f71906f9f6b024ff3a364ffb936d0330b484614053f76ed15f4
Mar 20 08:14:01 crc kubenswrapper[4971]: I0320 08:14:01.315079 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566574-fhfpl" event={"ID":"265e7f55-5292-45cd-b48a-1260ea0579fd","Type":"ContainerStarted","Data":"62aacd879b7d7f71906f9f6b024ff3a364ffb936d0330b484614053f76ed15f4"}
Mar 20 08:14:02 crc kubenswrapper[4971]: I0320 08:14:02.331535 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566574-fhfpl" event={"ID":"265e7f55-5292-45cd-b48a-1260ea0579fd","Type":"ContainerStarted","Data":"a3f8834f07891ab78574eaf21e4cd42c45c376156dcd86c47abdc9929f5b48b7"}
Mar 20 08:14:02 crc kubenswrapper[4971]: I0320 08:14:02.367294 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566574-fhfpl" podStartSLOduration=1.558267761 podStartE2EDuration="2.367265919s" podCreationTimestamp="2026-03-20 08:14:00 +0000 UTC" firstStartedPulling="2026-03-20 08:14:01.018118938 +0000 UTC m=+5062.997993086" lastFinishedPulling="2026-03-20 08:14:01.827117106 +0000 UTC m=+5063.806991244" observedRunningTime="2026-03-20 08:14:02.348125481 +0000 UTC m=+5064.327999649" watchObservedRunningTime="2026-03-20 08:14:02.367265919 +0000 UTC m=+5064.347140097"
Mar 20 08:14:03 crc kubenswrapper[4971]: I0320 08:14:03.340955 4971 generic.go:334] "Generic (PLEG): container finished" podID="265e7f55-5292-45cd-b48a-1260ea0579fd" containerID="a3f8834f07891ab78574eaf21e4cd42c45c376156dcd86c47abdc9929f5b48b7" exitCode=0
Mar 20 08:14:03 crc kubenswrapper[4971]: I0320 08:14:03.341039 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566574-fhfpl" event={"ID":"265e7f55-5292-45cd-b48a-1260ea0579fd","Type":"ContainerDied","Data":"a3f8834f07891ab78574eaf21e4cd42c45c376156dcd86c47abdc9929f5b48b7"}
Mar 20 08:14:04 crc kubenswrapper[4971]: I0320 08:14:04.686628 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566574-fhfpl"
Mar 20 08:14:04 crc kubenswrapper[4971]: I0320 08:14:04.816674 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bln86\" (UniqueName: \"kubernetes.io/projected/265e7f55-5292-45cd-b48a-1260ea0579fd-kube-api-access-bln86\") pod \"265e7f55-5292-45cd-b48a-1260ea0579fd\" (UID: \"265e7f55-5292-45cd-b48a-1260ea0579fd\") "
Mar 20 08:14:04 crc kubenswrapper[4971]: I0320 08:14:04.821989 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265e7f55-5292-45cd-b48a-1260ea0579fd-kube-api-access-bln86" (OuterVolumeSpecName: "kube-api-access-bln86") pod "265e7f55-5292-45cd-b48a-1260ea0579fd" (UID: "265e7f55-5292-45cd-b48a-1260ea0579fd"). InnerVolumeSpecName "kube-api-access-bln86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:14:04 crc kubenswrapper[4971]: I0320 08:14:04.918210 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bln86\" (UniqueName: \"kubernetes.io/projected/265e7f55-5292-45cd-b48a-1260ea0579fd-kube-api-access-bln86\") on node \"crc\" DevicePath \"\""
Mar 20 08:14:05 crc kubenswrapper[4971]: I0320 08:14:05.360876 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566574-fhfpl" event={"ID":"265e7f55-5292-45cd-b48a-1260ea0579fd","Type":"ContainerDied","Data":"62aacd879b7d7f71906f9f6b024ff3a364ffb936d0330b484614053f76ed15f4"}
Mar 20 08:14:05 crc kubenswrapper[4971]: I0320 08:14:05.360940 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62aacd879b7d7f71906f9f6b024ff3a364ffb936d0330b484614053f76ed15f4"
Mar 20 08:14:05 crc kubenswrapper[4971]: I0320 08:14:05.360968 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566574-fhfpl"
Mar 20 08:14:05 crc kubenswrapper[4971]: I0320 08:14:05.438416 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566568-kc7rk"]
Mar 20 08:14:05 crc kubenswrapper[4971]: I0320 08:14:05.443227 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566568-kc7rk"]
Mar 20 08:14:05 crc kubenswrapper[4971]: I0320 08:14:05.732748 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:14:05 crc kubenswrapper[4971]: E0320 08:14:05.733671 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:14:06 crc kubenswrapper[4971]: I0320 08:14:06.745650 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70920290-7006-4bc5-bf90-c440861b4155" path="/var/lib/kubelet/pods/70920290-7006-4bc5-bf90-c440861b4155/volumes"
Mar 20 08:14:20 crc kubenswrapper[4971]: I0320 08:14:20.732444 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:14:20 crc kubenswrapper[4971]: E0320 08:14:20.734053 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:14:34 crc kubenswrapper[4971]: I0320 08:14:34.732500 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:14:34 crc kubenswrapper[4971]: E0320 08:14:34.733750 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:14:48 crc kubenswrapper[4971]: I0320 08:14:48.739320 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:14:48 crc kubenswrapper[4971]: E0320 08:14:48.740549 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:14:57 crc kubenswrapper[4971]: I0320 08:14:57.510131 4971 scope.go:117] "RemoveContainer" containerID="2e791ab94042326d81f115d12d410c88c93da2ceca499194bbdc0d023ba031cc"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.145828 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"]
Mar 20 08:15:00 crc kubenswrapper[4971]: E0320 08:15:00.147685 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265e7f55-5292-45cd-b48a-1260ea0579fd" containerName="oc"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.147769 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="265e7f55-5292-45cd-b48a-1260ea0579fd" containerName="oc"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.147975 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="265e7f55-5292-45cd-b48a-1260ea0579fd" containerName="oc"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.148461 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.150758 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.152490 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.158550 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"]
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.313744 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46196b48-8776-46c5-99e9-4136b414f740-config-volume\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.313826 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46196b48-8776-46c5-99e9-4136b414f740-secret-volume\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.313931 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfpxg\" (UniqueName: \"kubernetes.io/projected/46196b48-8776-46c5-99e9-4136b414f740-kube-api-access-rfpxg\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.415898 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46196b48-8776-46c5-99e9-4136b414f740-secret-volume\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.416065 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfpxg\" (UniqueName: \"kubernetes.io/projected/46196b48-8776-46c5-99e9-4136b414f740-kube-api-access-rfpxg\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.416145 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46196b48-8776-46c5-99e9-4136b414f740-config-volume\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.418019 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46196b48-8776-46c5-99e9-4136b414f740-config-volume\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.426003 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46196b48-8776-46c5-99e9-4136b414f740-secret-volume\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.437244 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfpxg\" (UniqueName: \"kubernetes.io/projected/46196b48-8776-46c5-99e9-4136b414f740-kube-api-access-rfpxg\") pod \"collect-profiles-29566575-n89r2\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.466537 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.708693 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"]
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.733246 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:15:00 crc kubenswrapper[4971]: E0320 08:15:00.733742 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:15:00 crc kubenswrapper[4971]: I0320 08:15:00.869095 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2" event={"ID":"46196b48-8776-46c5-99e9-4136b414f740","Type":"ContainerStarted","Data":"d5c3abb9d9c287744d505d6a2c3401b23bc7072f268fdee44434fba13c9523a6"}
Mar 20 08:15:01 crc kubenswrapper[4971]: I0320 08:15:01.905348 4971 generic.go:334] "Generic (PLEG): container finished" podID="46196b48-8776-46c5-99e9-4136b414f740" containerID="e6f5ba6b52a367733a0734c52b66393cd78add4857d1bd8f04c9c167a849c434" exitCode=0
Mar 20 08:15:01 crc kubenswrapper[4971]: I0320 08:15:01.905674 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2" event={"ID":"46196b48-8776-46c5-99e9-4136b414f740","Type":"ContainerDied","Data":"e6f5ba6b52a367733a0734c52b66393cd78add4857d1bd8f04c9c167a849c434"}
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.239409 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.361445 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46196b48-8776-46c5-99e9-4136b414f740-secret-volume\") pod \"46196b48-8776-46c5-99e9-4136b414f740\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") "
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.361526 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46196b48-8776-46c5-99e9-4136b414f740-config-volume\") pod \"46196b48-8776-46c5-99e9-4136b414f740\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") "
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.362360 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfpxg\" (UniqueName: \"kubernetes.io/projected/46196b48-8776-46c5-99e9-4136b414f740-kube-api-access-rfpxg\") pod \"46196b48-8776-46c5-99e9-4136b414f740\" (UID: \"46196b48-8776-46c5-99e9-4136b414f740\") "
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.362635 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46196b48-8776-46c5-99e9-4136b414f740-config-volume" (OuterVolumeSpecName: "config-volume") pod "46196b48-8776-46c5-99e9-4136b414f740" (UID: "46196b48-8776-46c5-99e9-4136b414f740"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.367888 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46196b48-8776-46c5-99e9-4136b414f740-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "46196b48-8776-46c5-99e9-4136b414f740" (UID: "46196b48-8776-46c5-99e9-4136b414f740"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.368426 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46196b48-8776-46c5-99e9-4136b414f740-kube-api-access-rfpxg" (OuterVolumeSpecName: "kube-api-access-rfpxg") pod "46196b48-8776-46c5-99e9-4136b414f740" (UID: "46196b48-8776-46c5-99e9-4136b414f740"). InnerVolumeSpecName "kube-api-access-rfpxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.464457 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46196b48-8776-46c5-99e9-4136b414f740-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.464531 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfpxg\" (UniqueName: \"kubernetes.io/projected/46196b48-8776-46c5-99e9-4136b414f740-kube-api-access-rfpxg\") on node \"crc\" DevicePath \"\""
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.464548 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46196b48-8776-46c5-99e9-4136b414f740-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.920451 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2" event={"ID":"46196b48-8776-46c5-99e9-4136b414f740","Type":"ContainerDied","Data":"d5c3abb9d9c287744d505d6a2c3401b23bc7072f268fdee44434fba13c9523a6"}
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.920508 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5c3abb9d9c287744d505d6a2c3401b23bc7072f268fdee44434fba13c9523a6"
Mar 20 08:15:03 crc kubenswrapper[4971]: I0320 08:15:03.920595 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"
Mar 20 08:15:04 crc kubenswrapper[4971]: I0320 08:15:04.318073 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"]
Mar 20 08:15:04 crc kubenswrapper[4971]: I0320 08:15:04.324955 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566530-w24sg"]
Mar 20 08:15:04 crc kubenswrapper[4971]: I0320 08:15:04.748479 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff34fd3-3552-416f-b2d8-059274d1667b" path="/var/lib/kubelet/pods/cff34fd3-3552-416f-b2d8-059274d1667b/volumes"
Mar 20 08:15:15 crc kubenswrapper[4971]: I0320 08:15:15.733050 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:15:15 crc kubenswrapper[4971]: E0320 08:15:15.734230 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:15:26 crc kubenswrapper[4971]: I0320 08:15:26.732330 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b"
Mar 20 08:15:26 crc kubenswrapper[4971]: E0320 08:15:26.733484 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\""
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:15:41 crc kubenswrapper[4971]: I0320 08:15:41.732651 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:15:41 crc kubenswrapper[4971]: E0320 08:15:41.733594 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:15:56 crc kubenswrapper[4971]: I0320 08:15:56.733048 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:15:56 crc kubenswrapper[4971]: E0320 08:15:56.734081 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:15:57 crc kubenswrapper[4971]: I0320 08:15:57.608449 4971 scope.go:117] "RemoveContainer" containerID="e28cbc5493795aef965e4ddd34d2b0a50afae362a9a0391238968bf4dbd07036" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.154946 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566576-mr4rm"] Mar 20 08:16:00 crc kubenswrapper[4971]: E0320 08:16:00.155472 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46196b48-8776-46c5-99e9-4136b414f740" 
containerName="collect-profiles" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.155484 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="46196b48-8776-46c5-99e9-4136b414f740" containerName="collect-profiles" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.155632 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="46196b48-8776-46c5-99e9-4136b414f740" containerName="collect-profiles" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.156054 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566576-mr4rm" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.159439 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.159461 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.159830 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.183121 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566576-mr4rm"] Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.267051 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sps\" (UniqueName: \"kubernetes.io/projected/4ef8258e-f555-4c0c-8d12-b766f3a32432-kube-api-access-87sps\") pod \"auto-csr-approver-29566576-mr4rm\" (UID: \"4ef8258e-f555-4c0c-8d12-b766f3a32432\") " pod="openshift-infra/auto-csr-approver-29566576-mr4rm" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.369415 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87sps\" (UniqueName: 
\"kubernetes.io/projected/4ef8258e-f555-4c0c-8d12-b766f3a32432-kube-api-access-87sps\") pod \"auto-csr-approver-29566576-mr4rm\" (UID: \"4ef8258e-f555-4c0c-8d12-b766f3a32432\") " pod="openshift-infra/auto-csr-approver-29566576-mr4rm" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.406215 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sps\" (UniqueName: \"kubernetes.io/projected/4ef8258e-f555-4c0c-8d12-b766f3a32432-kube-api-access-87sps\") pod \"auto-csr-approver-29566576-mr4rm\" (UID: \"4ef8258e-f555-4c0c-8d12-b766f3a32432\") " pod="openshift-infra/auto-csr-approver-29566576-mr4rm" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.475486 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566576-mr4rm" Mar 20 08:16:00 crc kubenswrapper[4971]: I0320 08:16:00.939683 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566576-mr4rm"] Mar 20 08:16:01 crc kubenswrapper[4971]: I0320 08:16:01.308507 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:16:01 crc kubenswrapper[4971]: I0320 08:16:01.435795 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566576-mr4rm" event={"ID":"4ef8258e-f555-4c0c-8d12-b766f3a32432","Type":"ContainerStarted","Data":"957e69546ab7fdac04ded4c32b49ddc7240529f7aceb64c1463bfd35b007c4df"} Mar 20 08:16:03 crc kubenswrapper[4971]: I0320 08:16:03.453689 4971 generic.go:334] "Generic (PLEG): container finished" podID="4ef8258e-f555-4c0c-8d12-b766f3a32432" containerID="85c46807cb1047829eb9722ab24a0abea17af1b0d7d0aa641604c59c0f806ff0" exitCode=0 Mar 20 08:16:03 crc kubenswrapper[4971]: I0320 08:16:03.453740 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566576-mr4rm" 
event={"ID":"4ef8258e-f555-4c0c-8d12-b766f3a32432","Type":"ContainerDied","Data":"85c46807cb1047829eb9722ab24a0abea17af1b0d7d0aa641604c59c0f806ff0"} Mar 20 08:16:04 crc kubenswrapper[4971]: I0320 08:16:04.767751 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566576-mr4rm" Mar 20 08:16:04 crc kubenswrapper[4971]: I0320 08:16:04.832105 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87sps\" (UniqueName: \"kubernetes.io/projected/4ef8258e-f555-4c0c-8d12-b766f3a32432-kube-api-access-87sps\") pod \"4ef8258e-f555-4c0c-8d12-b766f3a32432\" (UID: \"4ef8258e-f555-4c0c-8d12-b766f3a32432\") " Mar 20 08:16:04 crc kubenswrapper[4971]: I0320 08:16:04.838525 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef8258e-f555-4c0c-8d12-b766f3a32432-kube-api-access-87sps" (OuterVolumeSpecName: "kube-api-access-87sps") pod "4ef8258e-f555-4c0c-8d12-b766f3a32432" (UID: "4ef8258e-f555-4c0c-8d12-b766f3a32432"). InnerVolumeSpecName "kube-api-access-87sps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:16:04 crc kubenswrapper[4971]: I0320 08:16:04.933941 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87sps\" (UniqueName: \"kubernetes.io/projected/4ef8258e-f555-4c0c-8d12-b766f3a32432-kube-api-access-87sps\") on node \"crc\" DevicePath \"\"" Mar 20 08:16:05 crc kubenswrapper[4971]: I0320 08:16:05.488815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566576-mr4rm" event={"ID":"4ef8258e-f555-4c0c-8d12-b766f3a32432","Type":"ContainerDied","Data":"957e69546ab7fdac04ded4c32b49ddc7240529f7aceb64c1463bfd35b007c4df"} Mar 20 08:16:05 crc kubenswrapper[4971]: I0320 08:16:05.488857 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="957e69546ab7fdac04ded4c32b49ddc7240529f7aceb64c1463bfd35b007c4df" Mar 20 08:16:05 crc kubenswrapper[4971]: I0320 08:16:05.488944 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566576-mr4rm" Mar 20 08:16:05 crc kubenswrapper[4971]: I0320 08:16:05.834730 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566570-xfsqv"] Mar 20 08:16:05 crc kubenswrapper[4971]: I0320 08:16:05.839756 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566570-xfsqv"] Mar 20 08:16:06 crc kubenswrapper[4971]: I0320 08:16:06.740140 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ac11e3-ae71-4a58-835c-33f6c6daea7d" path="/var/lib/kubelet/pods/21ac11e3-ae71-4a58-835c-33f6c6daea7d/volumes" Mar 20 08:16:07 crc kubenswrapper[4971]: I0320 08:16:07.731570 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:16:07 crc kubenswrapper[4971]: E0320 08:16:07.731906 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:16:21 crc kubenswrapper[4971]: I0320 08:16:21.731771 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:16:21 crc kubenswrapper[4971]: E0320 08:16:21.732418 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:16:36 crc kubenswrapper[4971]: I0320 08:16:36.732099 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:16:36 crc kubenswrapper[4971]: E0320 08:16:36.732834 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:16:51 crc kubenswrapper[4971]: I0320 08:16:51.732119 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:16:51 crc kubenswrapper[4971]: E0320 08:16:51.732952 4971 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:16:57 crc kubenswrapper[4971]: I0320 08:16:57.670712 4971 scope.go:117] "RemoveContainer" containerID="c3f87ac67a2bbb23159e36340bc8f0b3f42303696c046155cd9fbcbf656f85dc" Mar 20 08:17:05 crc kubenswrapper[4971]: I0320 08:17:05.732683 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:17:05 crc kubenswrapper[4971]: E0320 08:17:05.733341 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:17:18 crc kubenswrapper[4971]: I0320 08:17:18.740256 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:17:18 crc kubenswrapper[4971]: E0320 08:17:18.741195 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:17:31 crc kubenswrapper[4971]: I0320 08:17:31.732176 4971 scope.go:117] 
"RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:17:32 crc kubenswrapper[4971]: I0320 08:17:32.303484 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"5e2feddb7ec76b423537a24d64a944171176c7421d63449dfb282d7be14bc7fe"} Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.172893 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566578-55ww2"] Mar 20 08:18:00 crc kubenswrapper[4971]: E0320 08:18:00.174263 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef8258e-f555-4c0c-8d12-b766f3a32432" containerName="oc" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.174296 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef8258e-f555-4c0c-8d12-b766f3a32432" containerName="oc" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.174695 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef8258e-f555-4c0c-8d12-b766f3a32432" containerName="oc" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.175742 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566578-55ww2" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.180384 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.180965 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.180923 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.184658 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566578-55ww2"] Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.211921 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdklb\" (UniqueName: \"kubernetes.io/projected/ceede041-462b-41c4-829b-742e01009552-kube-api-access-zdklb\") pod \"auto-csr-approver-29566578-55ww2\" (UID: \"ceede041-462b-41c4-829b-742e01009552\") " pod="openshift-infra/auto-csr-approver-29566578-55ww2" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.313130 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdklb\" (UniqueName: \"kubernetes.io/projected/ceede041-462b-41c4-829b-742e01009552-kube-api-access-zdklb\") pod \"auto-csr-approver-29566578-55ww2\" (UID: \"ceede041-462b-41c4-829b-742e01009552\") " pod="openshift-infra/auto-csr-approver-29566578-55ww2" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.338216 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdklb\" (UniqueName: \"kubernetes.io/projected/ceede041-462b-41c4-829b-742e01009552-kube-api-access-zdklb\") pod \"auto-csr-approver-29566578-55ww2\" (UID: \"ceede041-462b-41c4-829b-742e01009552\") " 
pod="openshift-infra/auto-csr-approver-29566578-55ww2" Mar 20 08:18:00 crc kubenswrapper[4971]: I0320 08:18:00.512822 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566578-55ww2" Mar 20 08:18:01 crc kubenswrapper[4971]: I0320 08:18:01.006037 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566578-55ww2"] Mar 20 08:18:01 crc kubenswrapper[4971]: I0320 08:18:01.540270 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566578-55ww2" event={"ID":"ceede041-462b-41c4-829b-742e01009552","Type":"ContainerStarted","Data":"f79f68fb038a6f8d0f14e0a69beee8fc7c59a3aa52d06ae779578e2e3709ae65"} Mar 20 08:18:02 crc kubenswrapper[4971]: I0320 08:18:02.547415 4971 generic.go:334] "Generic (PLEG): container finished" podID="ceede041-462b-41c4-829b-742e01009552" containerID="2d0c8329921efa0bf9761dbfc803e3b733569555226421859c118869d99bba88" exitCode=0 Mar 20 08:18:02 crc kubenswrapper[4971]: I0320 08:18:02.547566 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566578-55ww2" event={"ID":"ceede041-462b-41c4-829b-742e01009552","Type":"ContainerDied","Data":"2d0c8329921efa0bf9761dbfc803e3b733569555226421859c118869d99bba88"} Mar 20 08:18:03 crc kubenswrapper[4971]: I0320 08:18:03.905686 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566578-55ww2" Mar 20 08:18:04 crc kubenswrapper[4971]: I0320 08:18:04.067360 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdklb\" (UniqueName: \"kubernetes.io/projected/ceede041-462b-41c4-829b-742e01009552-kube-api-access-zdklb\") pod \"ceede041-462b-41c4-829b-742e01009552\" (UID: \"ceede041-462b-41c4-829b-742e01009552\") " Mar 20 08:18:04 crc kubenswrapper[4971]: I0320 08:18:04.076050 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceede041-462b-41c4-829b-742e01009552-kube-api-access-zdklb" (OuterVolumeSpecName: "kube-api-access-zdklb") pod "ceede041-462b-41c4-829b-742e01009552" (UID: "ceede041-462b-41c4-829b-742e01009552"). InnerVolumeSpecName "kube-api-access-zdklb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:18:04 crc kubenswrapper[4971]: I0320 08:18:04.169274 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdklb\" (UniqueName: \"kubernetes.io/projected/ceede041-462b-41c4-829b-742e01009552-kube-api-access-zdklb\") on node \"crc\" DevicePath \"\"" Mar 20 08:18:04 crc kubenswrapper[4971]: I0320 08:18:04.564699 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566578-55ww2" event={"ID":"ceede041-462b-41c4-829b-742e01009552","Type":"ContainerDied","Data":"f79f68fb038a6f8d0f14e0a69beee8fc7c59a3aa52d06ae779578e2e3709ae65"} Mar 20 08:18:04 crc kubenswrapper[4971]: I0320 08:18:04.564741 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f79f68fb038a6f8d0f14e0a69beee8fc7c59a3aa52d06ae779578e2e3709ae65" Mar 20 08:18:04 crc kubenswrapper[4971]: I0320 08:18:04.564822 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566578-55ww2" Mar 20 08:18:05 crc kubenswrapper[4971]: I0320 08:18:05.014989 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566572-vcfhv"] Mar 20 08:18:05 crc kubenswrapper[4971]: I0320 08:18:05.027185 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566572-vcfhv"] Mar 20 08:18:06 crc kubenswrapper[4971]: I0320 08:18:06.748855 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003ba1c3-2c55-4f11-bcdd-9a62917d1ce9" path="/var/lib/kubelet/pods/003ba1c3-2c55-4f11-bcdd-9a62917d1ce9/volumes" Mar 20 08:18:57 crc kubenswrapper[4971]: I0320 08:18:57.793304 4971 scope.go:117] "RemoveContainer" containerID="40f3fa9b1bc0edba483b4c0e1df7ce42d78147494af3037981325c5b72a37673" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.199292 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5pdkk"] Mar 20 08:19:04 crc kubenswrapper[4971]: E0320 08:19:04.201240 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceede041-462b-41c4-829b-742e01009552" containerName="oc" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.201346 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceede041-462b-41c4-829b-742e01009552" containerName="oc" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.203490 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceede041-462b-41c4-829b-742e01009552" containerName="oc" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.205151 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.238959 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5pdkk"] Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.395818 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-utilities\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.395919 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j86g\" (UniqueName: \"kubernetes.io/projected/6f721f54-1b6d-4911-885e-0153d9e448a0-kube-api-access-4j86g\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.396014 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-catalog-content\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.497476 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-utilities\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.497545 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4j86g\" (UniqueName: \"kubernetes.io/projected/6f721f54-1b6d-4911-885e-0153d9e448a0-kube-api-access-4j86g\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.497617 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-catalog-content\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.498034 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-utilities\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.498067 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-catalog-content\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.528623 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j86g\" (UniqueName: \"kubernetes.io/projected/6f721f54-1b6d-4911-885e-0153d9e448a0-kube-api-access-4j86g\") pod \"community-operators-5pdkk\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") " pod="openshift-marketplace/community-operators-5pdkk" Mar 20 08:19:04 crc kubenswrapper[4971]: I0320 08:19:04.536215 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5pdkk"
Mar 20 08:19:05 crc kubenswrapper[4971]: I0320 08:19:05.081150 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5pdkk"]
Mar 20 08:19:05 crc kubenswrapper[4971]: I0320 08:19:05.261028 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pdkk" event={"ID":"6f721f54-1b6d-4911-885e-0153d9e448a0","Type":"ContainerStarted","Data":"144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67"}
Mar 20 08:19:05 crc kubenswrapper[4971]: I0320 08:19:05.261380 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pdkk" event={"ID":"6f721f54-1b6d-4911-885e-0153d9e448a0","Type":"ContainerStarted","Data":"8d9c343a9805ef602396595dd85b0eb92d9a5af0ee5d424ebb8511b0f073ab3c"}
Mar 20 08:19:06 crc kubenswrapper[4971]: I0320 08:19:06.272660 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerID="144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67" exitCode=0
Mar 20 08:19:06 crc kubenswrapper[4971]: I0320 08:19:06.272731 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pdkk" event={"ID":"6f721f54-1b6d-4911-885e-0153d9e448a0","Type":"ContainerDied","Data":"144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67"}
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.170284 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2jcnt"]
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.171817 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.189049 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2jcnt"]
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.282886 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pdkk" event={"ID":"6f721f54-1b6d-4911-885e-0153d9e448a0","Type":"ContainerStarted","Data":"41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923"}
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.344167 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-catalog-content\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.344456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zg5w\" (UniqueName: \"kubernetes.io/projected/9bb41efd-6b90-406d-9661-fdc11d3d986e-kube-api-access-9zg5w\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.344797 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-utilities\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.446069 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zg5w\" (UniqueName: \"kubernetes.io/projected/9bb41efd-6b90-406d-9661-fdc11d3d986e-kube-api-access-9zg5w\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.446154 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-utilities\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.446205 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-catalog-content\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.447759 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-catalog-content\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.447767 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-utilities\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.483403 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zg5w\" (UniqueName: \"kubernetes.io/projected/9bb41efd-6b90-406d-9661-fdc11d3d986e-kube-api-access-9zg5w\") pod \"redhat-operators-2jcnt\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") " pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.493937 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:07 crc kubenswrapper[4971]: I0320 08:19:07.744203 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2jcnt"]
Mar 20 08:19:08 crc kubenswrapper[4971]: I0320 08:19:08.290532 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerID="41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923" exitCode=0
Mar 20 08:19:08 crc kubenswrapper[4971]: I0320 08:19:08.290696 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pdkk" event={"ID":"6f721f54-1b6d-4911-885e-0153d9e448a0","Type":"ContainerDied","Data":"41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923"}
Mar 20 08:19:08 crc kubenswrapper[4971]: I0320 08:19:08.292310 4971 generic.go:334] "Generic (PLEG): container finished" podID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerID="5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d" exitCode=0
Mar 20 08:19:08 crc kubenswrapper[4971]: I0320 08:19:08.292351 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jcnt" event={"ID":"9bb41efd-6b90-406d-9661-fdc11d3d986e","Type":"ContainerDied","Data":"5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d"}
Mar 20 08:19:08 crc kubenswrapper[4971]: I0320 08:19:08.292376 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jcnt" event={"ID":"9bb41efd-6b90-406d-9661-fdc11d3d986e","Type":"ContainerStarted","Data":"f1c60cb5c23e3da4512dda3a10786eb47432d6ec7cce93fe9352739ab6d9b1e0"}
Mar 20 08:19:09 crc kubenswrapper[4971]: I0320 08:19:09.300273 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jcnt" event={"ID":"9bb41efd-6b90-406d-9661-fdc11d3d986e","Type":"ContainerStarted","Data":"8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92"}
Mar 20 08:19:09 crc kubenswrapper[4971]: I0320 08:19:09.302955 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pdkk" event={"ID":"6f721f54-1b6d-4911-885e-0153d9e448a0","Type":"ContainerStarted","Data":"007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365"}
Mar 20 08:19:09 crc kubenswrapper[4971]: I0320 08:19:09.354252 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5pdkk" podStartSLOduration=2.668493004 podStartE2EDuration="5.354229737s" podCreationTimestamp="2026-03-20 08:19:04 +0000 UTC" firstStartedPulling="2026-03-20 08:19:06.274972986 +0000 UTC m=+5368.254847164" lastFinishedPulling="2026-03-20 08:19:08.960709759 +0000 UTC m=+5370.940583897" observedRunningTime="2026-03-20 08:19:09.346571968 +0000 UTC m=+5371.326446126" watchObservedRunningTime="2026-03-20 08:19:09.354229737 +0000 UTC m=+5371.334103875"
Mar 20 08:19:10 crc kubenswrapper[4971]: I0320 08:19:10.315722 4971 generic.go:334] "Generic (PLEG): container finished" podID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerID="8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92" exitCode=0
Mar 20 08:19:10 crc kubenswrapper[4971]: I0320 08:19:10.315844 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jcnt" event={"ID":"9bb41efd-6b90-406d-9661-fdc11d3d986e","Type":"ContainerDied","Data":"8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92"}
Mar 20 08:19:11 crc kubenswrapper[4971]: I0320 08:19:11.325052 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jcnt" event={"ID":"9bb41efd-6b90-406d-9661-fdc11d3d986e","Type":"ContainerStarted","Data":"6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998"}
Mar 20 08:19:11 crc kubenswrapper[4971]: I0320 08:19:11.345425 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2jcnt" podStartSLOduration=1.683592812 podStartE2EDuration="4.345399011s" podCreationTimestamp="2026-03-20 08:19:07 +0000 UTC" firstStartedPulling="2026-03-20 08:19:08.29337963 +0000 UTC m=+5370.273253768" lastFinishedPulling="2026-03-20 08:19:10.955185829 +0000 UTC m=+5372.935059967" observedRunningTime="2026-03-20 08:19:11.340020211 +0000 UTC m=+5373.319894419" watchObservedRunningTime="2026-03-20 08:19:11.345399011 +0000 UTC m=+5373.325273179"
Mar 20 08:19:14 crc kubenswrapper[4971]: I0320 08:19:14.536893 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5pdkk"
Mar 20 08:19:14 crc kubenswrapper[4971]: I0320 08:19:14.537493 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5pdkk"
Mar 20 08:19:14 crc kubenswrapper[4971]: I0320 08:19:14.576834 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5pdkk"
Mar 20 08:19:15 crc kubenswrapper[4971]: I0320 08:19:15.407778 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5pdkk"
Mar 20 08:19:15 crc kubenswrapper[4971]: I0320 08:19:15.761959 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5pdkk"]
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.374570 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5pdkk" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerName="registry-server" containerID="cri-o://007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365" gracePeriod=2
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.494955 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.495251 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.781702 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pdkk"
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.896113 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j86g\" (UniqueName: \"kubernetes.io/projected/6f721f54-1b6d-4911-885e-0153d9e448a0-kube-api-access-4j86g\") pod \"6f721f54-1b6d-4911-885e-0153d9e448a0\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") "
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.896202 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-utilities\") pod \"6f721f54-1b6d-4911-885e-0153d9e448a0\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") "
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.896255 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-catalog-content\") pod \"6f721f54-1b6d-4911-885e-0153d9e448a0\" (UID: \"6f721f54-1b6d-4911-885e-0153d9e448a0\") "
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.897030 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-utilities" (OuterVolumeSpecName: "utilities") pod "6f721f54-1b6d-4911-885e-0153d9e448a0" (UID: "6f721f54-1b6d-4911-885e-0153d9e448a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.902927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f721f54-1b6d-4911-885e-0153d9e448a0-kube-api-access-4j86g" (OuterVolumeSpecName: "kube-api-access-4j86g") pod "6f721f54-1b6d-4911-885e-0153d9e448a0" (UID: "6f721f54-1b6d-4911-885e-0153d9e448a0"). InnerVolumeSpecName "kube-api-access-4j86g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.950585 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f721f54-1b6d-4911-885e-0153d9e448a0" (UID: "6f721f54-1b6d-4911-885e-0153d9e448a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.997583 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.997644 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f721f54-1b6d-4911-885e-0153d9e448a0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 08:19:17 crc kubenswrapper[4971]: I0320 08:19:17.997656 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j86g\" (UniqueName: \"kubernetes.io/projected/6f721f54-1b6d-4911-885e-0153d9e448a0-kube-api-access-4j86g\") on node \"crc\" DevicePath \"\""
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.386555 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerID="007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365" exitCode=0
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.386650 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pdkk" event={"ID":"6f721f54-1b6d-4911-885e-0153d9e448a0","Type":"ContainerDied","Data":"007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365"}
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.386686 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pdkk"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.386712 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pdkk" event={"ID":"6f721f54-1b6d-4911-885e-0153d9e448a0","Type":"ContainerDied","Data":"8d9c343a9805ef602396595dd85b0eb92d9a5af0ee5d424ebb8511b0f073ab3c"}
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.386742 4971 scope.go:117] "RemoveContainer" containerID="007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.418957 4971 scope.go:117] "RemoveContainer" containerID="41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.436658 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5pdkk"]
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.443638 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5pdkk"]
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.463641 4971 scope.go:117] "RemoveContainer" containerID="144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.483540 4971 scope.go:117] "RemoveContainer" containerID="007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365"
Mar 20 08:19:18 crc kubenswrapper[4971]: E0320 08:19:18.484036 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365\": container with ID starting with 007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365 not found: ID does not exist" containerID="007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.484066 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365"} err="failed to get container status \"007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365\": rpc error: code = NotFound desc = could not find container \"007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365\": container with ID starting with 007495fa6e612dc30862e9f3ed00433e8901febea352b9989a64025185f86365 not found: ID does not exist"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.484084 4971 scope.go:117] "RemoveContainer" containerID="41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923"
Mar 20 08:19:18 crc kubenswrapper[4971]: E0320 08:19:18.484419 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923\": container with ID starting with 41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923 not found: ID does not exist" containerID="41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.484443 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923"} err="failed to get container status \"41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923\": rpc error: code = NotFound desc = could not find container \"41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923\": container with ID starting with 41f2a941338a1b4068373aaaaf526a26824bfa6894a490a25ecaca0d8dd2d923 not found: ID does not exist"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.484456 4971 scope.go:117] "RemoveContainer" containerID="144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67"
Mar 20 08:19:18 crc kubenswrapper[4971]: E0320 08:19:18.484679 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67\": container with ID starting with 144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67 not found: ID does not exist" containerID="144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.484702 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67"} err="failed to get container status \"144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67\": rpc error: code = NotFound desc = could not find container \"144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67\": container with ID starting with 144a01518553c381f4dcda6e427542dc885ea1b578d7468c99597b118bba4d67 not found: ID does not exist"
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.557231 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2jcnt" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="registry-server" probeResult="failure" output=<
Mar 20 08:19:18 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 20 08:19:18 crc kubenswrapper[4971]: >
Mar 20 08:19:18 crc kubenswrapper[4971]: I0320 08:19:18.753980 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" path="/var/lib/kubelet/pods/6f721f54-1b6d-4911-885e-0153d9e448a0/volumes"
Mar 20 08:19:27 crc kubenswrapper[4971]: I0320 08:19:27.535171 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:27 crc kubenswrapper[4971]: I0320 08:19:27.576305 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:27 crc kubenswrapper[4971]: I0320 08:19:27.766131 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2jcnt"]
Mar 20 08:19:29 crc kubenswrapper[4971]: I0320 08:19:29.469077 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2jcnt" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="registry-server" containerID="cri-o://6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998" gracePeriod=2
Mar 20 08:19:29 crc kubenswrapper[4971]: I0320 08:19:29.863613 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:29 crc kubenswrapper[4971]: I0320 08:19:29.970197 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-utilities\") pod \"9bb41efd-6b90-406d-9661-fdc11d3d986e\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") "
Mar 20 08:19:29 crc kubenswrapper[4971]: I0320 08:19:29.970315 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-catalog-content\") pod \"9bb41efd-6b90-406d-9661-fdc11d3d986e\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") "
Mar 20 08:19:29 crc kubenswrapper[4971]: I0320 08:19:29.970387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zg5w\" (UniqueName: \"kubernetes.io/projected/9bb41efd-6b90-406d-9661-fdc11d3d986e-kube-api-access-9zg5w\") pod \"9bb41efd-6b90-406d-9661-fdc11d3d986e\" (UID: \"9bb41efd-6b90-406d-9661-fdc11d3d986e\") "
Mar 20 08:19:29 crc kubenswrapper[4971]: I0320 08:19:29.971088 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-utilities" (OuterVolumeSpecName: "utilities") pod "9bb41efd-6b90-406d-9661-fdc11d3d986e" (UID: "9bb41efd-6b90-406d-9661-fdc11d3d986e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:19:29 crc kubenswrapper[4971]: I0320 08:19:29.976945 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb41efd-6b90-406d-9661-fdc11d3d986e-kube-api-access-9zg5w" (OuterVolumeSpecName: "kube-api-access-9zg5w") pod "9bb41efd-6b90-406d-9661-fdc11d3d986e" (UID: "9bb41efd-6b90-406d-9661-fdc11d3d986e"). InnerVolumeSpecName "kube-api-access-9zg5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.072394 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.072432 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zg5w\" (UniqueName: \"kubernetes.io/projected/9bb41efd-6b90-406d-9661-fdc11d3d986e-kube-api-access-9zg5w\") on node \"crc\" DevicePath \"\""
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.124557 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bb41efd-6b90-406d-9661-fdc11d3d986e" (UID: "9bb41efd-6b90-406d-9661-fdc11d3d986e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.174246 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb41efd-6b90-406d-9661-fdc11d3d986e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.480037 4971 generic.go:334] "Generic (PLEG): container finished" podID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerID="6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998" exitCode=0
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.480102 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jcnt" event={"ID":"9bb41efd-6b90-406d-9661-fdc11d3d986e","Type":"ContainerDied","Data":"6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998"}
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.480151 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jcnt"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.480182 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jcnt" event={"ID":"9bb41efd-6b90-406d-9661-fdc11d3d986e","Type":"ContainerDied","Data":"f1c60cb5c23e3da4512dda3a10786eb47432d6ec7cce93fe9352739ab6d9b1e0"}
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.480211 4971 scope.go:117] "RemoveContainer" containerID="6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.514778 4971 scope.go:117] "RemoveContainer" containerID="8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.527476 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2jcnt"]
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.535446 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2jcnt"]
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.562937 4971 scope.go:117] "RemoveContainer" containerID="5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.579399 4971 scope.go:117] "RemoveContainer" containerID="6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998"
Mar 20 08:19:30 crc kubenswrapper[4971]: E0320 08:19:30.579817 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998\": container with ID starting with 6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998 not found: ID does not exist" containerID="6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.579848 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998"} err="failed to get container status \"6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998\": rpc error: code = NotFound desc = could not find container \"6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998\": container with ID starting with 6270ca83729e199a7a4e2dc74dd79b9bab48bbc886579ceb5fb7820549bcf998 not found: ID does not exist"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.579870 4971 scope.go:117] "RemoveContainer" containerID="8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92"
Mar 20 08:19:30 crc kubenswrapper[4971]: E0320 08:19:30.580274 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92\": container with ID starting with 8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92 not found: ID does not exist" containerID="8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.580325 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92"} err="failed to get container status \"8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92\": rpc error: code = NotFound desc = could not find container \"8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92\": container with ID starting with 8717b168f6d8f069ac175b103f743a070d6b4b209f7a477a7dcf61ff3e0a8d92 not found: ID does not exist"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.580362 4971 scope.go:117] "RemoveContainer" containerID="5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d"
Mar 20 08:19:30 crc kubenswrapper[4971]: E0320 08:19:30.580728 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d\": container with ID starting with 5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d not found: ID does not exist" containerID="5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.580795 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d"} err="failed to get container status \"5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d\": rpc error: code = NotFound desc = could not find container \"5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d\": container with ID starting with 5746b6dcc0ba23f4885ec878e2fd9b0d079b3b60039e7554f04183f3afbc489d not found: ID does not exist"
Mar 20 08:19:30 crc kubenswrapper[4971]: I0320 08:19:30.746469 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" path="/var/lib/kubelet/pods/9bb41efd-6b90-406d-9661-fdc11d3d986e/volumes"
Mar 20 08:19:50 crc kubenswrapper[4971]: I0320 08:19:50.162965 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 08:19:50 crc kubenswrapper[4971]: I0320 08:19:50.163468 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.138085 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566580-ck8br"]
Mar 20 08:20:00 crc kubenswrapper[4971]: E0320 08:20:00.139186 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="extract-content"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.139203 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="extract-content"
Mar 20 08:20:00 crc kubenswrapper[4971]: E0320 08:20:00.139227 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="registry-server"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.139269 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="registry-server"
Mar 20 08:20:00 crc kubenswrapper[4971]: E0320 08:20:00.139291 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerName="registry-server"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.139300 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerName="registry-server"
Mar 20 08:20:00 crc kubenswrapper[4971]: E0320 08:20:00.139314 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerName="extract-content"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.139353 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerName="extract-content"
Mar 20 08:20:00 crc kubenswrapper[4971]: E0320 08:20:00.139368 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="extract-utilities"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.139375 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="extract-utilities"
Mar 20 08:20:00 crc kubenswrapper[4971]: E0320 08:20:00.141066 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerName="extract-utilities"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.141087 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerName="extract-utilities"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.142276 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb41efd-6b90-406d-9661-fdc11d3d986e" containerName="registry-server"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.142338 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f721f54-1b6d-4911-885e-0153d9e448a0" containerName="registry-server"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.143104 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566580-ck8br"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.145198 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.145791 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.146072 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566580-ck8br"]
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.146775 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.229109 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh64h\" (UniqueName: \"kubernetes.io/projected/5cdbda64-10b5-46a1-a725-e3217bf2a00d-kube-api-access-wh64h\") pod \"auto-csr-approver-29566580-ck8br\" (UID: \"5cdbda64-10b5-46a1-a725-e3217bf2a00d\") " pod="openshift-infra/auto-csr-approver-29566580-ck8br"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.330658 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh64h\" (UniqueName: \"kubernetes.io/projected/5cdbda64-10b5-46a1-a725-e3217bf2a00d-kube-api-access-wh64h\") pod \"auto-csr-approver-29566580-ck8br\" (UID: \"5cdbda64-10b5-46a1-a725-e3217bf2a00d\") " pod="openshift-infra/auto-csr-approver-29566580-ck8br"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.352889 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh64h\" (UniqueName: \"kubernetes.io/projected/5cdbda64-10b5-46a1-a725-e3217bf2a00d-kube-api-access-wh64h\") pod \"auto-csr-approver-29566580-ck8br\" (UID: \"5cdbda64-10b5-46a1-a725-e3217bf2a00d\") " pod="openshift-infra/auto-csr-approver-29566580-ck8br"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.465020 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566580-ck8br"
Mar 20 08:20:00 crc kubenswrapper[4971]: I0320 08:20:00.867335 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566580-ck8br"]
Mar 20 08:20:01 crc kubenswrapper[4971]: I0320 08:20:01.704971 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566580-ck8br" event={"ID":"5cdbda64-10b5-46a1-a725-e3217bf2a00d","Type":"ContainerStarted","Data":"c366d9d1767e2216795cdb302c6705c6054baa544af1601ef589a82040a940ad"}
Mar 20 08:20:02 crc kubenswrapper[4971]: I0320 08:20:02.712985 4971 generic.go:334] "Generic (PLEG): container finished" podID="5cdbda64-10b5-46a1-a725-e3217bf2a00d" containerID="164d99c8a3ef2325fdc55d6446cdd8febbd31c71758989a988c6d34d16a8ebf6" exitCode=0
Mar 20 08:20:02 crc kubenswrapper[4971]: I0320 08:20:02.713164 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566580-ck8br" event={"ID":"5cdbda64-10b5-46a1-a725-e3217bf2a00d","Type":"ContainerDied","Data":"164d99c8a3ef2325fdc55d6446cdd8febbd31c71758989a988c6d34d16a8ebf6"}
Mar 20 08:20:04 crc kubenswrapper[4971]: I0320 08:20:04.049358 4971 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566580-ck8br" Mar 20 08:20:04 crc kubenswrapper[4971]: I0320 08:20:04.091726 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh64h\" (UniqueName: \"kubernetes.io/projected/5cdbda64-10b5-46a1-a725-e3217bf2a00d-kube-api-access-wh64h\") pod \"5cdbda64-10b5-46a1-a725-e3217bf2a00d\" (UID: \"5cdbda64-10b5-46a1-a725-e3217bf2a00d\") " Mar 20 08:20:04 crc kubenswrapper[4971]: I0320 08:20:04.106975 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cdbda64-10b5-46a1-a725-e3217bf2a00d-kube-api-access-wh64h" (OuterVolumeSpecName: "kube-api-access-wh64h") pod "5cdbda64-10b5-46a1-a725-e3217bf2a00d" (UID: "5cdbda64-10b5-46a1-a725-e3217bf2a00d"). InnerVolumeSpecName "kube-api-access-wh64h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:20:04 crc kubenswrapper[4971]: I0320 08:20:04.195638 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh64h\" (UniqueName: \"kubernetes.io/projected/5cdbda64-10b5-46a1-a725-e3217bf2a00d-kube-api-access-wh64h\") on node \"crc\" DevicePath \"\"" Mar 20 08:20:04 crc kubenswrapper[4971]: I0320 08:20:04.734722 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566580-ck8br" Mar 20 08:20:04 crc kubenswrapper[4971]: I0320 08:20:04.749733 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566580-ck8br" event={"ID":"5cdbda64-10b5-46a1-a725-e3217bf2a00d","Type":"ContainerDied","Data":"c366d9d1767e2216795cdb302c6705c6054baa544af1601ef589a82040a940ad"} Mar 20 08:20:04 crc kubenswrapper[4971]: I0320 08:20:04.749777 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c366d9d1767e2216795cdb302c6705c6054baa544af1601ef589a82040a940ad" Mar 20 08:20:05 crc kubenswrapper[4971]: I0320 08:20:05.125841 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566574-fhfpl"] Mar 20 08:20:05 crc kubenswrapper[4971]: I0320 08:20:05.132247 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566574-fhfpl"] Mar 20 08:20:06 crc kubenswrapper[4971]: I0320 08:20:06.749222 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265e7f55-5292-45cd-b48a-1260ea0579fd" path="/var/lib/kubelet/pods/265e7f55-5292-45cd-b48a-1260ea0579fd/volumes" Mar 20 08:20:20 crc kubenswrapper[4971]: I0320 08:20:20.162459 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:20:20 crc kubenswrapper[4971]: I0320 08:20:20.163464 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:20:50 crc 
kubenswrapper[4971]: I0320 08:20:50.162674 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:20:50 crc kubenswrapper[4971]: I0320 08:20:50.163298 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:20:50 crc kubenswrapper[4971]: I0320 08:20:50.163423 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:20:50 crc kubenswrapper[4971]: I0320 08:20:50.164371 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e2feddb7ec76b423537a24d64a944171176c7421d63449dfb282d7be14bc7fe"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:20:50 crc kubenswrapper[4971]: I0320 08:20:50.164473 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://5e2feddb7ec76b423537a24d64a944171176c7421d63449dfb282d7be14bc7fe" gracePeriod=600 Mar 20 08:20:51 crc kubenswrapper[4971]: I0320 08:20:51.140503 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" 
containerID="5e2feddb7ec76b423537a24d64a944171176c7421d63449dfb282d7be14bc7fe" exitCode=0 Mar 20 08:20:51 crc kubenswrapper[4971]: I0320 08:20:51.140635 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"5e2feddb7ec76b423537a24d64a944171176c7421d63449dfb282d7be14bc7fe"} Mar 20 08:20:51 crc kubenswrapper[4971]: I0320 08:20:51.140971 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a"} Mar 20 08:20:51 crc kubenswrapper[4971]: I0320 08:20:51.141009 4971 scope.go:117] "RemoveContainer" containerID="bb270a4dc93b8d6f80363bddc69a208c2724ae180fbd65a393dbe537269dfa3b" Mar 20 08:20:57 crc kubenswrapper[4971]: I0320 08:20:57.923433 4971 scope.go:117] "RemoveContainer" containerID="a3f8834f07891ab78574eaf21e4cd42c45c376156dcd86c47abdc9929f5b48b7" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.271140 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qmpgv"] Mar 20 08:21:01 crc kubenswrapper[4971]: E0320 08:21:01.271918 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdbda64-10b5-46a1-a725-e3217bf2a00d" containerName="oc" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.271930 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdbda64-10b5-46a1-a725-e3217bf2a00d" containerName="oc" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.272071 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cdbda64-10b5-46a1-a725-e3217bf2a00d" containerName="oc" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.273223 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.280577 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmpgv"] Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.364688 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28r98\" (UniqueName: \"kubernetes.io/projected/e4bd41e5-1912-48da-8447-689ca9a57199-kube-api-access-28r98\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.365027 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-catalog-content\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.365147 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-utilities\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.466512 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28r98\" (UniqueName: \"kubernetes.io/projected/e4bd41e5-1912-48da-8447-689ca9a57199-kube-api-access-28r98\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.466695 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-catalog-content\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.466734 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-utilities\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.467382 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-utilities\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.468220 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-catalog-content\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.490888 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28r98\" (UniqueName: \"kubernetes.io/projected/e4bd41e5-1912-48da-8447-689ca9a57199-kube-api-access-28r98\") pod \"redhat-marketplace-qmpgv\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:01 crc kubenswrapper[4971]: I0320 08:21:01.604474 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:02 crc kubenswrapper[4971]: I0320 08:21:02.037828 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmpgv"] Mar 20 08:21:02 crc kubenswrapper[4971]: I0320 08:21:02.245327 4971 generic.go:334] "Generic (PLEG): container finished" podID="e4bd41e5-1912-48da-8447-689ca9a57199" containerID="45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21" exitCode=0 Mar 20 08:21:02 crc kubenswrapper[4971]: I0320 08:21:02.245430 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmpgv" event={"ID":"e4bd41e5-1912-48da-8447-689ca9a57199","Type":"ContainerDied","Data":"45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21"} Mar 20 08:21:02 crc kubenswrapper[4971]: I0320 08:21:02.245965 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmpgv" event={"ID":"e4bd41e5-1912-48da-8447-689ca9a57199","Type":"ContainerStarted","Data":"fa1543d6fdefcf98db7cab3e7d9d00ea50b9dbb920723bf5c53ae4ce48951650"} Mar 20 08:21:02 crc kubenswrapper[4971]: I0320 08:21:02.247119 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:21:03 crc kubenswrapper[4971]: I0320 08:21:03.252858 4971 generic.go:334] "Generic (PLEG): container finished" podID="e4bd41e5-1912-48da-8447-689ca9a57199" containerID="5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1" exitCode=0 Mar 20 08:21:03 crc kubenswrapper[4971]: I0320 08:21:03.253053 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmpgv" event={"ID":"e4bd41e5-1912-48da-8447-689ca9a57199","Type":"ContainerDied","Data":"5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1"} Mar 20 08:21:04 crc kubenswrapper[4971]: I0320 08:21:04.261705 4971 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-qmpgv" event={"ID":"e4bd41e5-1912-48da-8447-689ca9a57199","Type":"ContainerStarted","Data":"743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81"} Mar 20 08:21:04 crc kubenswrapper[4971]: I0320 08:21:04.283937 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qmpgv" podStartSLOduration=1.883413393 podStartE2EDuration="3.283915046s" podCreationTimestamp="2026-03-20 08:21:01 +0000 UTC" firstStartedPulling="2026-03-20 08:21:02.246844545 +0000 UTC m=+5484.226718683" lastFinishedPulling="2026-03-20 08:21:03.647346198 +0000 UTC m=+5485.627220336" observedRunningTime="2026-03-20 08:21:04.281120644 +0000 UTC m=+5486.260994812" watchObservedRunningTime="2026-03-20 08:21:04.283915046 +0000 UTC m=+5486.263789194" Mar 20 08:21:11 crc kubenswrapper[4971]: I0320 08:21:11.605322 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:11 crc kubenswrapper[4971]: I0320 08:21:11.606380 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:11 crc kubenswrapper[4971]: I0320 08:21:11.686955 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:12 crc kubenswrapper[4971]: I0320 08:21:12.380112 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:12 crc kubenswrapper[4971]: I0320 08:21:12.430382 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmpgv"] Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.344337 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qmpgv" 
podUID="e4bd41e5-1912-48da-8447-689ca9a57199" containerName="registry-server" containerID="cri-o://743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81" gracePeriod=2 Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.786501 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.889303 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-catalog-content\") pod \"e4bd41e5-1912-48da-8447-689ca9a57199\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.889666 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-utilities\") pod \"e4bd41e5-1912-48da-8447-689ca9a57199\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.890044 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28r98\" (UniqueName: \"kubernetes.io/projected/e4bd41e5-1912-48da-8447-689ca9a57199-kube-api-access-28r98\") pod \"e4bd41e5-1912-48da-8447-689ca9a57199\" (UID: \"e4bd41e5-1912-48da-8447-689ca9a57199\") " Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.890701 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-utilities" (OuterVolumeSpecName: "utilities") pod "e4bd41e5-1912-48da-8447-689ca9a57199" (UID: "e4bd41e5-1912-48da-8447-689ca9a57199"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.895090 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4bd41e5-1912-48da-8447-689ca9a57199-kube-api-access-28r98" (OuterVolumeSpecName: "kube-api-access-28r98") pod "e4bd41e5-1912-48da-8447-689ca9a57199" (UID: "e4bd41e5-1912-48da-8447-689ca9a57199"). InnerVolumeSpecName "kube-api-access-28r98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.914208 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4bd41e5-1912-48da-8447-689ca9a57199" (UID: "e4bd41e5-1912-48da-8447-689ca9a57199"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.991806 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28r98\" (UniqueName: \"kubernetes.io/projected/e4bd41e5-1912-48da-8447-689ca9a57199-kube-api-access-28r98\") on node \"crc\" DevicePath \"\"" Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.991847 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:21:14 crc kubenswrapper[4971]: I0320 08:21:14.991856 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bd41e5-1912-48da-8447-689ca9a57199-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.357879 4971 generic.go:334] "Generic (PLEG): container finished" podID="e4bd41e5-1912-48da-8447-689ca9a57199" 
containerID="743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81" exitCode=0 Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.357946 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmpgv" event={"ID":"e4bd41e5-1912-48da-8447-689ca9a57199","Type":"ContainerDied","Data":"743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81"} Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.357994 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmpgv" event={"ID":"e4bd41e5-1912-48da-8447-689ca9a57199","Type":"ContainerDied","Data":"fa1543d6fdefcf98db7cab3e7d9d00ea50b9dbb920723bf5c53ae4ce48951650"} Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.358025 4971 scope.go:117] "RemoveContainer" containerID="743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.358030 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmpgv" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.379413 4971 scope.go:117] "RemoveContainer" containerID="5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.405570 4971 scope.go:117] "RemoveContainer" containerID="45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.408953 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmpgv"] Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.414956 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmpgv"] Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.450318 4971 scope.go:117] "RemoveContainer" containerID="743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81" Mar 20 08:21:15 crc kubenswrapper[4971]: E0320 08:21:15.450791 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81\": container with ID starting with 743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81 not found: ID does not exist" containerID="743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.450890 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81"} err="failed to get container status \"743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81\": rpc error: code = NotFound desc = could not find container \"743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81\": container with ID starting with 743bbd74b283eb06cdbeb563f9a00fcfcd7e7b8294cd0f27528316a80c343b81 not found: 
ID does not exist" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.450964 4971 scope.go:117] "RemoveContainer" containerID="5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1" Mar 20 08:21:15 crc kubenswrapper[4971]: E0320 08:21:15.451256 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1\": container with ID starting with 5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1 not found: ID does not exist" containerID="5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.451285 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1"} err="failed to get container status \"5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1\": rpc error: code = NotFound desc = could not find container \"5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1\": container with ID starting with 5d3c2d31018998ec8980fd430cb60657db770d525f150aa3ec405d5fba7ce9a1 not found: ID does not exist" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.451302 4971 scope.go:117] "RemoveContainer" containerID="45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21" Mar 20 08:21:15 crc kubenswrapper[4971]: E0320 08:21:15.451722 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21\": container with ID starting with 45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21 not found: ID does not exist" containerID="45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21" Mar 20 08:21:15 crc kubenswrapper[4971]: I0320 08:21:15.451753 4971 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21"} err="failed to get container status \"45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21\": rpc error: code = NotFound desc = could not find container \"45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21\": container with ID starting with 45d1f254c07c904998ea5da95e4ca25bd68c5aa4ecf2990aa7aff9218cabeb21 not found: ID does not exist" Mar 20 08:21:16 crc kubenswrapper[4971]: I0320 08:21:16.752653 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4bd41e5-1912-48da-8447-689ca9a57199" path="/var/lib/kubelet/pods/e4bd41e5-1912-48da-8447-689ca9a57199/volumes" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.140764 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566582-l74ct"] Mar 20 08:22:00 crc kubenswrapper[4971]: E0320 08:22:00.141561 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bd41e5-1912-48da-8447-689ca9a57199" containerName="extract-utilities" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.141573 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bd41e5-1912-48da-8447-689ca9a57199" containerName="extract-utilities" Mar 20 08:22:00 crc kubenswrapper[4971]: E0320 08:22:00.141587 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bd41e5-1912-48da-8447-689ca9a57199" containerName="extract-content" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.141593 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bd41e5-1912-48da-8447-689ca9a57199" containerName="extract-content" Mar 20 08:22:00 crc kubenswrapper[4971]: E0320 08:22:00.141624 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bd41e5-1912-48da-8447-689ca9a57199" containerName="registry-server" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.141631 4971 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bd41e5-1912-48da-8447-689ca9a57199" containerName="registry-server" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.141784 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4bd41e5-1912-48da-8447-689ca9a57199" containerName="registry-server" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.142183 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566582-l74ct" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.146311 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.146489 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.146916 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.153923 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566582-l74ct"] Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.198129 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pslq\" (UniqueName: \"kubernetes.io/projected/6a08b814-b6c1-466a-a596-8638e8fef3f6-kube-api-access-2pslq\") pod \"auto-csr-approver-29566582-l74ct\" (UID: \"6a08b814-b6c1-466a-a596-8638e8fef3f6\") " pod="openshift-infra/auto-csr-approver-29566582-l74ct" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.299903 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pslq\" (UniqueName: \"kubernetes.io/projected/6a08b814-b6c1-466a-a596-8638e8fef3f6-kube-api-access-2pslq\") pod \"auto-csr-approver-29566582-l74ct\" (UID: 
\"6a08b814-b6c1-466a-a596-8638e8fef3f6\") " pod="openshift-infra/auto-csr-approver-29566582-l74ct" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.324969 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pslq\" (UniqueName: \"kubernetes.io/projected/6a08b814-b6c1-466a-a596-8638e8fef3f6-kube-api-access-2pslq\") pod \"auto-csr-approver-29566582-l74ct\" (UID: \"6a08b814-b6c1-466a-a596-8638e8fef3f6\") " pod="openshift-infra/auto-csr-approver-29566582-l74ct" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.465162 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566582-l74ct" Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.904398 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566582-l74ct"] Mar 20 08:22:00 crc kubenswrapper[4971]: I0320 08:22:00.972267 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566582-l74ct" event={"ID":"6a08b814-b6c1-466a-a596-8638e8fef3f6","Type":"ContainerStarted","Data":"fa0af35b78d00937097a79d20a6d2b3e8fe88edc7cb4f83c6fd8fa93998d0f14"} Mar 20 08:22:02 crc kubenswrapper[4971]: I0320 08:22:02.994518 4971 generic.go:334] "Generic (PLEG): container finished" podID="6a08b814-b6c1-466a-a596-8638e8fef3f6" containerID="bf495453fb1525dc07e1e3cfdbf4e89f394dde805c342b393aff74b6660bbea1" exitCode=0 Mar 20 08:22:02 crc kubenswrapper[4971]: I0320 08:22:02.994643 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566582-l74ct" event={"ID":"6a08b814-b6c1-466a-a596-8638e8fef3f6","Type":"ContainerDied","Data":"bf495453fb1525dc07e1e3cfdbf4e89f394dde805c342b393aff74b6660bbea1"} Mar 20 08:22:04 crc kubenswrapper[4971]: I0320 08:22:04.371893 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566582-l74ct" Mar 20 08:22:04 crc kubenswrapper[4971]: I0320 08:22:04.462103 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pslq\" (UniqueName: \"kubernetes.io/projected/6a08b814-b6c1-466a-a596-8638e8fef3f6-kube-api-access-2pslq\") pod \"6a08b814-b6c1-466a-a596-8638e8fef3f6\" (UID: \"6a08b814-b6c1-466a-a596-8638e8fef3f6\") " Mar 20 08:22:04 crc kubenswrapper[4971]: I0320 08:22:04.470363 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a08b814-b6c1-466a-a596-8638e8fef3f6-kube-api-access-2pslq" (OuterVolumeSpecName: "kube-api-access-2pslq") pod "6a08b814-b6c1-466a-a596-8638e8fef3f6" (UID: "6a08b814-b6c1-466a-a596-8638e8fef3f6"). InnerVolumeSpecName "kube-api-access-2pslq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:22:04 crc kubenswrapper[4971]: I0320 08:22:04.563925 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pslq\" (UniqueName: \"kubernetes.io/projected/6a08b814-b6c1-466a-a596-8638e8fef3f6-kube-api-access-2pslq\") on node \"crc\" DevicePath \"\"" Mar 20 08:22:05 crc kubenswrapper[4971]: I0320 08:22:05.020709 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566582-l74ct" event={"ID":"6a08b814-b6c1-466a-a596-8638e8fef3f6","Type":"ContainerDied","Data":"fa0af35b78d00937097a79d20a6d2b3e8fe88edc7cb4f83c6fd8fa93998d0f14"} Mar 20 08:22:05 crc kubenswrapper[4971]: I0320 08:22:05.020750 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566582-l74ct" Mar 20 08:22:05 crc kubenswrapper[4971]: I0320 08:22:05.020753 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa0af35b78d00937097a79d20a6d2b3e8fe88edc7cb4f83c6fd8fa93998d0f14" Mar 20 08:22:05 crc kubenswrapper[4971]: I0320 08:22:05.475557 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566576-mr4rm"] Mar 20 08:22:05 crc kubenswrapper[4971]: I0320 08:22:05.487492 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566576-mr4rm"] Mar 20 08:22:06 crc kubenswrapper[4971]: I0320 08:22:06.749005 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef8258e-f555-4c0c-8d12-b766f3a32432" path="/var/lib/kubelet/pods/4ef8258e-f555-4c0c-8d12-b766f3a32432/volumes" Mar 20 08:22:50 crc kubenswrapper[4971]: I0320 08:22:50.163321 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:22:50 crc kubenswrapper[4971]: I0320 08:22:50.164099 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:22:58 crc kubenswrapper[4971]: I0320 08:22:58.054806 4971 scope.go:117] "RemoveContainer" containerID="85c46807cb1047829eb9722ab24a0abea17af1b0d7d0aa641604c59c0f806ff0" Mar 20 08:23:20 crc kubenswrapper[4971]: I0320 08:23:20.162320 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:23:20 crc kubenswrapper[4971]: I0320 08:23:20.162997 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:23:50 crc kubenswrapper[4971]: I0320 08:23:50.162670 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:23:50 crc kubenswrapper[4971]: I0320 08:23:50.163351 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:23:50 crc kubenswrapper[4971]: I0320 08:23:50.163455 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:23:50 crc kubenswrapper[4971]: I0320 08:23:50.164400 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:23:50 crc 
kubenswrapper[4971]: I0320 08:23:50.164513 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" gracePeriod=600 Mar 20 08:23:50 crc kubenswrapper[4971]: E0320 08:23:50.299267 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:23:50 crc kubenswrapper[4971]: I0320 08:23:50.396025 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" exitCode=0 Mar 20 08:23:50 crc kubenswrapper[4971]: I0320 08:23:50.396098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a"} Mar 20 08:23:50 crc kubenswrapper[4971]: I0320 08:23:50.396162 4971 scope.go:117] "RemoveContainer" containerID="5e2feddb7ec76b423537a24d64a944171176c7421d63449dfb282d7be14bc7fe" Mar 20 08:23:50 crc kubenswrapper[4971]: I0320 08:23:50.398094 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:23:50 crc kubenswrapper[4971]: E0320 08:23:50.398727 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.146247 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566584-hsfjb"] Mar 20 08:24:00 crc kubenswrapper[4971]: E0320 08:24:00.147237 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a08b814-b6c1-466a-a596-8638e8fef3f6" containerName="oc" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.147254 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a08b814-b6c1-466a-a596-8638e8fef3f6" containerName="oc" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.147432 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a08b814-b6c1-466a-a596-8638e8fef3f6" containerName="oc" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.148026 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566584-hsfjb" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.151078 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.151221 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.151536 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.161042 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566584-hsfjb"] Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.266573 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jlcs\" (UniqueName: \"kubernetes.io/projected/91dea636-eec3-4e6d-b176-178b89aaa5f5-kube-api-access-9jlcs\") pod \"auto-csr-approver-29566584-hsfjb\" (UID: \"91dea636-eec3-4e6d-b176-178b89aaa5f5\") " pod="openshift-infra/auto-csr-approver-29566584-hsfjb" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.368024 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jlcs\" (UniqueName: \"kubernetes.io/projected/91dea636-eec3-4e6d-b176-178b89aaa5f5-kube-api-access-9jlcs\") pod \"auto-csr-approver-29566584-hsfjb\" (UID: \"91dea636-eec3-4e6d-b176-178b89aaa5f5\") " pod="openshift-infra/auto-csr-approver-29566584-hsfjb" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.400918 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jlcs\" (UniqueName: \"kubernetes.io/projected/91dea636-eec3-4e6d-b176-178b89aaa5f5-kube-api-access-9jlcs\") pod \"auto-csr-approver-29566584-hsfjb\" (UID: \"91dea636-eec3-4e6d-b176-178b89aaa5f5\") " 
pod="openshift-infra/auto-csr-approver-29566584-hsfjb" Mar 20 08:24:00 crc kubenswrapper[4971]: I0320 08:24:00.470125 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566584-hsfjb" Mar 20 08:24:01 crc kubenswrapper[4971]: I0320 08:24:01.009099 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566584-hsfjb"] Mar 20 08:24:01 crc kubenswrapper[4971]: I0320 08:24:01.507496 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566584-hsfjb" event={"ID":"91dea636-eec3-4e6d-b176-178b89aaa5f5","Type":"ContainerStarted","Data":"5547d59084cd1d325ee49b70d471dbc1a76d265cede3ffaecbb3375413e9bee1"} Mar 20 08:24:02 crc kubenswrapper[4971]: I0320 08:24:02.517051 4971 generic.go:334] "Generic (PLEG): container finished" podID="91dea636-eec3-4e6d-b176-178b89aaa5f5" containerID="22227758c8b92949f49a89b6285139e0241a86e83430d9249dec0fa1233e8e02" exitCode=0 Mar 20 08:24:02 crc kubenswrapper[4971]: I0320 08:24:02.517113 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566584-hsfjb" event={"ID":"91dea636-eec3-4e6d-b176-178b89aaa5f5","Type":"ContainerDied","Data":"22227758c8b92949f49a89b6285139e0241a86e83430d9249dec0fa1233e8e02"} Mar 20 08:24:03 crc kubenswrapper[4971]: I0320 08:24:03.732229 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:24:03 crc kubenswrapper[4971]: E0320 08:24:03.733590 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" 
Mar 20 08:24:03 crc kubenswrapper[4971]: I0320 08:24:03.839335 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566584-hsfjb" Mar 20 08:24:04 crc kubenswrapper[4971]: I0320 08:24:04.018495 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jlcs\" (UniqueName: \"kubernetes.io/projected/91dea636-eec3-4e6d-b176-178b89aaa5f5-kube-api-access-9jlcs\") pod \"91dea636-eec3-4e6d-b176-178b89aaa5f5\" (UID: \"91dea636-eec3-4e6d-b176-178b89aaa5f5\") " Mar 20 08:24:04 crc kubenswrapper[4971]: I0320 08:24:04.024909 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91dea636-eec3-4e6d-b176-178b89aaa5f5-kube-api-access-9jlcs" (OuterVolumeSpecName: "kube-api-access-9jlcs") pod "91dea636-eec3-4e6d-b176-178b89aaa5f5" (UID: "91dea636-eec3-4e6d-b176-178b89aaa5f5"). InnerVolumeSpecName "kube-api-access-9jlcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:04 crc kubenswrapper[4971]: I0320 08:24:04.120198 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jlcs\" (UniqueName: \"kubernetes.io/projected/91dea636-eec3-4e6d-b176-178b89aaa5f5-kube-api-access-9jlcs\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:04 crc kubenswrapper[4971]: I0320 08:24:04.537537 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566584-hsfjb" event={"ID":"91dea636-eec3-4e6d-b176-178b89aaa5f5","Type":"ContainerDied","Data":"5547d59084cd1d325ee49b70d471dbc1a76d265cede3ffaecbb3375413e9bee1"} Mar 20 08:24:04 crc kubenswrapper[4971]: I0320 08:24:04.537689 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5547d59084cd1d325ee49b70d471dbc1a76d265cede3ffaecbb3375413e9bee1" Mar 20 08:24:04 crc kubenswrapper[4971]: I0320 08:24:04.537622 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566584-hsfjb" Mar 20 08:24:04 crc kubenswrapper[4971]: I0320 08:24:04.919711 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566578-55ww2"] Mar 20 08:24:04 crc kubenswrapper[4971]: I0320 08:24:04.930461 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566578-55ww2"] Mar 20 08:24:06 crc kubenswrapper[4971]: I0320 08:24:06.742972 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceede041-462b-41c4-829b-742e01009552" path="/var/lib/kubelet/pods/ceede041-462b-41c4-829b-742e01009552/volumes" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.702478 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9l7xb"] Mar 20 08:24:13 crc kubenswrapper[4971]: E0320 08:24:13.702906 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91dea636-eec3-4e6d-b176-178b89aaa5f5" containerName="oc" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.702927 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="91dea636-eec3-4e6d-b176-178b89aaa5f5" containerName="oc" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.703167 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="91dea636-eec3-4e6d-b176-178b89aaa5f5" containerName="oc" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.704576 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.719703 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9l7xb"] Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.856322 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-catalog-content\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.856412 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-utilities\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.856438 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv7xq\" (UniqueName: \"kubernetes.io/projected/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-kube-api-access-fv7xq\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.957281 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-catalog-content\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.957341 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-utilities\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.957389 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv7xq\" (UniqueName: \"kubernetes.io/projected/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-kube-api-access-fv7xq\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.957735 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-catalog-content\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.957865 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-utilities\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:13 crc kubenswrapper[4971]: I0320 08:24:13.980181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv7xq\" (UniqueName: \"kubernetes.io/projected/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-kube-api-access-fv7xq\") pod \"certified-operators-9l7xb\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:14 crc kubenswrapper[4971]: I0320 08:24:14.077885 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:14 crc kubenswrapper[4971]: I0320 08:24:14.563493 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9l7xb"] Mar 20 08:24:14 crc kubenswrapper[4971]: I0320 08:24:14.610129 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l7xb" event={"ID":"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9","Type":"ContainerStarted","Data":"35df45502819cf45e954c1f64f96d88db6524bfdad9579369f512ae36f0eb765"} Mar 20 08:24:14 crc kubenswrapper[4971]: I0320 08:24:14.734359 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:24:14 crc kubenswrapper[4971]: E0320 08:24:14.735041 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:24:15 crc kubenswrapper[4971]: I0320 08:24:15.620435 4971 generic.go:334] "Generic (PLEG): container finished" podID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerID="d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055" exitCode=0 Mar 20 08:24:15 crc kubenswrapper[4971]: I0320 08:24:15.620485 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l7xb" event={"ID":"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9","Type":"ContainerDied","Data":"d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055"} Mar 20 08:24:16 crc kubenswrapper[4971]: I0320 08:24:16.628501 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l7xb" 
event={"ID":"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9","Type":"ContainerStarted","Data":"ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e"} Mar 20 08:24:17 crc kubenswrapper[4971]: I0320 08:24:17.641904 4971 generic.go:334] "Generic (PLEG): container finished" podID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerID="ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e" exitCode=0 Mar 20 08:24:17 crc kubenswrapper[4971]: I0320 08:24:17.641983 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l7xb" event={"ID":"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9","Type":"ContainerDied","Data":"ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e"} Mar 20 08:24:18 crc kubenswrapper[4971]: I0320 08:24:18.653342 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l7xb" event={"ID":"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9","Type":"ContainerStarted","Data":"f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1"} Mar 20 08:24:18 crc kubenswrapper[4971]: I0320 08:24:18.672542 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9l7xb" podStartSLOduration=3.205779784 podStartE2EDuration="5.672519114s" podCreationTimestamp="2026-03-20 08:24:13 +0000 UTC" firstStartedPulling="2026-03-20 08:24:15.622138605 +0000 UTC m=+5677.602012753" lastFinishedPulling="2026-03-20 08:24:18.088877935 +0000 UTC m=+5680.068752083" observedRunningTime="2026-03-20 08:24:18.670901582 +0000 UTC m=+5680.650775760" watchObservedRunningTime="2026-03-20 08:24:18.672519114 +0000 UTC m=+5680.652393262" Mar 20 08:24:24 crc kubenswrapper[4971]: I0320 08:24:24.078050 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:24 crc kubenswrapper[4971]: I0320 08:24:24.079223 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:24 crc kubenswrapper[4971]: I0320 08:24:24.156015 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:24 crc kubenswrapper[4971]: I0320 08:24:24.772838 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:24 crc kubenswrapper[4971]: I0320 08:24:24.829339 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9l7xb"] Mar 20 08:24:25 crc kubenswrapper[4971]: I0320 08:24:25.732726 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:24:25 crc kubenswrapper[4971]: E0320 08:24:25.733134 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:24:26 crc kubenswrapper[4971]: I0320 08:24:26.732654 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9l7xb" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerName="registry-server" containerID="cri-o://f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1" gracePeriod=2 Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.673574 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.749857 4971 generic.go:334] "Generic (PLEG): container finished" podID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerID="f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1" exitCode=0 Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.749902 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l7xb" event={"ID":"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9","Type":"ContainerDied","Data":"f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1"} Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.749925 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l7xb" event={"ID":"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9","Type":"ContainerDied","Data":"35df45502819cf45e954c1f64f96d88db6524bfdad9579369f512ae36f0eb765"} Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.749941 4971 scope.go:117] "RemoveContainer" containerID="f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.750049 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l7xb" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.769215 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv7xq\" (UniqueName: \"kubernetes.io/projected/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-kube-api-access-fv7xq\") pod \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.769287 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-catalog-content\") pod \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.769369 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-utilities\") pod \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\" (UID: \"1027b17f-0368-4cb2-bb8b-88fdfd2a05d9\") " Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.770239 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-utilities" (OuterVolumeSpecName: "utilities") pod "1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" (UID: "1027b17f-0368-4cb2-bb8b-88fdfd2a05d9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.776911 4971 scope.go:117] "RemoveContainer" containerID="ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.777927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-kube-api-access-fv7xq" (OuterVolumeSpecName: "kube-api-access-fv7xq") pod "1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" (UID: "1027b17f-0368-4cb2-bb8b-88fdfd2a05d9"). InnerVolumeSpecName "kube-api-access-fv7xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.811574 4971 scope.go:117] "RemoveContainer" containerID="d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.832068 4971 scope.go:117] "RemoveContainer" containerID="f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1" Mar 20 08:24:27 crc kubenswrapper[4971]: E0320 08:24:27.832505 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1\": container with ID starting with f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1 not found: ID does not exist" containerID="f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.832532 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1"} err="failed to get container status \"f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1\": rpc error: code = NotFound desc = could not find container \"f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1\": container 
with ID starting with f0666c3375812d63a771752bbf3550416647d48fbbf1757c6ad7b4bac6d5dbb1 not found: ID does not exist" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.832554 4971 scope.go:117] "RemoveContainer" containerID="ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e" Mar 20 08:24:27 crc kubenswrapper[4971]: E0320 08:24:27.832846 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e\": container with ID starting with ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e not found: ID does not exist" containerID="ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.832866 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e"} err="failed to get container status \"ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e\": rpc error: code = NotFound desc = could not find container \"ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e\": container with ID starting with ad3125cd3f4dabe90a57590e0b1717438af4f51d840f5782cadef2fead69ff6e not found: ID does not exist" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.832878 4971 scope.go:117] "RemoveContainer" containerID="d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055" Mar 20 08:24:27 crc kubenswrapper[4971]: E0320 08:24:27.833234 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055\": container with ID starting with d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055 not found: ID does not exist" containerID="d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055" 
Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.833252 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055"} err="failed to get container status \"d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055\": rpc error: code = NotFound desc = could not find container \"d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055\": container with ID starting with d3c7f682a2e3ab1fa8d950ab7f94f72d07c792970c8fc69fbc824869f14c4055 not found: ID does not exist" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.835676 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" (UID: "1027b17f-0368-4cb2-bb8b-88fdfd2a05d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.870967 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.871136 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv7xq\" (UniqueName: \"kubernetes.io/projected/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-kube-api-access-fv7xq\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:27 crc kubenswrapper[4971]: I0320 08:24:27.871699 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:28 crc kubenswrapper[4971]: I0320 08:24:28.100146 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-9l7xb"] Mar 20 08:24:28 crc kubenswrapper[4971]: I0320 08:24:28.105730 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9l7xb"] Mar 20 08:24:28 crc kubenswrapper[4971]: I0320 08:24:28.752374 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" path="/var/lib/kubelet/pods/1027b17f-0368-4cb2-bb8b-88fdfd2a05d9/volumes" Mar 20 08:24:40 crc kubenswrapper[4971]: I0320 08:24:40.732520 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:24:40 crc kubenswrapper[4971]: E0320 08:24:40.734240 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:24:52 crc kubenswrapper[4971]: I0320 08:24:52.733178 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:24:52 crc kubenswrapper[4971]: E0320 08:24:52.734423 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:24:58 crc kubenswrapper[4971]: I0320 08:24:58.169790 4971 scope.go:117] "RemoveContainer" 
containerID="2d0c8329921efa0bf9761dbfc803e3b733569555226421859c118869d99bba88" Mar 20 08:25:04 crc kubenswrapper[4971]: I0320 08:25:04.732653 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:25:04 crc kubenswrapper[4971]: E0320 08:25:04.734288 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:25:16 crc kubenswrapper[4971]: I0320 08:25:16.732569 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:25:16 crc kubenswrapper[4971]: E0320 08:25:16.733524 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:25:29 crc kubenswrapper[4971]: I0320 08:25:29.732199 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:25:29 crc kubenswrapper[4971]: E0320 08:25:29.733160 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:25:44 crc kubenswrapper[4971]: I0320 08:25:44.733314 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:25:44 crc kubenswrapper[4971]: E0320 08:25:44.736046 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:25:55 crc kubenswrapper[4971]: I0320 08:25:55.732510 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:25:55 crc kubenswrapper[4971]: E0320 08:25:55.733187 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.156897 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566586-rl6nz"] Mar 20 08:26:00 crc kubenswrapper[4971]: E0320 08:26:00.158489 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerName="registry-server" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.158569 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" 
containerName="registry-server" Mar 20 08:26:00 crc kubenswrapper[4971]: E0320 08:26:00.158683 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerName="extract-utilities" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.158698 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerName="extract-utilities" Mar 20 08:26:00 crc kubenswrapper[4971]: E0320 08:26:00.158721 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerName="extract-content" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.158733 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerName="extract-content" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.158988 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1027b17f-0368-4cb2-bb8b-88fdfd2a05d9" containerName="registry-server" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.160406 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566586-rl6nz" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.163385 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.163692 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.163869 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.174878 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566586-rl6nz"] Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.323188 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9f64\" (UniqueName: \"kubernetes.io/projected/8a719b13-476d-43b3-9767-91af911e2767-kube-api-access-w9f64\") pod \"auto-csr-approver-29566586-rl6nz\" (UID: \"8a719b13-476d-43b3-9767-91af911e2767\") " pod="openshift-infra/auto-csr-approver-29566586-rl6nz" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.424884 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9f64\" (UniqueName: \"kubernetes.io/projected/8a719b13-476d-43b3-9767-91af911e2767-kube-api-access-w9f64\") pod \"auto-csr-approver-29566586-rl6nz\" (UID: \"8a719b13-476d-43b3-9767-91af911e2767\") " pod="openshift-infra/auto-csr-approver-29566586-rl6nz" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.454537 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9f64\" (UniqueName: \"kubernetes.io/projected/8a719b13-476d-43b3-9767-91af911e2767-kube-api-access-w9f64\") pod \"auto-csr-approver-29566586-rl6nz\" (UID: \"8a719b13-476d-43b3-9767-91af911e2767\") " 
pod="openshift-infra/auto-csr-approver-29566586-rl6nz" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.494645 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566586-rl6nz" Mar 20 08:26:00 crc kubenswrapper[4971]: I0320 08:26:00.960264 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566586-rl6nz"] Mar 20 08:26:01 crc kubenswrapper[4971]: I0320 08:26:01.564034 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566586-rl6nz" event={"ID":"8a719b13-476d-43b3-9767-91af911e2767","Type":"ContainerStarted","Data":"7e71d209484d81344955b2fe2d25e7c9cf888f6f436bf08542d7f08907f6a98f"} Mar 20 08:26:02 crc kubenswrapper[4971]: I0320 08:26:02.571390 4971 generic.go:334] "Generic (PLEG): container finished" podID="8a719b13-476d-43b3-9767-91af911e2767" containerID="21d8da55f5697fa8fd922ba33d1dbde448c1263efe8f165e85c24d1342ade57c" exitCode=0 Mar 20 08:26:02 crc kubenswrapper[4971]: I0320 08:26:02.571468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566586-rl6nz" event={"ID":"8a719b13-476d-43b3-9767-91af911e2767","Type":"ContainerDied","Data":"21d8da55f5697fa8fd922ba33d1dbde448c1263efe8f165e85c24d1342ade57c"} Mar 20 08:26:03 crc kubenswrapper[4971]: I0320 08:26:03.905659 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566586-rl6nz" Mar 20 08:26:03 crc kubenswrapper[4971]: I0320 08:26:03.992706 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9f64\" (UniqueName: \"kubernetes.io/projected/8a719b13-476d-43b3-9767-91af911e2767-kube-api-access-w9f64\") pod \"8a719b13-476d-43b3-9767-91af911e2767\" (UID: \"8a719b13-476d-43b3-9767-91af911e2767\") " Mar 20 08:26:03 crc kubenswrapper[4971]: I0320 08:26:03.998851 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a719b13-476d-43b3-9767-91af911e2767-kube-api-access-w9f64" (OuterVolumeSpecName: "kube-api-access-w9f64") pod "8a719b13-476d-43b3-9767-91af911e2767" (UID: "8a719b13-476d-43b3-9767-91af911e2767"). InnerVolumeSpecName "kube-api-access-w9f64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:04 crc kubenswrapper[4971]: I0320 08:26:04.094858 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9f64\" (UniqueName: \"kubernetes.io/projected/8a719b13-476d-43b3-9767-91af911e2767-kube-api-access-w9f64\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:04 crc kubenswrapper[4971]: I0320 08:26:04.590794 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566586-rl6nz" event={"ID":"8a719b13-476d-43b3-9767-91af911e2767","Type":"ContainerDied","Data":"7e71d209484d81344955b2fe2d25e7c9cf888f6f436bf08542d7f08907f6a98f"} Mar 20 08:26:04 crc kubenswrapper[4971]: I0320 08:26:04.590833 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e71d209484d81344955b2fe2d25e7c9cf888f6f436bf08542d7f08907f6a98f" Mar 20 08:26:04 crc kubenswrapper[4971]: I0320 08:26:04.590866 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566586-rl6nz" Mar 20 08:26:05 crc kubenswrapper[4971]: I0320 08:26:05.010518 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566580-ck8br"] Mar 20 08:26:05 crc kubenswrapper[4971]: I0320 08:26:05.018071 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566580-ck8br"] Mar 20 08:26:06 crc kubenswrapper[4971]: I0320 08:26:06.741463 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cdbda64-10b5-46a1-a725-e3217bf2a00d" path="/var/lib/kubelet/pods/5cdbda64-10b5-46a1-a725-e3217bf2a00d/volumes" Mar 20 08:26:10 crc kubenswrapper[4971]: I0320 08:26:10.731956 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:26:10 crc kubenswrapper[4971]: E0320 08:26:10.732386 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:26:24 crc kubenswrapper[4971]: I0320 08:26:24.732793 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:26:24 crc kubenswrapper[4971]: E0320 08:26:24.733822 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:26:37 crc kubenswrapper[4971]: I0320 08:26:37.732549 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:26:37 crc kubenswrapper[4971]: E0320 08:26:37.733510 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:26:52 crc kubenswrapper[4971]: I0320 08:26:52.736151 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:26:52 crc kubenswrapper[4971]: E0320 08:26:52.737256 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:26:58 crc kubenswrapper[4971]: I0320 08:26:58.279255 4971 scope.go:117] "RemoveContainer" containerID="164d99c8a3ef2325fdc55d6446cdd8febbd31c71758989a988c6d34d16a8ebf6" Mar 20 08:27:04 crc kubenswrapper[4971]: I0320 08:27:04.732761 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:27:04 crc kubenswrapper[4971]: E0320 08:27:04.733792 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:27:15 crc kubenswrapper[4971]: I0320 08:27:15.732471 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:27:15 crc kubenswrapper[4971]: E0320 08:27:15.734376 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:27:26 crc kubenswrapper[4971]: I0320 08:27:26.733104 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:27:26 crc kubenswrapper[4971]: E0320 08:27:26.734057 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:27:41 crc kubenswrapper[4971]: I0320 08:27:41.734085 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:27:41 crc kubenswrapper[4971]: E0320 08:27:41.735248 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:27:54 crc kubenswrapper[4971]: I0320 08:27:54.732845 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:27:54 crc kubenswrapper[4971]: E0320 08:27:54.733955 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.151046 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566588-7lwms"] Mar 20 08:28:00 crc kubenswrapper[4971]: E0320 08:28:00.151732 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a719b13-476d-43b3-9767-91af911e2767" containerName="oc" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.151749 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a719b13-476d-43b3-9767-91af911e2767" containerName="oc" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.151922 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a719b13-476d-43b3-9767-91af911e2767" containerName="oc" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.152472 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566588-7lwms" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.161184 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566588-7lwms"] Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.164497 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.165574 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.165912 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.293181 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67zfk\" (UniqueName: \"kubernetes.io/projected/3778330e-8385-4e2a-b656-4732211092e9-kube-api-access-67zfk\") pod \"auto-csr-approver-29566588-7lwms\" (UID: \"3778330e-8385-4e2a-b656-4732211092e9\") " pod="openshift-infra/auto-csr-approver-29566588-7lwms" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.395310 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67zfk\" (UniqueName: \"kubernetes.io/projected/3778330e-8385-4e2a-b656-4732211092e9-kube-api-access-67zfk\") pod \"auto-csr-approver-29566588-7lwms\" (UID: \"3778330e-8385-4e2a-b656-4732211092e9\") " pod="openshift-infra/auto-csr-approver-29566588-7lwms" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.429009 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67zfk\" (UniqueName: \"kubernetes.io/projected/3778330e-8385-4e2a-b656-4732211092e9-kube-api-access-67zfk\") pod \"auto-csr-approver-29566588-7lwms\" (UID: \"3778330e-8385-4e2a-b656-4732211092e9\") " 
pod="openshift-infra/auto-csr-approver-29566588-7lwms" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.483991 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566588-7lwms" Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.950127 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566588-7lwms"] Mar 20 08:28:00 crc kubenswrapper[4971]: I0320 08:28:00.964846 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:28:01 crc kubenswrapper[4971]: I0320 08:28:01.699163 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566588-7lwms" event={"ID":"3778330e-8385-4e2a-b656-4732211092e9","Type":"ContainerStarted","Data":"ae1db1c91b5a57459c5a32ad6713cc6008f0a1e4ac9558d4b76c24bf436d386c"} Mar 20 08:28:02 crc kubenswrapper[4971]: I0320 08:28:02.706645 4971 generic.go:334] "Generic (PLEG): container finished" podID="3778330e-8385-4e2a-b656-4732211092e9" containerID="280ee8e5cedbf543b0b1daeb5a987937280ff54c27acd7d14971531accde0228" exitCode=0 Mar 20 08:28:02 crc kubenswrapper[4971]: I0320 08:28:02.706714 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566588-7lwms" event={"ID":"3778330e-8385-4e2a-b656-4732211092e9","Type":"ContainerDied","Data":"280ee8e5cedbf543b0b1daeb5a987937280ff54c27acd7d14971531accde0228"} Mar 20 08:28:04 crc kubenswrapper[4971]: I0320 08:28:04.045247 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566588-7lwms" Mar 20 08:28:04 crc kubenswrapper[4971]: I0320 08:28:04.148291 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67zfk\" (UniqueName: \"kubernetes.io/projected/3778330e-8385-4e2a-b656-4732211092e9-kube-api-access-67zfk\") pod \"3778330e-8385-4e2a-b656-4732211092e9\" (UID: \"3778330e-8385-4e2a-b656-4732211092e9\") " Mar 20 08:28:04 crc kubenswrapper[4971]: I0320 08:28:04.156833 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3778330e-8385-4e2a-b656-4732211092e9-kube-api-access-67zfk" (OuterVolumeSpecName: "kube-api-access-67zfk") pod "3778330e-8385-4e2a-b656-4732211092e9" (UID: "3778330e-8385-4e2a-b656-4732211092e9"). InnerVolumeSpecName "kube-api-access-67zfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:28:04 crc kubenswrapper[4971]: I0320 08:28:04.250285 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67zfk\" (UniqueName: \"kubernetes.io/projected/3778330e-8385-4e2a-b656-4732211092e9-kube-api-access-67zfk\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:04 crc kubenswrapper[4971]: I0320 08:28:04.734304 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566588-7lwms" Mar 20 08:28:04 crc kubenswrapper[4971]: I0320 08:28:04.752288 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566588-7lwms" event={"ID":"3778330e-8385-4e2a-b656-4732211092e9","Type":"ContainerDied","Data":"ae1db1c91b5a57459c5a32ad6713cc6008f0a1e4ac9558d4b76c24bf436d386c"} Mar 20 08:28:04 crc kubenswrapper[4971]: I0320 08:28:04.752544 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae1db1c91b5a57459c5a32ad6713cc6008f0a1e4ac9558d4b76c24bf436d386c" Mar 20 08:28:05 crc kubenswrapper[4971]: I0320 08:28:05.117710 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566582-l74ct"] Mar 20 08:28:05 crc kubenswrapper[4971]: I0320 08:28:05.123375 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566582-l74ct"] Mar 20 08:28:06 crc kubenswrapper[4971]: I0320 08:28:06.733084 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:28:06 crc kubenswrapper[4971]: E0320 08:28:06.734042 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:28:06 crc kubenswrapper[4971]: I0320 08:28:06.750674 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a08b814-b6c1-466a-a596-8638e8fef3f6" path="/var/lib/kubelet/pods/6a08b814-b6c1-466a-a596-8638e8fef3f6/volumes" Mar 20 08:28:09 crc kubenswrapper[4971]: I0320 08:28:09.987320 4971 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["crc-storage/crc-storage-crc-l5xpx"] Mar 20 08:28:09 crc kubenswrapper[4971]: I0320 08:28:09.993851 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-l5xpx"] Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.122714 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-wwtwf"] Mar 20 08:28:10 crc kubenswrapper[4971]: E0320 08:28:10.123095 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3778330e-8385-4e2a-b656-4732211092e9" containerName="oc" Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.123110 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3778330e-8385-4e2a-b656-4732211092e9" containerName="oc" Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.123243 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3778330e-8385-4e2a-b656-4732211092e9" containerName="oc" Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.123815 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.127062 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.127476 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.127623 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.127895 4971 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-k8hht"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.129094 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wwtwf"]
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.141574 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ffb6a6f7-030f-447f-8078-9a35a99548ad-node-mnt\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.141781 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ffb6a6f7-030f-447f-8078-9a35a99548ad-crc-storage\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.142120 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msbh5\" (UniqueName: \"kubernetes.io/projected/ffb6a6f7-030f-447f-8078-9a35a99548ad-kube-api-access-msbh5\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.243186 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ffb6a6f7-030f-447f-8078-9a35a99548ad-crc-storage\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.243303 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msbh5\" (UniqueName: \"kubernetes.io/projected/ffb6a6f7-030f-447f-8078-9a35a99548ad-kube-api-access-msbh5\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.243367 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ffb6a6f7-030f-447f-8078-9a35a99548ad-node-mnt\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.243755 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ffb6a6f7-030f-447f-8078-9a35a99548ad-node-mnt\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.245040 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ffb6a6f7-030f-447f-8078-9a35a99548ad-crc-storage\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.274205 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msbh5\" (UniqueName: \"kubernetes.io/projected/ffb6a6f7-030f-447f-8078-9a35a99548ad-kube-api-access-msbh5\") pod \"crc-storage-crc-wwtwf\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") " pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.449602 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.742697 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8977ac-1b93-49d2-8f42-20e64d909d7b" path="/var/lib/kubelet/pods/8e8977ac-1b93-49d2-8f42-20e64d909d7b/volumes"
Mar 20 08:28:10 crc kubenswrapper[4971]: I0320 08:28:10.955023 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wwtwf"]
Mar 20 08:28:11 crc kubenswrapper[4971]: I0320 08:28:11.796039 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wwtwf" event={"ID":"ffb6a6f7-030f-447f-8078-9a35a99548ad","Type":"ContainerStarted","Data":"c954218d1c7c7dd2ebaeabfdaac186297c2cf4fb71004111ff4aedf09d9c2c92"}
Mar 20 08:28:11 crc kubenswrapper[4971]: I0320 08:28:11.796358 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wwtwf" event={"ID":"ffb6a6f7-030f-447f-8078-9a35a99548ad","Type":"ContainerStarted","Data":"deb8d2d64e55b83c0d2ed066272b262f389bebbdb4651a88e074cc89a16c609e"}
Mar 20 08:28:11 crc kubenswrapper[4971]: I0320 08:28:11.824757 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-wwtwf" podStartSLOduration=1.239537025 podStartE2EDuration="1.824679584s" podCreationTimestamp="2026-03-20 08:28:10 +0000 UTC" firstStartedPulling="2026-03-20 08:28:10.963062355 +0000 UTC m=+5912.942936533" lastFinishedPulling="2026-03-20 08:28:11.548204954 +0000 UTC m=+5913.528079092" observedRunningTime="2026-03-20 08:28:11.813827951 +0000 UTC m=+5913.793702099" watchObservedRunningTime="2026-03-20 08:28:11.824679584 +0000 UTC m=+5913.804553812"
Mar 20 08:28:12 crc kubenswrapper[4971]: I0320 08:28:12.807012 4971 generic.go:334] "Generic (PLEG): container finished" podID="ffb6a6f7-030f-447f-8078-9a35a99548ad" containerID="c954218d1c7c7dd2ebaeabfdaac186297c2cf4fb71004111ff4aedf09d9c2c92" exitCode=0
Mar 20 08:28:12 crc kubenswrapper[4971]: I0320 08:28:12.807122 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wwtwf" event={"ID":"ffb6a6f7-030f-447f-8078-9a35a99548ad","Type":"ContainerDied","Data":"c954218d1c7c7dd2ebaeabfdaac186297c2cf4fb71004111ff4aedf09d9c2c92"}
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.172896 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.338158 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msbh5\" (UniqueName: \"kubernetes.io/projected/ffb6a6f7-030f-447f-8078-9a35a99548ad-kube-api-access-msbh5\") pod \"ffb6a6f7-030f-447f-8078-9a35a99548ad\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") "
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.338229 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ffb6a6f7-030f-447f-8078-9a35a99548ad-crc-storage\") pod \"ffb6a6f7-030f-447f-8078-9a35a99548ad\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") "
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.338316 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ffb6a6f7-030f-447f-8078-9a35a99548ad-node-mnt\") pod \"ffb6a6f7-030f-447f-8078-9a35a99548ad\" (UID: \"ffb6a6f7-030f-447f-8078-9a35a99548ad\") "
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.338765 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffb6a6f7-030f-447f-8078-9a35a99548ad-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "ffb6a6f7-030f-447f-8078-9a35a99548ad" (UID: "ffb6a6f7-030f-447f-8078-9a35a99548ad"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.347940 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb6a6f7-030f-447f-8078-9a35a99548ad-kube-api-access-msbh5" (OuterVolumeSpecName: "kube-api-access-msbh5") pod "ffb6a6f7-030f-447f-8078-9a35a99548ad" (UID: "ffb6a6f7-030f-447f-8078-9a35a99548ad"). InnerVolumeSpecName "kube-api-access-msbh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.362399 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb6a6f7-030f-447f-8078-9a35a99548ad-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "ffb6a6f7-030f-447f-8078-9a35a99548ad" (UID: "ffb6a6f7-030f-447f-8078-9a35a99548ad"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.440382 4971 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ffb6a6f7-030f-447f-8078-9a35a99548ad-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.440434 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msbh5\" (UniqueName: \"kubernetes.io/projected/ffb6a6f7-030f-447f-8078-9a35a99548ad-kube-api-access-msbh5\") on node \"crc\" DevicePath \"\""
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.440451 4971 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ffb6a6f7-030f-447f-8078-9a35a99548ad-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.842918 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wwtwf" event={"ID":"ffb6a6f7-030f-447f-8078-9a35a99548ad","Type":"ContainerDied","Data":"deb8d2d64e55b83c0d2ed066272b262f389bebbdb4651a88e074cc89a16c609e"}
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.843453 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb8d2d64e55b83c0d2ed066272b262f389bebbdb4651a88e074cc89a16c609e"
Mar 20 08:28:14 crc kubenswrapper[4971]: I0320 08:28:14.842978 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wwtwf"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.266330 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-wwtwf"]
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.271752 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-wwtwf"]
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.441999 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-h88zl"]
Mar 20 08:28:16 crc kubenswrapper[4971]: E0320 08:28:16.442423 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb6a6f7-030f-447f-8078-9a35a99548ad" containerName="storage"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.442451 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb6a6f7-030f-447f-8078-9a35a99548ad" containerName="storage"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.442788 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb6a6f7-030f-447f-8078-9a35a99548ad" containerName="storage"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.443571 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.446686 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.448366 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.448726 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.449200 4971 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-k8hht"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.488382 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h88zl"]
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.570726 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9f049200-5e19-4e04-ba28-078caed3ec3e-node-mnt\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.570811 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9f049200-5e19-4e04-ba28-078caed3ec3e-crc-storage\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.571266 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmlcp\" (UniqueName: \"kubernetes.io/projected/9f049200-5e19-4e04-ba28-078caed3ec3e-kube-api-access-vmlcp\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.673318 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9f049200-5e19-4e04-ba28-078caed3ec3e-node-mnt\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.673387 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9f049200-5e19-4e04-ba28-078caed3ec3e-crc-storage\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.673449 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmlcp\" (UniqueName: \"kubernetes.io/projected/9f049200-5e19-4e04-ba28-078caed3ec3e-kube-api-access-vmlcp\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.673766 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9f049200-5e19-4e04-ba28-078caed3ec3e-node-mnt\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.674896 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9f049200-5e19-4e04-ba28-078caed3ec3e-crc-storage\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.705030 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmlcp\" (UniqueName: \"kubernetes.io/projected/9f049200-5e19-4e04-ba28-078caed3ec3e-kube-api-access-vmlcp\") pod \"crc-storage-crc-h88zl\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") " pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.750855 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb6a6f7-030f-447f-8078-9a35a99548ad" path="/var/lib/kubelet/pods/ffb6a6f7-030f-447f-8078-9a35a99548ad/volumes"
Mar 20 08:28:16 crc kubenswrapper[4971]: I0320 08:28:16.807900 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:17 crc kubenswrapper[4971]: I0320 08:28:17.289181 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h88zl"]
Mar 20 08:28:17 crc kubenswrapper[4971]: I0320 08:28:17.732780 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a"
Mar 20 08:28:17 crc kubenswrapper[4971]: E0320 08:28:17.733277 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:28:17 crc kubenswrapper[4971]: I0320 08:28:17.872677 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h88zl" event={"ID":"9f049200-5e19-4e04-ba28-078caed3ec3e","Type":"ContainerStarted","Data":"97a7eebf4e8948b8d051b34b7c238ee3a2747c2c24d71795870dbdfa18a0f77e"}
Mar 20 08:28:18 crc kubenswrapper[4971]: I0320 08:28:18.884729 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f049200-5e19-4e04-ba28-078caed3ec3e" containerID="eaf2179b2c00ec438a548c12702af021f2882f85c6717ed14106ced570239abb" exitCode=0
Mar 20 08:28:18 crc kubenswrapper[4971]: I0320 08:28:18.884803 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h88zl" event={"ID":"9f049200-5e19-4e04-ba28-078caed3ec3e","Type":"ContainerDied","Data":"eaf2179b2c00ec438a548c12702af021f2882f85c6717ed14106ced570239abb"}
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.241081 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.438890 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmlcp\" (UniqueName: \"kubernetes.io/projected/9f049200-5e19-4e04-ba28-078caed3ec3e-kube-api-access-vmlcp\") pod \"9f049200-5e19-4e04-ba28-078caed3ec3e\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") "
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.439366 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9f049200-5e19-4e04-ba28-078caed3ec3e-crc-storage\") pod \"9f049200-5e19-4e04-ba28-078caed3ec3e\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") "
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.439400 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9f049200-5e19-4e04-ba28-078caed3ec3e-node-mnt\") pod \"9f049200-5e19-4e04-ba28-078caed3ec3e\" (UID: \"9f049200-5e19-4e04-ba28-078caed3ec3e\") "
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.439555 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f049200-5e19-4e04-ba28-078caed3ec3e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9f049200-5e19-4e04-ba28-078caed3ec3e" (UID: "9f049200-5e19-4e04-ba28-078caed3ec3e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.439765 4971 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9f049200-5e19-4e04-ba28-078caed3ec3e-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.449383 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f049200-5e19-4e04-ba28-078caed3ec3e-kube-api-access-vmlcp" (OuterVolumeSpecName: "kube-api-access-vmlcp") pod "9f049200-5e19-4e04-ba28-078caed3ec3e" (UID: "9f049200-5e19-4e04-ba28-078caed3ec3e"). InnerVolumeSpecName "kube-api-access-vmlcp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.475185 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f049200-5e19-4e04-ba28-078caed3ec3e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9f049200-5e19-4e04-ba28-078caed3ec3e" (UID: "9f049200-5e19-4e04-ba28-078caed3ec3e"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.541055 4971 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9f049200-5e19-4e04-ba28-078caed3ec3e-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.541088 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmlcp\" (UniqueName: \"kubernetes.io/projected/9f049200-5e19-4e04-ba28-078caed3ec3e-kube-api-access-vmlcp\") on node \"crc\" DevicePath \"\""
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.904355 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h88zl" event={"ID":"9f049200-5e19-4e04-ba28-078caed3ec3e","Type":"ContainerDied","Data":"97a7eebf4e8948b8d051b34b7c238ee3a2747c2c24d71795870dbdfa18a0f77e"}
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.904401 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97a7eebf4e8948b8d051b34b7c238ee3a2747c2c24d71795870dbdfa18a0f77e"
Mar 20 08:28:20 crc kubenswrapper[4971]: I0320 08:28:20.904500 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h88zl"
Mar 20 08:28:28 crc kubenswrapper[4971]: I0320 08:28:28.740508 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a"
Mar 20 08:28:28 crc kubenswrapper[4971]: E0320 08:28:28.741276 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:28:41 crc kubenswrapper[4971]: I0320 08:28:41.732602 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a"
Mar 20 08:28:41 crc kubenswrapper[4971]: E0320 08:28:41.733649 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:28:52 crc kubenswrapper[4971]: I0320 08:28:52.731691 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a"
Mar 20 08:28:53 crc kubenswrapper[4971]: I0320 08:28:53.220739 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"5db1e00d6090d5bd7995513994dc958aa01f39fa2d0f251bb90827207208e2ff"}
Mar 20 08:28:58 crc kubenswrapper[4971]: I0320 08:28:58.400793 4971 scope.go:117] "RemoveContainer" containerID="bf495453fb1525dc07e1e3cfdbf4e89f394dde805c342b393aff74b6660bbea1"
Mar 20 08:28:58 crc kubenswrapper[4971]: I0320 08:28:58.477300 4971 scope.go:117] "RemoveContainer" containerID="1df822e674560188b6562afdacdcc6638e60307322f9a4bcc81dd72d14db5b00"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.153519 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566590-c2225"]
Mar 20 08:30:00 crc kubenswrapper[4971]: E0320 08:30:00.154461 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f049200-5e19-4e04-ba28-078caed3ec3e" containerName="storage"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.154478 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f049200-5e19-4e04-ba28-078caed3ec3e" containerName="storage"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.154683 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f049200-5e19-4e04-ba28-078caed3ec3e" containerName="storage"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.155225 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566590-c2225"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.158311 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.158399 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.159246 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.160298 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"]
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.161272 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.165468 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.165563 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.175545 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"]
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.217735 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566590-c2225"]
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.309774 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f04acd8-fa60-4df0-9149-a2875b62ff82-config-volume\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.309989 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jllc\" (UniqueName: \"kubernetes.io/projected/99c80702-40ac-409b-bf0c-3c5aa898b765-kube-api-access-4jllc\") pod \"auto-csr-approver-29566590-c2225\" (UID: \"99c80702-40ac-409b-bf0c-3c5aa898b765\") " pod="openshift-infra/auto-csr-approver-29566590-c2225"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.310189 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f04acd8-fa60-4df0-9149-a2875b62ff82-secret-volume\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.310441 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwbzg\" (UniqueName: \"kubernetes.io/projected/8f04acd8-fa60-4df0-9149-a2875b62ff82-kube-api-access-jwbzg\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.412249 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f04acd8-fa60-4df0-9149-a2875b62ff82-config-volume\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.412440 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jllc\" (UniqueName: \"kubernetes.io/projected/99c80702-40ac-409b-bf0c-3c5aa898b765-kube-api-access-4jllc\") pod \"auto-csr-approver-29566590-c2225\" (UID: \"99c80702-40ac-409b-bf0c-3c5aa898b765\") " pod="openshift-infra/auto-csr-approver-29566590-c2225"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.412534 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f04acd8-fa60-4df0-9149-a2875b62ff82-secret-volume\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.412701 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwbzg\" (UniqueName: \"kubernetes.io/projected/8f04acd8-fa60-4df0-9149-a2875b62ff82-kube-api-access-jwbzg\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.413713 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f04acd8-fa60-4df0-9149-a2875b62ff82-config-volume\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.420218 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f04acd8-fa60-4df0-9149-a2875b62ff82-secret-volume\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.429953 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwbzg\" (UniqueName: \"kubernetes.io/projected/8f04acd8-fa60-4df0-9149-a2875b62ff82-kube-api-access-jwbzg\") pod \"collect-profiles-29566590-59zps\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.430591 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jllc\" (UniqueName: \"kubernetes.io/projected/99c80702-40ac-409b-bf0c-3c5aa898b765-kube-api-access-4jllc\") pod \"auto-csr-approver-29566590-c2225\" (UID: \"99c80702-40ac-409b-bf0c-3c5aa898b765\") " pod="openshift-infra/auto-csr-approver-29566590-c2225"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.494965 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566590-c2225"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.509425 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:00 crc kubenswrapper[4971]: I0320 08:30:00.833123 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566590-c2225"]
Mar 20 08:30:01 crc kubenswrapper[4971]: I0320 08:30:01.016978 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"]
Mar 20 08:30:01 crc kubenswrapper[4971]: W0320 08:30:01.020919 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f04acd8_fa60_4df0_9149_a2875b62ff82.slice/crio-a10f649ebc1157addac3097cc7b32904237bafd406d2235246515eb948246133 WatchSource:0}: Error finding container a10f649ebc1157addac3097cc7b32904237bafd406d2235246515eb948246133: Status 404 returned error can't find the container with id a10f649ebc1157addac3097cc7b32904237bafd406d2235246515eb948246133
Mar 20 08:30:01 crc kubenswrapper[4971]: I0320 08:30:01.846321 4971 generic.go:334] "Generic (PLEG): container finished" podID="8f04acd8-fa60-4df0-9149-a2875b62ff82" containerID="89c65477f3ea5c66679f59924eb0a20b8141b950fcdd80e357b69d48d20cbf74" exitCode=0
Mar 20 08:30:01 crc kubenswrapper[4971]: I0320 08:30:01.846442 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps" event={"ID":"8f04acd8-fa60-4df0-9149-a2875b62ff82","Type":"ContainerDied","Data":"89c65477f3ea5c66679f59924eb0a20b8141b950fcdd80e357b69d48d20cbf74"}
Mar 20 08:30:01 crc kubenswrapper[4971]: I0320 08:30:01.846495 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps" event={"ID":"8f04acd8-fa60-4df0-9149-a2875b62ff82","Type":"ContainerStarted","Data":"a10f649ebc1157addac3097cc7b32904237bafd406d2235246515eb948246133"}
Mar 20 08:30:01 crc kubenswrapper[4971]: I0320 08:30:01.849106 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566590-c2225" event={"ID":"99c80702-40ac-409b-bf0c-3c5aa898b765","Type":"ContainerStarted","Data":"5d583459f1dcc31690f640c6b74a9e998cfa222a3b19549eaefce9274a7714ba"}
Mar 20 08:30:02 crc kubenswrapper[4971]: I0320 08:30:02.858583 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566590-c2225" event={"ID":"99c80702-40ac-409b-bf0c-3c5aa898b765","Type":"ContainerStarted","Data":"c978236199e76b4c5286d3e39c75dfefda4150e144cbcff70f8628f6d431a40e"}
Mar 20 08:30:02 crc kubenswrapper[4971]: I0320 08:30:02.878149 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566590-c2225" podStartSLOduration=1.452535068 podStartE2EDuration="2.878124413s" podCreationTimestamp="2026-03-20 08:30:00 +0000 UTC" firstStartedPulling="2026-03-20 08:30:00.843385754 +0000 UTC m=+6022.823259892" lastFinishedPulling="2026-03-20 08:30:02.268975059 +0000 UTC m=+6024.248849237" observedRunningTime="2026-03-20 08:30:02.87495224 +0000 UTC m=+6024.854826378" watchObservedRunningTime="2026-03-20 08:30:02.878124413 +0000 UTC m=+6024.857998551"
Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.157319 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"
Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.257343 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f04acd8-fa60-4df0-9149-a2875b62ff82-config-volume\") pod \"8f04acd8-fa60-4df0-9149-a2875b62ff82\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") "
Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.257406 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwbzg\" (UniqueName: \"kubernetes.io/projected/8f04acd8-fa60-4df0-9149-a2875b62ff82-kube-api-access-jwbzg\") pod \"8f04acd8-fa60-4df0-9149-a2875b62ff82\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") "
Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.257518 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f04acd8-fa60-4df0-9149-a2875b62ff82-secret-volume\") pod \"8f04acd8-fa60-4df0-9149-a2875b62ff82\" (UID: \"8f04acd8-fa60-4df0-9149-a2875b62ff82\") "
Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.258242 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f04acd8-fa60-4df0-9149-a2875b62ff82-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f04acd8-fa60-4df0-9149-a2875b62ff82" (UID: "8f04acd8-fa60-4df0-9149-a2875b62ff82"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.265572 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f04acd8-fa60-4df0-9149-a2875b62ff82-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8f04acd8-fa60-4df0-9149-a2875b62ff82" (UID: "8f04acd8-fa60-4df0-9149-a2875b62ff82"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.266167 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f04acd8-fa60-4df0-9149-a2875b62ff82-kube-api-access-jwbzg" (OuterVolumeSpecName: "kube-api-access-jwbzg") pod "8f04acd8-fa60-4df0-9149-a2875b62ff82" (UID: "8f04acd8-fa60-4df0-9149-a2875b62ff82"). InnerVolumeSpecName "kube-api-access-jwbzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.359366 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f04acd8-fa60-4df0-9149-a2875b62ff82-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.359408 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f04acd8-fa60-4df0-9149-a2875b62ff82-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.359421 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwbzg\" (UniqueName: \"kubernetes.io/projected/8f04acd8-fa60-4df0-9149-a2875b62ff82-kube-api-access-jwbzg\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.871267 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps" event={"ID":"8f04acd8-fa60-4df0-9149-a2875b62ff82","Type":"ContainerDied","Data":"a10f649ebc1157addac3097cc7b32904237bafd406d2235246515eb948246133"} Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.871289 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps" Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.871317 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a10f649ebc1157addac3097cc7b32904237bafd406d2235246515eb948246133" Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.873220 4971 generic.go:334] "Generic (PLEG): container finished" podID="99c80702-40ac-409b-bf0c-3c5aa898b765" containerID="c978236199e76b4c5286d3e39c75dfefda4150e144cbcff70f8628f6d431a40e" exitCode=0 Mar 20 08:30:03 crc kubenswrapper[4971]: I0320 08:30:03.873464 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566590-c2225" event={"ID":"99c80702-40ac-409b-bf0c-3c5aa898b765","Type":"ContainerDied","Data":"c978236199e76b4c5286d3e39c75dfefda4150e144cbcff70f8628f6d431a40e"} Mar 20 08:30:04 crc kubenswrapper[4971]: I0320 08:30:04.228245 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq"] Mar 20 08:30:04 crc kubenswrapper[4971]: I0320 08:30:04.233798 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566545-8rrzq"] Mar 20 08:30:04 crc kubenswrapper[4971]: I0320 08:30:04.744390 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="894a3a5d-c688-41ea-8b99-578fde0702d5" path="/var/lib/kubelet/pods/894a3a5d-c688-41ea-8b99-578fde0702d5/volumes" Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.205098 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566590-c2225" Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.286531 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jllc\" (UniqueName: \"kubernetes.io/projected/99c80702-40ac-409b-bf0c-3c5aa898b765-kube-api-access-4jllc\") pod \"99c80702-40ac-409b-bf0c-3c5aa898b765\" (UID: \"99c80702-40ac-409b-bf0c-3c5aa898b765\") " Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.304843 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c80702-40ac-409b-bf0c-3c5aa898b765-kube-api-access-4jllc" (OuterVolumeSpecName: "kube-api-access-4jllc") pod "99c80702-40ac-409b-bf0c-3c5aa898b765" (UID: "99c80702-40ac-409b-bf0c-3c5aa898b765"). InnerVolumeSpecName "kube-api-access-4jllc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.388919 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jllc\" (UniqueName: \"kubernetes.io/projected/99c80702-40ac-409b-bf0c-3c5aa898b765-kube-api-access-4jllc\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.893897 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566590-c2225" Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.893867 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566590-c2225" event={"ID":"99c80702-40ac-409b-bf0c-3c5aa898b765","Type":"ContainerDied","Data":"5d583459f1dcc31690f640c6b74a9e998cfa222a3b19549eaefce9274a7714ba"} Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.894839 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d583459f1dcc31690f640c6b74a9e998cfa222a3b19549eaefce9274a7714ba" Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.974717 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566584-hsfjb"] Mar 20 08:30:05 crc kubenswrapper[4971]: I0320 08:30:05.982503 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566584-hsfjb"] Mar 20 08:30:06 crc kubenswrapper[4971]: I0320 08:30:06.742255 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91dea636-eec3-4e6d-b176-178b89aaa5f5" path="/var/lib/kubelet/pods/91dea636-eec3-4e6d-b176-178b89aaa5f5/volumes" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.354163 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7xrsm"] Mar 20 08:30:21 crc kubenswrapper[4971]: E0320 08:30:21.355091 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f04acd8-fa60-4df0-9149-a2875b62ff82" containerName="collect-profiles" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.355113 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f04acd8-fa60-4df0-9149-a2875b62ff82" containerName="collect-profiles" Mar 20 08:30:21 crc kubenswrapper[4971]: E0320 08:30:21.355154 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c80702-40ac-409b-bf0c-3c5aa898b765" containerName="oc" Mar 20 08:30:21 crc 
kubenswrapper[4971]: I0320 08:30:21.355168 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c80702-40ac-409b-bf0c-3c5aa898b765" containerName="oc" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.355478 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f04acd8-fa60-4df0-9149-a2875b62ff82" containerName="collect-profiles" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.355515 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c80702-40ac-409b-bf0c-3c5aa898b765" containerName="oc" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.363460 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.378939 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xrsm"] Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.451893 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-utilities\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.452044 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-catalog-content\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.452125 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4chdz\" (UniqueName: 
\"kubernetes.io/projected/4a9d0182-c4af-4a57-9890-2e48896a8df4-kube-api-access-4chdz\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.553586 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-utilities\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.553690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-catalog-content\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.553728 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4chdz\" (UniqueName: \"kubernetes.io/projected/4a9d0182-c4af-4a57-9890-2e48896a8df4-kube-api-access-4chdz\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.554115 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-utilities\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.554229 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-catalog-content\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.584152 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4chdz\" (UniqueName: \"kubernetes.io/projected/4a9d0182-c4af-4a57-9890-2e48896a8df4-kube-api-access-4chdz\") pod \"community-operators-7xrsm\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:21 crc kubenswrapper[4971]: I0320 08:30:21.704195 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:22 crc kubenswrapper[4971]: I0320 08:30:22.206306 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xrsm"] Mar 20 08:30:23 crc kubenswrapper[4971]: I0320 08:30:23.032130 4971 generic.go:334] "Generic (PLEG): container finished" podID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerID="981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32" exitCode=0 Mar 20 08:30:23 crc kubenswrapper[4971]: I0320 08:30:23.032365 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xrsm" event={"ID":"4a9d0182-c4af-4a57-9890-2e48896a8df4","Type":"ContainerDied","Data":"981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32"} Mar 20 08:30:23 crc kubenswrapper[4971]: I0320 08:30:23.032393 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xrsm" event={"ID":"4a9d0182-c4af-4a57-9890-2e48896a8df4","Type":"ContainerStarted","Data":"63a26b010fd839bf4dc7de1a933f09f06337d60ff1d52c549e97bcbbe7306823"} Mar 20 08:30:25 crc kubenswrapper[4971]: I0320 08:30:25.048858 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerID="6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a" exitCode=0 Mar 20 08:30:25 crc kubenswrapper[4971]: I0320 08:30:25.048988 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xrsm" event={"ID":"4a9d0182-c4af-4a57-9890-2e48896a8df4","Type":"ContainerDied","Data":"6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a"} Mar 20 08:30:26 crc kubenswrapper[4971]: I0320 08:30:26.059952 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xrsm" event={"ID":"4a9d0182-c4af-4a57-9890-2e48896a8df4","Type":"ContainerStarted","Data":"6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd"} Mar 20 08:30:26 crc kubenswrapper[4971]: I0320 08:30:26.084835 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7xrsm" podStartSLOduration=2.5424111910000002 podStartE2EDuration="5.084815222s" podCreationTimestamp="2026-03-20 08:30:21 +0000 UTC" firstStartedPulling="2026-03-20 08:30:23.034350159 +0000 UTC m=+6045.014224307" lastFinishedPulling="2026-03-20 08:30:25.57675418 +0000 UTC m=+6047.556628338" observedRunningTime="2026-03-20 08:30:26.08168018 +0000 UTC m=+6048.061554338" watchObservedRunningTime="2026-03-20 08:30:26.084815222 +0000 UTC m=+6048.064689380" Mar 20 08:30:31 crc kubenswrapper[4971]: I0320 08:30:31.705025 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:31 crc kubenswrapper[4971]: I0320 08:30:31.705632 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:31 crc kubenswrapper[4971]: I0320 08:30:31.760951 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7xrsm" 
Mar 20 08:30:32 crc kubenswrapper[4971]: I0320 08:30:32.157365 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:32 crc kubenswrapper[4971]: I0320 08:30:32.200339 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7xrsm"] Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.109921 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7xrsm" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerName="registry-server" containerID="cri-o://6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd" gracePeriod=2 Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.531429 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.647806 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4chdz\" (UniqueName: \"kubernetes.io/projected/4a9d0182-c4af-4a57-9890-2e48896a8df4-kube-api-access-4chdz\") pod \"4a9d0182-c4af-4a57-9890-2e48896a8df4\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.647980 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-catalog-content\") pod \"4a9d0182-c4af-4a57-9890-2e48896a8df4\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") " Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.648073 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-utilities\") pod \"4a9d0182-c4af-4a57-9890-2e48896a8df4\" (UID: \"4a9d0182-c4af-4a57-9890-2e48896a8df4\") 
" Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.648876 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-utilities" (OuterVolumeSpecName: "utilities") pod "4a9d0182-c4af-4a57-9890-2e48896a8df4" (UID: "4a9d0182-c4af-4a57-9890-2e48896a8df4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.659773 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9d0182-c4af-4a57-9890-2e48896a8df4-kube-api-access-4chdz" (OuterVolumeSpecName: "kube-api-access-4chdz") pod "4a9d0182-c4af-4a57-9890-2e48896a8df4" (UID: "4a9d0182-c4af-4a57-9890-2e48896a8df4"). InnerVolumeSpecName "kube-api-access-4chdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.749645 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:34 crc kubenswrapper[4971]: I0320 08:30:34.749935 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4chdz\" (UniqueName: \"kubernetes.io/projected/4a9d0182-c4af-4a57-9890-2e48896a8df4-kube-api-access-4chdz\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.047011 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a9d0182-c4af-4a57-9890-2e48896a8df4" (UID: "4a9d0182-c4af-4a57-9890-2e48896a8df4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.054387 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a9d0182-c4af-4a57-9890-2e48896a8df4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.120070 4971 generic.go:334] "Generic (PLEG): container finished" podID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerID="6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd" exitCode=0 Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.120117 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xrsm" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.120121 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xrsm" event={"ID":"4a9d0182-c4af-4a57-9890-2e48896a8df4","Type":"ContainerDied","Data":"6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd"} Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.120268 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xrsm" event={"ID":"4a9d0182-c4af-4a57-9890-2e48896a8df4","Type":"ContainerDied","Data":"63a26b010fd839bf4dc7de1a933f09f06337d60ff1d52c549e97bcbbe7306823"} Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.120307 4971 scope.go:117] "RemoveContainer" containerID="6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.152036 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7xrsm"] Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.159342 4971 scope.go:117] "RemoveContainer" containerID="6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a" Mar 20 08:30:35 crc kubenswrapper[4971]: 
I0320 08:30:35.160514 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7xrsm"] Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.177947 4971 scope.go:117] "RemoveContainer" containerID="981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.209629 4971 scope.go:117] "RemoveContainer" containerID="6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd" Mar 20 08:30:35 crc kubenswrapper[4971]: E0320 08:30:35.210100 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd\": container with ID starting with 6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd not found: ID does not exist" containerID="6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.210149 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd"} err="failed to get container status \"6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd\": rpc error: code = NotFound desc = could not find container \"6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd\": container with ID starting with 6800546ea436a7795d6f4544feed6e9608d213a079c4d526fdb5f7297daa71fd not found: ID does not exist" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.210183 4971 scope.go:117] "RemoveContainer" containerID="6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a" Mar 20 08:30:35 crc kubenswrapper[4971]: E0320 08:30:35.210507 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a\": container 
with ID starting with 6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a not found: ID does not exist" containerID="6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.210545 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a"} err="failed to get container status \"6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a\": rpc error: code = NotFound desc = could not find container \"6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a\": container with ID starting with 6d7d661e23291956cc1a7f979e4852c629222cc3ed980ddd3422d7187daf126a not found: ID does not exist" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.210564 4971 scope.go:117] "RemoveContainer" containerID="981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32" Mar 20 08:30:35 crc kubenswrapper[4971]: E0320 08:30:35.210909 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32\": container with ID starting with 981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32 not found: ID does not exist" containerID="981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32" Mar 20 08:30:35 crc kubenswrapper[4971]: I0320 08:30:35.210945 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32"} err="failed to get container status \"981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32\": rpc error: code = NotFound desc = could not find container \"981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32\": container with ID starting with 981434231cbf1ee14588e28f5df7d593f3ddc9bb4109ff931a608bfe45a6ba32 not 
found: ID does not exist" Mar 20 08:30:36 crc kubenswrapper[4971]: I0320 08:30:36.745507 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" path="/var/lib/kubelet/pods/4a9d0182-c4af-4a57-9890-2e48896a8df4/volumes" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.540351 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6648865bb9-ps7cp"] Mar 20 08:30:37 crc kubenswrapper[4971]: E0320 08:30:37.541059 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerName="extract-content" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.541082 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerName="extract-content" Mar 20 08:30:37 crc kubenswrapper[4971]: E0320 08:30:37.541099 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerName="registry-server" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.541110 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerName="registry-server" Mar 20 08:30:37 crc kubenswrapper[4971]: E0320 08:30:37.541136 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerName="extract-utilities" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.541145 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerName="extract-utilities" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.541507 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9d0182-c4af-4a57-9890-2e48896a8df4" containerName="registry-server" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.548745 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.562270 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.562578 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.563267 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xmtdc" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.563305 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.571956 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6648865bb9-ps7cp"] Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.579175 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86ffc6867-ffg6c"] Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.580533 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.583434 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.594440 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e48a8d8-1745-4c6c-88d1-971cb73e3908-config\") pod \"dnsmasq-dns-6648865bb9-ps7cp\" (UID: \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\") " pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.594541 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbrs\" (UniqueName: \"kubernetes.io/projected/9e48a8d8-1745-4c6c-88d1-971cb73e3908-kube-api-access-8wbrs\") pod \"dnsmasq-dns-6648865bb9-ps7cp\" (UID: \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\") " pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.610393 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86ffc6867-ffg6c"] Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.695788 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlmgv\" (UniqueName: \"kubernetes.io/projected/5e813b87-6cd8-460b-ae03-dfa481872f39-kube-api-access-nlmgv\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.696048 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbrs\" (UniqueName: \"kubernetes.io/projected/9e48a8d8-1745-4c6c-88d1-971cb73e3908-kube-api-access-8wbrs\") pod \"dnsmasq-dns-6648865bb9-ps7cp\" (UID: \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\") " 
pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.696211 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-config\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.696301 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e48a8d8-1745-4c6c-88d1-971cb73e3908-config\") pod \"dnsmasq-dns-6648865bb9-ps7cp\" (UID: \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\") " pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.696386 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-dns-svc\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.697240 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e48a8d8-1745-4c6c-88d1-971cb73e3908-config\") pod \"dnsmasq-dns-6648865bb9-ps7cp\" (UID: \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\") " pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.705109 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6648865bb9-ps7cp"] Mar 20 08:30:37 crc kubenswrapper[4971]: E0320 08:30:37.707873 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8wbrs], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" podUID="9e48a8d8-1745-4c6c-88d1-971cb73e3908" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.748789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbrs\" (UniqueName: \"kubernetes.io/projected/9e48a8d8-1745-4c6c-88d1-971cb73e3908-kube-api-access-8wbrs\") pod \"dnsmasq-dns-6648865bb9-ps7cp\" (UID: \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\") " pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.773522 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5459c8c9f5-kdvrf"] Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.782829 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.793619 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5459c8c9f5-kdvrf"] Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.797318 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-config\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.797364 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-dns-svc\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.797454 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlmgv\" (UniqueName: 
\"kubernetes.io/projected/5e813b87-6cd8-460b-ae03-dfa481872f39-kube-api-access-nlmgv\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.798674 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-config\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.799315 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-dns-svc\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.821368 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlmgv\" (UniqueName: \"kubernetes.io/projected/5e813b87-6cd8-460b-ae03-dfa481872f39-kube-api-access-nlmgv\") pod \"dnsmasq-dns-86ffc6867-ffg6c\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.899382 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-config\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.899425 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5cn\" (UniqueName: 
\"kubernetes.io/projected/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-kube-api-access-tg5cn\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.899488 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-dns-svc\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:37 crc kubenswrapper[4971]: I0320 08:30:37.903998 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.003351 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-dns-svc\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.003726 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-config\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.003857 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg5cn\" (UniqueName: \"kubernetes.io/projected/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-kube-api-access-tg5cn\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 
08:30:38.004758 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-dns-svc\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.004961 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-config\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.032190 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5459c8c9f5-kdvrf"] Mar 20 08:30:38 crc kubenswrapper[4971]: E0320 08:30:38.032951 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-tg5cn], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" podUID="f18f3000-93ad-4c2a-8fd4-de3b748b35d1" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.040784 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg5cn\" (UniqueName: \"kubernetes.io/projected/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-kube-api-access-tg5cn\") pod \"dnsmasq-dns-5459c8c9f5-kdvrf\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.084237 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685785d49f-bbgzq"] Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.085473 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.097499 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685785d49f-bbgzq"] Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.170563 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.170954 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.179841 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.187924 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.206565 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-dns-svc\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.206640 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-config\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.206674 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktqmg\" (UniqueName: 
\"kubernetes.io/projected/329e0d78-d60d-4745-b5c0-0e3f699b3216-kube-api-access-ktqmg\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.313443 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-dns-svc\") pod \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.313541 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg5cn\" (UniqueName: \"kubernetes.io/projected/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-kube-api-access-tg5cn\") pod \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.313578 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-config\") pod \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\" (UID: \"f18f3000-93ad-4c2a-8fd4-de3b748b35d1\") " Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.313625 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e48a8d8-1745-4c6c-88d1-971cb73e3908-config\") pod \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\" (UID: \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\") " Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.313662 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wbrs\" (UniqueName: \"kubernetes.io/projected/9e48a8d8-1745-4c6c-88d1-971cb73e3908-kube-api-access-8wbrs\") pod \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\" (UID: \"9e48a8d8-1745-4c6c-88d1-971cb73e3908\") " Mar 20 08:30:38 crc 
kubenswrapper[4971]: I0320 08:30:38.313862 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-dns-svc\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.313906 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-config\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.313929 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktqmg\" (UniqueName: \"kubernetes.io/projected/329e0d78-d60d-4745-b5c0-0e3f699b3216-kube-api-access-ktqmg\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.314763 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f18f3000-93ad-4c2a-8fd4-de3b748b35d1" (UID: "f18f3000-93ad-4c2a-8fd4-de3b748b35d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.315632 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-config" (OuterVolumeSpecName: "config") pod "f18f3000-93ad-4c2a-8fd4-de3b748b35d1" (UID: "f18f3000-93ad-4c2a-8fd4-de3b748b35d1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.315995 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e48a8d8-1745-4c6c-88d1-971cb73e3908-config" (OuterVolumeSpecName: "config") pod "9e48a8d8-1745-4c6c-88d1-971cb73e3908" (UID: "9e48a8d8-1745-4c6c-88d1-971cb73e3908"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.316802 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-dns-svc\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.318049 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-config\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.321042 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-kube-api-access-tg5cn" (OuterVolumeSpecName: "kube-api-access-tg5cn") pod "f18f3000-93ad-4c2a-8fd4-de3b748b35d1" (UID: "f18f3000-93ad-4c2a-8fd4-de3b748b35d1"). InnerVolumeSpecName "kube-api-access-tg5cn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.321781 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e48a8d8-1745-4c6c-88d1-971cb73e3908-kube-api-access-8wbrs" (OuterVolumeSpecName: "kube-api-access-8wbrs") pod "9e48a8d8-1745-4c6c-88d1-971cb73e3908" (UID: "9e48a8d8-1745-4c6c-88d1-971cb73e3908"). InnerVolumeSpecName "kube-api-access-8wbrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.330462 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktqmg\" (UniqueName: \"kubernetes.io/projected/329e0d78-d60d-4745-b5c0-0e3f699b3216-kube-api-access-ktqmg\") pod \"dnsmasq-dns-685785d49f-bbgzq\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.415916 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wbrs\" (UniqueName: \"kubernetes.io/projected/9e48a8d8-1745-4c6c-88d1-971cb73e3908-kube-api-access-8wbrs\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.415960 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.415972 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg5cn\" (UniqueName: \"kubernetes.io/projected/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-kube-api-access-tg5cn\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.415993 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f18f3000-93ad-4c2a-8fd4-de3b748b35d1-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:38 
crc kubenswrapper[4971]: I0320 08:30:38.416004 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e48a8d8-1745-4c6c-88d1-971cb73e3908-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.418466 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.494425 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86ffc6867-ffg6c"] Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.879110 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.885562 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.895357 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.897087 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.897237 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685785d49f-bbgzq"] Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.897333 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5zq6x" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.897488 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 08:30:38 crc kubenswrapper[4971]: I0320 08:30:38.909680 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 08:30:38 crc kubenswrapper[4971]: 
I0320 08:30:38.930295 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044598 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044657 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044684 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044726 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec11b39f-d056-4dc3-a449-612cfac34c54-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044749 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frzk2\" (UniqueName: 
\"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-kube-api-access-frzk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044767 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec11b39f-d056-4dc3-a449-612cfac34c54-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044816 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044851 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.044868 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146145 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146203 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146229 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146282 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec11b39f-d056-4dc3-a449-612cfac34c54-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146310 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frzk2\" (UniqueName: \"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-kube-api-access-frzk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146338 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/ec11b39f-d056-4dc3-a449-612cfac34c54-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146368 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146406 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.146432 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.147355 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.147508 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.148933 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.149225 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.153069 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.154057 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec11b39f-d056-4dc3-a449-612cfac34c54-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.155008 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.155053 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/419840093af8e828c53552d4afc15e982c99c820b589ef57a0e61c257053cbd7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.169711 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frzk2\" (UniqueName: \"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-kube-api-access-frzk2\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.182183 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec11b39f-d056-4dc3-a449-612cfac34c54-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.209132 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.217355 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.222266 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.224581 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.227416 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5952g" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.227823 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.228104 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.228539 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.235308 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.246295 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.260478 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" event={"ID":"5e813b87-6cd8-460b-ae03-dfa481872f39","Type":"ContainerStarted","Data":"209bc9ff2e27dd077b9e1f91083860b0f4f72d6dc9c2578aa726e80b4fde3e4a"} Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.265539 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6648865bb9-ps7cp" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.265999 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" event={"ID":"329e0d78-d60d-4745-b5c0-0e3f699b3216","Type":"ContainerStarted","Data":"57a4213a6b8d519af0a34b03a3fa578d6b8c764a7b8ac58ee7ee56b4fb0acaea"} Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.266099 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5459c8c9f5-kdvrf" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.346418 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5459c8c9f5-kdvrf"] Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350441 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350492 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350536 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5l4f\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-kube-api-access-h5l4f\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350564 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350589 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5fe79aa1-d952-4250-8c28-171206f10655-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350633 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350669 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5fe79aa1-d952-4250-8c28-171206f10655-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350712 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.350737 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.363116 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5459c8c9f5-kdvrf"] Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.408402 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6648865bb9-ps7cp"] Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.411126 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6648865bb9-ps7cp"] Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454716 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454758 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454818 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454836 
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5l4f\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-kube-api-access-h5l4f\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454885 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454903 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5fe79aa1-d952-4250-8c28-171206f10655-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.454949 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/5fe79aa1-d952-4250-8c28-171206f10655-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.456217 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.458768 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5fe79aa1-d952-4250-8c28-171206f10655-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.459366 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.460031 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5fe79aa1-d952-4250-8c28-171206f10655-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.460353 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " 
pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.460527 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.460810 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.460857 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.460907 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/26823bc6c9a8531b3afbaac866627afd8118e6588c53f9fde37dcff0bcf28a4f/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.475959 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5l4f\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-kube-api-access-h5l4f\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.505995 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"rabbitmq-server-0\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.551590 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:30:39 crc kubenswrapper[4971]: I0320 08:30:39.743306 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.063887 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:30:40 crc kubenswrapper[4971]: W0320 08:30:40.065677 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe79aa1_d952_4250_8c28_171206f10655.slice/crio-1768e0b9697d17c877bb05e84b59d6d2c123bce09a35271e6041f9411a14eacd WatchSource:0}: Error finding container 1768e0b9697d17c877bb05e84b59d6d2c123bce09a35271e6041f9411a14eacd: Status 404 returned error can't find the container with id 1768e0b9697d17c877bb05e84b59d6d2c123bce09a35271e6041f9411a14eacd Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.074657 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.075856 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.080187 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.080355 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.080487 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.080897 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-npkvm" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.084176 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.086376 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.163706 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d6c9df9-b2df-4747-a0c7-0a6684689438-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.163752 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fb56e8ce-ef02-4406-8580-ab0633f14215\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb56e8ce-ef02-4406-8580-ab0633f14215\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.163775 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.163792 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.164155 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5xd\" (UniqueName: \"kubernetes.io/projected/8d6c9df9-b2df-4747-a0c7-0a6684689438-kube-api-access-df5xd\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.164201 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d6c9df9-b2df-4747-a0c7-0a6684689438-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.164314 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.164386 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6c9df9-b2df-4747-a0c7-0a6684689438-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.265591 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d6c9df9-b2df-4747-a0c7-0a6684689438-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.265677 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fb56e8ce-ef02-4406-8580-ab0633f14215\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb56e8ce-ef02-4406-8580-ab0633f14215\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.265710 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.265736 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.265782 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5xd\" 
(UniqueName: \"kubernetes.io/projected/8d6c9df9-b2df-4747-a0c7-0a6684689438-kube-api-access-df5xd\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.265807 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d6c9df9-b2df-4747-a0c7-0a6684689438-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.265851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.265888 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6c9df9-b2df-4747-a0c7-0a6684689438-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.267054 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.268330 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d6c9df9-b2df-4747-a0c7-0a6684689438-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.270073 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.271081 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d6c9df9-b2df-4747-a0c7-0a6684689438-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.274021 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.274073 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fb56e8ce-ef02-4406-8580-ab0633f14215\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb56e8ce-ef02-4406-8580-ab0633f14215\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/464d02d4bd2f8cee63d7523f59c1cdcf1e179aea958d0563f826e3a262a884e0/globalmount\"" pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.277006 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d6c9df9-b2df-4747-a0c7-0a6684689438-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.278343 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6c9df9-b2df-4747-a0c7-0a6684689438-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.288660 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5xd\" (UniqueName: \"kubernetes.io/projected/8d6c9df9-b2df-4747-a0c7-0a6684689438-kube-api-access-df5xd\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.299222 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5fe79aa1-d952-4250-8c28-171206f10655","Type":"ContainerStarted","Data":"1768e0b9697d17c877bb05e84b59d6d2c123bce09a35271e6041f9411a14eacd"} Mar 20 
08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.308856 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec11b39f-d056-4dc3-a449-612cfac34c54","Type":"ContainerStarted","Data":"c51a176782ef8896011627f13991891395b1d9a214aa0f65170d34ffabc7b360"} Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.351598 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fb56e8ce-ef02-4406-8580-ab0633f14215\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fb56e8ce-ef02-4406-8580-ab0633f14215\") pod \"openstack-galera-0\" (UID: \"8d6c9df9-b2df-4747-a0c7-0a6684689438\") " pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.400748 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.641766 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.642625 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.644808 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-r6dmw" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.647917 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.676810 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.763750 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e48a8d8-1745-4c6c-88d1-971cb73e3908" path="/var/lib/kubelet/pods/9e48a8d8-1745-4c6c-88d1-971cb73e3908/volumes" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.764461 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f18f3000-93ad-4c2a-8fd4-de3b748b35d1" path="/var/lib/kubelet/pods/f18f3000-93ad-4c2a-8fd4-de3b748b35d1/volumes" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.777522 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0878a99-9a41-4556-a72f-21fb18a5761d-kolla-config\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.777570 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv9pj\" (UniqueName: \"kubernetes.io/projected/d0878a99-9a41-4556-a72f-21fb18a5761d-kube-api-access-cv9pj\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.777638 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d0878a99-9a41-4556-a72f-21fb18a5761d-config-data\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.878987 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0878a99-9a41-4556-a72f-21fb18a5761d-kolla-config\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.880055 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv9pj\" (UniqueName: \"kubernetes.io/projected/d0878a99-9a41-4556-a72f-21fb18a5761d-kube-api-access-cv9pj\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.880145 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0878a99-9a41-4556-a72f-21fb18a5761d-config-data\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.880650 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0878a99-9a41-4556-a72f-21fb18a5761d-kolla-config\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.881033 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0878a99-9a41-4556-a72f-21fb18a5761d-config-data\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.897814 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv9pj\" (UniqueName: \"kubernetes.io/projected/d0878a99-9a41-4556-a72f-21fb18a5761d-kube-api-access-cv9pj\") pod \"memcached-0\" (UID: \"d0878a99-9a41-4556-a72f-21fb18a5761d\") " pod="openstack/memcached-0" Mar 20 08:30:40 crc kubenswrapper[4971]: I0320 08:30:40.998879 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.197129 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.319799 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d6c9df9-b2df-4747-a0c7-0a6684689438","Type":"ContainerStarted","Data":"0c664427c45eeb163bf3b10cf75dc766c928adf04eb0772ce8d3857d4d257c4f"} Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.498165 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.639894 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.641796 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.645131 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.645141 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.646561 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-bx9cg" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.646754 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.650501 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.695703 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.695750 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/52099a51-54ac-4611-89cf-191426ae31d7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.695787 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.695896 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4885fb93-06d0-4725-b3ae-0979b2406a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4885fb93-06d0-4725-b3ae-0979b2406a5d\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.695940 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/52099a51-54ac-4611-89cf-191426ae31d7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.695963 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2nh5\" (UniqueName: \"kubernetes.io/projected/52099a51-54ac-4611-89cf-191426ae31d7-kube-api-access-b2nh5\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.695986 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52099a51-54ac-4611-89cf-191426ae31d7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.696042 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.797767 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.797853 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4885fb93-06d0-4725-b3ae-0979b2406a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4885fb93-06d0-4725-b3ae-0979b2406a5d\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.797879 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/52099a51-54ac-4611-89cf-191426ae31d7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.797901 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2nh5\" (UniqueName: \"kubernetes.io/projected/52099a51-54ac-4611-89cf-191426ae31d7-kube-api-access-b2nh5\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.797924 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52099a51-54ac-4611-89cf-191426ae31d7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.797966 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.797985 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.798005 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/52099a51-54ac-4611-89cf-191426ae31d7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.798378 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/52099a51-54ac-4611-89cf-191426ae31d7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.799881 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.800297 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.800933 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/52099a51-54ac-4611-89cf-191426ae31d7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.804018 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.804200 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4885fb93-06d0-4725-b3ae-0979b2406a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4885fb93-06d0-4725-b3ae-0979b2406a5d\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/999506ba7a073d40de85fffd62db1d726a47216699a3b63a25df4d38ecb93dd2/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.820519 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/52099a51-54ac-4611-89cf-191426ae31d7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.821065 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52099a51-54ac-4611-89cf-191426ae31d7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.824300 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2nh5\" (UniqueName: \"kubernetes.io/projected/52099a51-54ac-4611-89cf-191426ae31d7-kube-api-access-b2nh5\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.871004 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4885fb93-06d0-4725-b3ae-0979b2406a5d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4885fb93-06d0-4725-b3ae-0979b2406a5d\") pod \"openstack-cell1-galera-0\" (UID: \"52099a51-54ac-4611-89cf-191426ae31d7\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:41 crc kubenswrapper[4971]: I0320 08:30:41.980448 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 08:30:42 crc kubenswrapper[4971]: I0320 08:30:42.331254 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d0878a99-9a41-4556-a72f-21fb18a5761d","Type":"ContainerStarted","Data":"fa23fcaffa35247291611e6045f280b17c1e894b328ec169231798b8b3ac6dcf"} Mar 20 08:30:42 crc kubenswrapper[4971]: I0320 08:30:42.415726 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:30:42 crc kubenswrapper[4971]: W0320 08:30:42.428945 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52099a51_54ac_4611_89cf_191426ae31d7.slice/crio-b1eed62a5a3fa25d9132f5184878ca4631554bf321f5ed30dcc9084b5fba70ea WatchSource:0}: Error finding container b1eed62a5a3fa25d9132f5184878ca4631554bf321f5ed30dcc9084b5fba70ea: Status 404 returned error can't find the container with id b1eed62a5a3fa25d9132f5184878ca4631554bf321f5ed30dcc9084b5fba70ea Mar 20 08:30:44 crc kubenswrapper[4971]: I0320 08:30:43.835094 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"52099a51-54ac-4611-89cf-191426ae31d7","Type":"ContainerStarted","Data":"b1eed62a5a3fa25d9132f5184878ca4631554bf321f5ed30dcc9084b5fba70ea"} Mar 20 08:30:58 crc kubenswrapper[4971]: I0320 08:30:58.570444 4971 scope.go:117] "RemoveContainer" containerID="22227758c8b92949f49a89b6285139e0241a86e83430d9249dec0fa1233e8e02" Mar 20 08:31:05 crc kubenswrapper[4971]: E0320 08:31:05.626420 4971 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 08:31:05 crc kubenswrapper[4971]: E0320 08:31:05.627173 4971 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 08:31:05 crc kubenswrapper[4971]: E0320 08:31:05.627401 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:39fc4cb70f516d8e9b48225bc0a253ef,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b2nh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,Mo
untPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(52099a51-54ac-4611-89cf-191426ae31d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:31:05 crc kubenswrapper[4971]: E0320 08:31:05.629080 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="52099a51-54ac-4611-89cf-191426ae31d7" Mar 20 08:31:05 crc kubenswrapper[4971]: I0320 08:31:05.631828 4971 scope.go:117] "RemoveContainer" containerID="9c40445e2e759b79bd7e1ad9c220e706a67500424068e12b695171e762e58b44" Mar 20 08:31:06 crc kubenswrapper[4971]: E0320 08:31:06.412166 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:39fc4cb70f516d8e9b48225bc0a253ef\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="52099a51-54ac-4611-89cf-191426ae31d7" Mar 20 08:31:06 crc kubenswrapper[4971]: E0320 08:31:06.466939 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 08:31:06 crc kubenswrapper[4971]: E0320 08:31:06.466992 4971 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 08:31:06 crc kubenswrapper[4971]: E0320 08:31:06.467123 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:39fc4cb70f516d8e9b48225bc0a253ef,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n689hb7hdbh54fh65bh594h5d8h58h6bh594h558h87h5b5h674h66fh54bhcfh579h88h8bhbchf6h6bh65h57bhd4h644hbch68dh5c8h67fh58cq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cv9pj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},}
,LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(d0878a99-9a41-4556-a72f-21fb18a5761d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:31:06 crc kubenswrapper[4971]: E0320 08:31:06.468223 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="d0878a99-9a41-4556-a72f-21fb18a5761d" Mar 20 08:31:07 crc kubenswrapper[4971]: E0320 08:31:07.233800 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 08:31:07 crc kubenswrapper[4971]: E0320 08:31:07.234296 
4971 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 08:31:07 crc kubenswrapper[4971]: E0320 08:31:07.234506 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:39fc4cb70f516d8e9b48225bc0a253ef,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h54dhb7h666h69h76h59ch55ch65ch596h8h79h5c8h57hc8hfch5d7h697h79h698h5fch644hf9h54chbfh655hfchcbh5f8h646h5f7h89q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlmgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Priv
ileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86ffc6867-ffg6c_openstack(5e813b87-6cd8-460b-ae03-dfa481872f39): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:31:07 crc kubenswrapper[4971]: E0320 08:31:07.235770 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" podUID="5e813b87-6cd8-460b-ae03-dfa481872f39" Mar 20 08:31:07 crc kubenswrapper[4971]: E0320 08:31:07.416840 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:39fc4cb70f516d8e9b48225bc0a253ef\\\"\"" pod="openstack/memcached-0" podUID="d0878a99-9a41-4556-a72f-21fb18a5761d" Mar 20 08:31:07 crc kubenswrapper[4971]: E0320 08:31:07.597483 4971 mount_linux.go:282] Mount failed: exit status 32 Mar 20 08:31:07 crc kubenswrapper[4971]: Mounting command: mount Mar 20 08:31:07 crc kubenswrapper[4971]: Mounting arguments: --no-canonicalize -o bind /proc/4971/fd/25 /var/lib/kubelet/pods/5e813b87-6cd8-460b-ae03-dfa481872f39/volume-subpaths/dns-svc/init/1 Mar 20 08:31:07 crc kubenswrapper[4971]: Output: mount: /var/lib/kubelet/pods/5e813b87-6cd8-460b-ae03-dfa481872f39/volume-subpaths/dns-svc/init/1: mount(2) system call failed: No such file or 
directory. Mar 20 08:31:08 crc kubenswrapper[4971]: E0320 08:31:08.018337 4971 kubelet_pods.go:349] "Failed to prepare subPath for volumeMount of the container" err=< Mar 20 08:31:08 crc kubenswrapper[4971]: error mounting /var/lib/kubelet/pods/5e813b87-6cd8-460b-ae03-dfa481872f39/volumes/kubernetes.io~configmap/dns-svc/..2026_03_20_08_30_37.535954436/dns-svc: mount failed: exit status 32 Mar 20 08:31:08 crc kubenswrapper[4971]: Mounting command: mount Mar 20 08:31:08 crc kubenswrapper[4971]: Mounting arguments: --no-canonicalize -o bind /proc/4971/fd/25 /var/lib/kubelet/pods/5e813b87-6cd8-460b-ae03-dfa481872f39/volume-subpaths/dns-svc/init/1 Mar 20 08:31:08 crc kubenswrapper[4971]: Output: mount: /var/lib/kubelet/pods/5e813b87-6cd8-460b-ae03-dfa481872f39/volume-subpaths/dns-svc/init/1: mount(2) system call failed: No such file or directory. Mar 20 08:31:08 crc kubenswrapper[4971]: > containerName="init" volumeMountName="dns-svc" Mar 20 08:31:08 crc kubenswrapper[4971]: E0320 08:31:08.018507 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:39fc4cb70f516d8e9b48225bc0a253ef,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h54dhb7h666h69h76h59ch55ch65ch596h8h79h5c8h57hc8hfch5d7h697h79h698h5fch644hf9h54chbfh655hfchcbh5f8h646h5f7h89q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlmgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86ffc6867-ffg6c_openstack(5e813b87-6cd8-460b-ae03-dfa481872f39): CreateContainerConfigError: failed to prepare subPath for volumeMount \"dns-svc\" of container \"init\"" logger="UnhandledError" Mar 20 08:31:08 crc kubenswrapper[4971]: E0320 08:31:08.019771 4971 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with CreateContainerConfigError: \"failed to prepare subPath for volumeMount \\\"dns-svc\\\" of container \\\"init\\\"\"" pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" podUID="5e813b87-6cd8-460b-ae03-dfa481872f39" Mar 20 08:31:08 crc kubenswrapper[4971]: I0320 08:31:08.424981 4971 generic.go:334] "Generic (PLEG): container finished" podID="329e0d78-d60d-4745-b5c0-0e3f699b3216" containerID="b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564" exitCode=0 Mar 20 08:31:08 crc kubenswrapper[4971]: I0320 08:31:08.425039 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" event={"ID":"329e0d78-d60d-4745-b5c0-0e3f699b3216","Type":"ContainerDied","Data":"b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564"} Mar 20 08:31:08 crc kubenswrapper[4971]: I0320 08:31:08.427511 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d6c9df9-b2df-4747-a0c7-0a6684689438","Type":"ContainerStarted","Data":"e7cb0312eb5164216b7ccc5071a65e10088f380c11591c3bdbd79426e69f22d6"} Mar 20 08:31:09 crc kubenswrapper[4971]: I0320 08:31:09.445747 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5fe79aa1-d952-4250-8c28-171206f10655","Type":"ContainerStarted","Data":"91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d"} Mar 20 08:31:09 crc kubenswrapper[4971]: I0320 08:31:09.449039 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" event={"ID":"329e0d78-d60d-4745-b5c0-0e3f699b3216","Type":"ContainerStarted","Data":"5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670"} Mar 20 08:31:09 crc kubenswrapper[4971]: I0320 08:31:09.449166 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:31:09 crc 
kubenswrapper[4971]: I0320 08:31:09.452654 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec11b39f-d056-4dc3-a449-612cfac34c54","Type":"ContainerStarted","Data":"c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36"} Mar 20 08:31:09 crc kubenswrapper[4971]: I0320 08:31:09.517964 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" podStartSLOduration=3.139072311 podStartE2EDuration="31.517939745s" podCreationTimestamp="2026-03-20 08:30:38 +0000 UTC" firstStartedPulling="2026-03-20 08:30:38.932892176 +0000 UTC m=+6060.912766314" lastFinishedPulling="2026-03-20 08:31:07.31175961 +0000 UTC m=+6089.291633748" observedRunningTime="2026-03-20 08:31:09.516444456 +0000 UTC m=+6091.496318614" watchObservedRunningTime="2026-03-20 08:31:09.517939745 +0000 UTC m=+6091.497813903" Mar 20 08:31:12 crc kubenswrapper[4971]: I0320 08:31:12.475582 4971 generic.go:334] "Generic (PLEG): container finished" podID="8d6c9df9-b2df-4747-a0c7-0a6684689438" containerID="e7cb0312eb5164216b7ccc5071a65e10088f380c11591c3bdbd79426e69f22d6" exitCode=0 Mar 20 08:31:12 crc kubenswrapper[4971]: I0320 08:31:12.475642 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d6c9df9-b2df-4747-a0c7-0a6684689438","Type":"ContainerDied","Data":"e7cb0312eb5164216b7ccc5071a65e10088f380c11591c3bdbd79426e69f22d6"} Mar 20 08:31:13 crc kubenswrapper[4971]: I0320 08:31:13.420401 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:31:13 crc kubenswrapper[4971]: I0320 08:31:13.496334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d6c9df9-b2df-4747-a0c7-0a6684689438","Type":"ContainerStarted","Data":"a1dc8893f74ea60caf94bea1c3f404467de12471f25ce964a2106c1940ff1912"} Mar 20 08:31:13 crc 
kubenswrapper[4971]: I0320 08:31:13.509035 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86ffc6867-ffg6c"] Mar 20 08:31:13 crc kubenswrapper[4971]: I0320 08:31:13.529165 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.569107334 podStartE2EDuration="34.529126255s" podCreationTimestamp="2026-03-20 08:30:39 +0000 UTC" firstStartedPulling="2026-03-20 08:30:41.241465167 +0000 UTC m=+6063.221339305" lastFinishedPulling="2026-03-20 08:31:07.201484088 +0000 UTC m=+6089.181358226" observedRunningTime="2026-03-20 08:31:13.526101546 +0000 UTC m=+6095.505975684" watchObservedRunningTime="2026-03-20 08:31:13.529126255 +0000 UTC m=+6095.509000393" Mar 20 08:31:13 crc kubenswrapper[4971]: I0320 08:31:13.919847 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.013359 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlmgv\" (UniqueName: \"kubernetes.io/projected/5e813b87-6cd8-460b-ae03-dfa481872f39-kube-api-access-nlmgv\") pod \"5e813b87-6cd8-460b-ae03-dfa481872f39\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.013526 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-dns-svc\") pod \"5e813b87-6cd8-460b-ae03-dfa481872f39\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.013579 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-config\") pod \"5e813b87-6cd8-460b-ae03-dfa481872f39\" (UID: \"5e813b87-6cd8-460b-ae03-dfa481872f39\") " Mar 
20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.014004 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e813b87-6cd8-460b-ae03-dfa481872f39" (UID: "5e813b87-6cd8-460b-ae03-dfa481872f39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.017644 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e813b87-6cd8-460b-ae03-dfa481872f39-kube-api-access-nlmgv" (OuterVolumeSpecName: "kube-api-access-nlmgv") pod "5e813b87-6cd8-460b-ae03-dfa481872f39" (UID: "5e813b87-6cd8-460b-ae03-dfa481872f39"). InnerVolumeSpecName "kube-api-access-nlmgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.033102 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-config" (OuterVolumeSpecName: "config") pod "5e813b87-6cd8-460b-ae03-dfa481872f39" (UID: "5e813b87-6cd8-460b-ae03-dfa481872f39"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.115088 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlmgv\" (UniqueName: \"kubernetes.io/projected/5e813b87-6cd8-460b-ae03-dfa481872f39-kube-api-access-nlmgv\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.115118 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.115131 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e813b87-6cd8-460b-ae03-dfa481872f39-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.504837 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" event={"ID":"5e813b87-6cd8-460b-ae03-dfa481872f39","Type":"ContainerDied","Data":"209bc9ff2e27dd077b9e1f91083860b0f4f72d6dc9c2578aa726e80b4fde3e4a"} Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.504913 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86ffc6867-ffg6c" Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.573944 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86ffc6867-ffg6c"] Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.583475 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86ffc6867-ffg6c"] Mar 20 08:31:14 crc kubenswrapper[4971]: I0320 08:31:14.744112 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e813b87-6cd8-460b-ae03-dfa481872f39" path="/var/lib/kubelet/pods/5e813b87-6cd8-460b-ae03-dfa481872f39/volumes" Mar 20 08:31:17 crc kubenswrapper[4971]: I0320 08:31:17.535526 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"52099a51-54ac-4611-89cf-191426ae31d7","Type":"ContainerStarted","Data":"2fbf39b03993c8b645055a2492d9c2f1c30c6a9a0788efc2349eef94945783db"} Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.162982 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.163430 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.401723 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.402051 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.522557 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.561769 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d0878a99-9a41-4556-a72f-21fb18a5761d","Type":"ContainerStarted","Data":"6208f7da62d8d0e3992f815eb3afb41941aaf042542306a1ee581b7ee9ecfdcc"} Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.562975 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.566023 4971 generic.go:334] "Generic (PLEG): container finished" podID="52099a51-54ac-4611-89cf-191426ae31d7" containerID="2fbf39b03993c8b645055a2492d9c2f1c30c6a9a0788efc2349eef94945783db" exitCode=0 Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.566070 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"52099a51-54ac-4611-89cf-191426ae31d7","Type":"ContainerDied","Data":"2fbf39b03993c8b645055a2492d9c2f1c30c6a9a0788efc2349eef94945783db"} Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.603573 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.240933384 podStartE2EDuration="40.603544329s" podCreationTimestamp="2026-03-20 08:30:40 +0000 UTC" firstStartedPulling="2026-03-20 08:30:41.52995383 +0000 UTC m=+6063.509827968" lastFinishedPulling="2026-03-20 08:31:19.892564775 +0000 UTC m=+6101.872438913" observedRunningTime="2026-03-20 08:31:20.59705257 +0000 UTC m=+6102.576926768" watchObservedRunningTime="2026-03-20 08:31:20.603544329 +0000 UTC m=+6102.583418507" Mar 20 08:31:20 crc kubenswrapper[4971]: I0320 08:31:20.659793 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-galera-0" Mar 20 08:31:21 crc kubenswrapper[4971]: I0320 08:31:21.580529 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"52099a51-54ac-4611-89cf-191426ae31d7","Type":"ContainerStarted","Data":"5d96e2532de531d5fa5ea7e8bd4b26806a682d123826db0c879e5f3aa78d37b6"} Mar 20 08:31:21 crc kubenswrapper[4971]: I0320 08:31:21.610744 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371995.244047 podStartE2EDuration="41.610727759s" podCreationTimestamp="2026-03-20 08:30:40 +0000 UTC" firstStartedPulling="2026-03-20 08:30:42.434032674 +0000 UTC m=+6064.413906812" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:31:21.605542344 +0000 UTC m=+6103.585416472" watchObservedRunningTime="2026-03-20 08:31:21.610727759 +0000 UTC m=+6103.590601897" Mar 20 08:31:21 crc kubenswrapper[4971]: I0320 08:31:21.981243 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 08:31:21 crc kubenswrapper[4971]: I0320 08:31:21.981418 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.123383 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-42gdx"] Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.126478 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.139998 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42gdx"] Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.184461 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cst9k\" (UniqueName: \"kubernetes.io/projected/745cb060-ddd0-464e-9264-2b1a13c58c47-kube-api-access-cst9k\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.184570 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-catalog-content\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.185198 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-utilities\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.287158 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-utilities\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.287251 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cst9k\" (UniqueName: \"kubernetes.io/projected/745cb060-ddd0-464e-9264-2b1a13c58c47-kube-api-access-cst9k\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.287328 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-catalog-content\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.287796 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-utilities\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.287974 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-catalog-content\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.313740 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cst9k\" (UniqueName: \"kubernetes.io/projected/745cb060-ddd0-464e-9264-2b1a13c58c47-kube-api-access-cst9k\") pod \"redhat-marketplace-42gdx\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.462251 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:24 crc kubenswrapper[4971]: I0320 08:31:24.907060 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42gdx"] Mar 20 08:31:24 crc kubenswrapper[4971]: W0320 08:31:24.931112 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod745cb060_ddd0_464e_9264_2b1a13c58c47.slice/crio-1103ecb8249daca8e3fc00b829404af2a9e21ff60f3418ff2932f7e2eb59194f WatchSource:0}: Error finding container 1103ecb8249daca8e3fc00b829404af2a9e21ff60f3418ff2932f7e2eb59194f: Status 404 returned error can't find the container with id 1103ecb8249daca8e3fc00b829404af2a9e21ff60f3418ff2932f7e2eb59194f Mar 20 08:31:25 crc kubenswrapper[4971]: I0320 08:31:25.628844 4971 generic.go:334] "Generic (PLEG): container finished" podID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerID="5b7a13b93e9dd5097aa677624986d0c9699612613c96286245b0b1109ea7ea33" exitCode=0 Mar 20 08:31:25 crc kubenswrapper[4971]: I0320 08:31:25.628935 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42gdx" event={"ID":"745cb060-ddd0-464e-9264-2b1a13c58c47","Type":"ContainerDied","Data":"5b7a13b93e9dd5097aa677624986d0c9699612613c96286245b0b1109ea7ea33"} Mar 20 08:31:25 crc kubenswrapper[4971]: I0320 08:31:25.628996 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42gdx" event={"ID":"745cb060-ddd0-464e-9264-2b1a13c58c47","Type":"ContainerStarted","Data":"1103ecb8249daca8e3fc00b829404af2a9e21ff60f3418ff2932f7e2eb59194f"} Mar 20 08:31:26 crc kubenswrapper[4971]: I0320 08:31:26.003891 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 08:31:26 crc kubenswrapper[4971]: I0320 08:31:26.092079 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-cell1-galera-0" Mar 20 08:31:26 crc kubenswrapper[4971]: I0320 08:31:26.179378 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 08:31:26 crc kubenswrapper[4971]: I0320 08:31:26.638215 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42gdx" event={"ID":"745cb060-ddd0-464e-9264-2b1a13c58c47","Type":"ContainerStarted","Data":"5ba84c28ac39a925135f8568223e99dd097788b82af19c0b707fe26aea468adb"} Mar 20 08:31:27 crc kubenswrapper[4971]: I0320 08:31:27.648807 4971 generic.go:334] "Generic (PLEG): container finished" podID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerID="5ba84c28ac39a925135f8568223e99dd097788b82af19c0b707fe26aea468adb" exitCode=0 Mar 20 08:31:27 crc kubenswrapper[4971]: I0320 08:31:27.648894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42gdx" event={"ID":"745cb060-ddd0-464e-9264-2b1a13c58c47","Type":"ContainerDied","Data":"5ba84c28ac39a925135f8568223e99dd097788b82af19c0b707fe26aea468adb"} Mar 20 08:31:28 crc kubenswrapper[4971]: I0320 08:31:28.662824 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42gdx" event={"ID":"745cb060-ddd0-464e-9264-2b1a13c58c47","Type":"ContainerStarted","Data":"8c197e06198fba4a8cecb639ffba5bfc7134b40f91c810531590446812db478d"} Mar 20 08:31:28 crc kubenswrapper[4971]: I0320 08:31:28.686523 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-42gdx" podStartSLOduration=1.972630663 podStartE2EDuration="4.686506129s" podCreationTimestamp="2026-03-20 08:31:24 +0000 UTC" firstStartedPulling="2026-03-20 08:31:25.631111359 +0000 UTC m=+6107.610985537" lastFinishedPulling="2026-03-20 08:31:28.344986835 +0000 UTC m=+6110.324861003" observedRunningTime="2026-03-20 08:31:28.6849808 +0000 UTC m=+6110.664854948" 
watchObservedRunningTime="2026-03-20 08:31:28.686506129 +0000 UTC m=+6110.666380267" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.059114 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rxwbm"] Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.060358 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.063237 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.086773 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rxwbm"] Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.163468 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-operator-scripts\") pod \"root-account-create-update-rxwbm\" (UID: \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\") " pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.163730 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xj98\" (UniqueName: \"kubernetes.io/projected/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-kube-api-access-2xj98\") pod \"root-account-create-update-rxwbm\" (UID: \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\") " pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.265065 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xj98\" (UniqueName: \"kubernetes.io/projected/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-kube-api-access-2xj98\") pod \"root-account-create-update-rxwbm\" (UID: \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\") " 
pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.265250 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-operator-scripts\") pod \"root-account-create-update-rxwbm\" (UID: \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\") " pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.266302 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-operator-scripts\") pod \"root-account-create-update-rxwbm\" (UID: \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\") " pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.289041 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xj98\" (UniqueName: \"kubernetes.io/projected/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-kube-api-access-2xj98\") pod \"root-account-create-update-rxwbm\" (UID: \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\") " pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.381633 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:29 crc kubenswrapper[4971]: I0320 08:31:29.833012 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rxwbm"] Mar 20 08:31:29 crc kubenswrapper[4971]: W0320 08:31:29.847122 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4e3c5f0_02ca_4f3d_8f3f_204929b4096a.slice/crio-0add1e3c297d8c91c4320f63089f80d74b526005a122ce9376dd5255ac02612b WatchSource:0}: Error finding container 0add1e3c297d8c91c4320f63089f80d74b526005a122ce9376dd5255ac02612b: Status 404 returned error can't find the container with id 0add1e3c297d8c91c4320f63089f80d74b526005a122ce9376dd5255ac02612b Mar 20 08:31:30 crc kubenswrapper[4971]: I0320 08:31:30.681709 4971 generic.go:334] "Generic (PLEG): container finished" podID="c4e3c5f0-02ca-4f3d-8f3f-204929b4096a" containerID="a3317d43c773393c90034edfe3cd67415c27e14fe9be62a0da9c45a67d69d291" exitCode=0 Mar 20 08:31:30 crc kubenswrapper[4971]: I0320 08:31:30.681809 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rxwbm" event={"ID":"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a","Type":"ContainerDied","Data":"a3317d43c773393c90034edfe3cd67415c27e14fe9be62a0da9c45a67d69d291"} Mar 20 08:31:30 crc kubenswrapper[4971]: I0320 08:31:30.682088 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rxwbm" event={"ID":"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a","Type":"ContainerStarted","Data":"0add1e3c297d8c91c4320f63089f80d74b526005a122ce9376dd5255ac02612b"} Mar 20 08:31:31 crc kubenswrapper[4971]: I0320 08:31:31.957800 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.115032 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-operator-scripts\") pod \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\" (UID: \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\") " Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.115116 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xj98\" (UniqueName: \"kubernetes.io/projected/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-kube-api-access-2xj98\") pod \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\" (UID: \"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a\") " Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.116014 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4e3c5f0-02ca-4f3d-8f3f-204929b4096a" (UID: "c4e3c5f0-02ca-4f3d-8f3f-204929b4096a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.129095 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-kube-api-access-2xj98" (OuterVolumeSpecName: "kube-api-access-2xj98") pod "c4e3c5f0-02ca-4f3d-8f3f-204929b4096a" (UID: "c4e3c5f0-02ca-4f3d-8f3f-204929b4096a"). InnerVolumeSpecName "kube-api-access-2xj98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.216704 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xj98\" (UniqueName: \"kubernetes.io/projected/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-kube-api-access-2xj98\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.216749 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.697966 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rxwbm" event={"ID":"c4e3c5f0-02ca-4f3d-8f3f-204929b4096a","Type":"ContainerDied","Data":"0add1e3c297d8c91c4320f63089f80d74b526005a122ce9376dd5255ac02612b"} Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.698018 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rxwbm" Mar 20 08:31:32 crc kubenswrapper[4971]: I0320 08:31:32.698023 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0add1e3c297d8c91c4320f63089f80d74b526005a122ce9376dd5255ac02612b" Mar 20 08:31:34 crc kubenswrapper[4971]: I0320 08:31:34.463489 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:34 crc kubenswrapper[4971]: I0320 08:31:34.464048 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:34 crc kubenswrapper[4971]: I0320 08:31:34.533634 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:34 crc kubenswrapper[4971]: I0320 08:31:34.788487 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:34 crc kubenswrapper[4971]: I0320 08:31:34.847520 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42gdx"] Mar 20 08:31:35 crc kubenswrapper[4971]: I0320 08:31:35.585961 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rxwbm"] Mar 20 08:31:35 crc kubenswrapper[4971]: I0320 08:31:35.593596 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rxwbm"] Mar 20 08:31:36 crc kubenswrapper[4971]: I0320 08:31:36.737598 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-42gdx" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerName="registry-server" containerID="cri-o://8c197e06198fba4a8cecb639ffba5bfc7134b40f91c810531590446812db478d" gracePeriod=2 Mar 20 08:31:36 crc kubenswrapper[4971]: I0320 08:31:36.751251 4971 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e3c5f0-02ca-4f3d-8f3f-204929b4096a" path="/var/lib/kubelet/pods/c4e3c5f0-02ca-4f3d-8f3f-204929b4096a/volumes" Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.750736 4971 generic.go:334] "Generic (PLEG): container finished" podID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerID="8c197e06198fba4a8cecb639ffba5bfc7134b40f91c810531590446812db478d" exitCode=0 Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.750837 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42gdx" event={"ID":"745cb060-ddd0-464e-9264-2b1a13c58c47","Type":"ContainerDied","Data":"8c197e06198fba4a8cecb639ffba5bfc7134b40f91c810531590446812db478d"} Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.750911 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42gdx" event={"ID":"745cb060-ddd0-464e-9264-2b1a13c58c47","Type":"ContainerDied","Data":"1103ecb8249daca8e3fc00b829404af2a9e21ff60f3418ff2932f7e2eb59194f"} Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.750932 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1103ecb8249daca8e3fc00b829404af2a9e21ff60f3418ff2932f7e2eb59194f" Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.776559 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.842872 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cst9k\" (UniqueName: \"kubernetes.io/projected/745cb060-ddd0-464e-9264-2b1a13c58c47-kube-api-access-cst9k\") pod \"745cb060-ddd0-464e-9264-2b1a13c58c47\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.842915 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-catalog-content\") pod \"745cb060-ddd0-464e-9264-2b1a13c58c47\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.843143 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-utilities\") pod \"745cb060-ddd0-464e-9264-2b1a13c58c47\" (UID: \"745cb060-ddd0-464e-9264-2b1a13c58c47\") " Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.845459 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-utilities" (OuterVolumeSpecName: "utilities") pod "745cb060-ddd0-464e-9264-2b1a13c58c47" (UID: "745cb060-ddd0-464e-9264-2b1a13c58c47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.849769 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/745cb060-ddd0-464e-9264-2b1a13c58c47-kube-api-access-cst9k" (OuterVolumeSpecName: "kube-api-access-cst9k") pod "745cb060-ddd0-464e-9264-2b1a13c58c47" (UID: "745cb060-ddd0-464e-9264-2b1a13c58c47"). InnerVolumeSpecName "kube-api-access-cst9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.896434 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "745cb060-ddd0-464e-9264-2b1a13c58c47" (UID: "745cb060-ddd0-464e-9264-2b1a13c58c47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.944912 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.944945 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cst9k\" (UniqueName: \"kubernetes.io/projected/745cb060-ddd0-464e-9264-2b1a13c58c47-kube-api-access-cst9k\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:37 crc kubenswrapper[4971]: I0320 08:31:37.944956 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/745cb060-ddd0-464e-9264-2b1a13c58c47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:38 crc kubenswrapper[4971]: I0320 08:31:38.759335 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42gdx" Mar 20 08:31:38 crc kubenswrapper[4971]: I0320 08:31:38.813144 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42gdx"] Mar 20 08:31:38 crc kubenswrapper[4971]: I0320 08:31:38.819991 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-42gdx"] Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.624285 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rm8gn"] Mar 20 08:31:40 crc kubenswrapper[4971]: E0320 08:31:40.624823 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e3c5f0-02ca-4f3d-8f3f-204929b4096a" containerName="mariadb-account-create-update" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.624848 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e3c5f0-02ca-4f3d-8f3f-204929b4096a" containerName="mariadb-account-create-update" Mar 20 08:31:40 crc kubenswrapper[4971]: E0320 08:31:40.624874 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerName="registry-server" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.624886 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerName="registry-server" Mar 20 08:31:40 crc kubenswrapper[4971]: E0320 08:31:40.624902 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerName="extract-content" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.624914 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerName="extract-content" Mar 20 08:31:40 crc kubenswrapper[4971]: E0320 08:31:40.624934 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" 
containerName="extract-utilities" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.624945 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerName="extract-utilities" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.625217 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" containerName="registry-server" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.625246 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e3c5f0-02ca-4f3d-8f3f-204929b4096a" containerName="mariadb-account-create-update" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.626684 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.630696 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.658453 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rm8gn"] Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.750033 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="745cb060-ddd0-464e-9264-2b1a13c58c47" path="/var/lib/kubelet/pods/745cb060-ddd0-464e-9264-2b1a13c58c47/volumes" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.777365 4971 generic.go:334] "Generic (PLEG): container finished" podID="5fe79aa1-d952-4250-8c28-171206f10655" containerID="91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d" exitCode=0 Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.777468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5fe79aa1-d952-4250-8c28-171206f10655","Type":"ContainerDied","Data":"91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d"} Mar 
20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.797911 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-operator-scripts\") pod \"root-account-create-update-rm8gn\" (UID: \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\") " pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.797974 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhc6\" (UniqueName: \"kubernetes.io/projected/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-kube-api-access-wnhc6\") pod \"root-account-create-update-rm8gn\" (UID: \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\") " pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.899601 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-operator-scripts\") pod \"root-account-create-update-rm8gn\" (UID: \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\") " pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.899771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnhc6\" (UniqueName: \"kubernetes.io/projected/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-kube-api-access-wnhc6\") pod \"root-account-create-update-rm8gn\" (UID: \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\") " pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.900322 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-operator-scripts\") pod \"root-account-create-update-rm8gn\" (UID: \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\") 
" pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.932169 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhc6\" (UniqueName: \"kubernetes.io/projected/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-kube-api-access-wnhc6\") pod \"root-account-create-update-rm8gn\" (UID: \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\") " pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:40 crc kubenswrapper[4971]: I0320 08:31:40.977930 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.222302 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rm8gn"] Mar 20 08:31:41 crc kubenswrapper[4971]: W0320 08:31:41.223902 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8095ddfe_bc8a_4d45_aaa3_431ade832c0f.slice/crio-f46a3113628b519dd44680792b35b03d7ffe510382059db5101885dce605ee20 WatchSource:0}: Error finding container f46a3113628b519dd44680792b35b03d7ffe510382059db5101885dce605ee20: Status 404 returned error can't find the container with id f46a3113628b519dd44680792b35b03d7ffe510382059db5101885dce605ee20 Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.784835 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5fe79aa1-d952-4250-8c28-171206f10655","Type":"ContainerStarted","Data":"67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e"} Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.785412 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.786318 4971 generic.go:334] "Generic (PLEG): container finished" podID="ec11b39f-d056-4dc3-a449-612cfac34c54" 
containerID="c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36" exitCode=0 Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.786367 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec11b39f-d056-4dc3-a449-612cfac34c54","Type":"ContainerDied","Data":"c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36"} Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.789000 4971 generic.go:334] "Generic (PLEG): container finished" podID="8095ddfe-bc8a-4d45-aaa3-431ade832c0f" containerID="409f257bdfc44c3a17cce22d85603115082b3572de83288a02998d605af2af75" exitCode=0 Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.789029 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rm8gn" event={"ID":"8095ddfe-bc8a-4d45-aaa3-431ade832c0f","Type":"ContainerDied","Data":"409f257bdfc44c3a17cce22d85603115082b3572de83288a02998d605af2af75"} Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.789047 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rm8gn" event={"ID":"8095ddfe-bc8a-4d45-aaa3-431ade832c0f","Type":"ContainerStarted","Data":"f46a3113628b519dd44680792b35b03d7ffe510382059db5101885dce605ee20"} Mar 20 08:31:41 crc kubenswrapper[4971]: I0320 08:31:41.814784 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.722714459 podStartE2EDuration="1m3.814767081s" podCreationTimestamp="2026-03-20 08:30:38 +0000 UTC" firstStartedPulling="2026-03-20 08:30:40.088645005 +0000 UTC m=+6062.068519143" lastFinishedPulling="2026-03-20 08:31:07.180697627 +0000 UTC m=+6089.160571765" observedRunningTime="2026-03-20 08:31:41.812170134 +0000 UTC m=+6123.792044312" watchObservedRunningTime="2026-03-20 08:31:41.814767081 +0000 UTC m=+6123.794641219" Mar 20 08:31:42 crc kubenswrapper[4971]: I0320 08:31:42.800839 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec11b39f-d056-4dc3-a449-612cfac34c54","Type":"ContainerStarted","Data":"4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d"} Mar 20 08:31:42 crc kubenswrapper[4971]: I0320 08:31:42.801399 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:31:42 crc kubenswrapper[4971]: I0320 08:31:42.838973 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.33628873 podStartE2EDuration="1m5.838906203s" podCreationTimestamp="2026-03-20 08:30:37 +0000 UTC" firstStartedPulling="2026-03-20 08:30:39.75782463 +0000 UTC m=+6061.737698768" lastFinishedPulling="2026-03-20 08:31:07.260442103 +0000 UTC m=+6089.240316241" observedRunningTime="2026-03-20 08:31:42.83190047 +0000 UTC m=+6124.811774688" watchObservedRunningTime="2026-03-20 08:31:42.838906203 +0000 UTC m=+6124.818780381" Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.118683 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.243936 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-operator-scripts\") pod \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\" (UID: \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\") " Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.244086 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnhc6\" (UniqueName: \"kubernetes.io/projected/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-kube-api-access-wnhc6\") pod \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\" (UID: \"8095ddfe-bc8a-4d45-aaa3-431ade832c0f\") " Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.244507 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8095ddfe-bc8a-4d45-aaa3-431ade832c0f" (UID: "8095ddfe-bc8a-4d45-aaa3-431ade832c0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.248957 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-kube-api-access-wnhc6" (OuterVolumeSpecName: "kube-api-access-wnhc6") pod "8095ddfe-bc8a-4d45-aaa3-431ade832c0f" (UID: "8095ddfe-bc8a-4d45-aaa3-431ade832c0f"). InnerVolumeSpecName "kube-api-access-wnhc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.345922 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.346234 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnhc6\" (UniqueName: \"kubernetes.io/projected/8095ddfe-bc8a-4d45-aaa3-431ade832c0f-kube-api-access-wnhc6\") on node \"crc\" DevicePath \"\"" Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.813575 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rm8gn" event={"ID":"8095ddfe-bc8a-4d45-aaa3-431ade832c0f","Type":"ContainerDied","Data":"f46a3113628b519dd44680792b35b03d7ffe510382059db5101885dce605ee20"} Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.813647 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rm8gn" Mar 20 08:31:43 crc kubenswrapper[4971]: I0320 08:31:43.813671 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f46a3113628b519dd44680792b35b03d7ffe510382059db5101885dce605ee20" Mar 20 08:31:50 crc kubenswrapper[4971]: I0320 08:31:50.162516 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:31:50 crc kubenswrapper[4971]: I0320 08:31:50.162895 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:31:59 crc kubenswrapper[4971]: I0320 08:31:59.249855 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:31:59 crc kubenswrapper[4971]: I0320 08:31:59.554664 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.129146 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566592-krlb2"] Mar 20 08:32:00 crc kubenswrapper[4971]: E0320 08:32:00.129537 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8095ddfe-bc8a-4d45-aaa3-431ade832c0f" containerName="mariadb-account-create-update" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.129563 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8095ddfe-bc8a-4d45-aaa3-431ade832c0f" containerName="mariadb-account-create-update" Mar 20 08:32:00 
crc kubenswrapper[4971]: I0320 08:32:00.129776 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8095ddfe-bc8a-4d45-aaa3-431ade832c0f" containerName="mariadb-account-create-update" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.130416 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566592-krlb2" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.133786 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.133815 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.133809 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.138547 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566592-krlb2"] Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.247834 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w486c\" (UniqueName: \"kubernetes.io/projected/6d17325d-784b-4210-972e-6a316cde36bf-kube-api-access-w486c\") pod \"auto-csr-approver-29566592-krlb2\" (UID: \"6d17325d-784b-4210-972e-6a316cde36bf\") " pod="openshift-infra/auto-csr-approver-29566592-krlb2" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.350123 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w486c\" (UniqueName: \"kubernetes.io/projected/6d17325d-784b-4210-972e-6a316cde36bf-kube-api-access-w486c\") pod \"auto-csr-approver-29566592-krlb2\" (UID: \"6d17325d-784b-4210-972e-6a316cde36bf\") " pod="openshift-infra/auto-csr-approver-29566592-krlb2" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 
08:32:00.375489 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w486c\" (UniqueName: \"kubernetes.io/projected/6d17325d-784b-4210-972e-6a316cde36bf-kube-api-access-w486c\") pod \"auto-csr-approver-29566592-krlb2\" (UID: \"6d17325d-784b-4210-972e-6a316cde36bf\") " pod="openshift-infra/auto-csr-approver-29566592-krlb2" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.447658 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566592-krlb2" Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.934352 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566592-krlb2"] Mar 20 08:32:00 crc kubenswrapper[4971]: I0320 08:32:00.964995 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566592-krlb2" event={"ID":"6d17325d-784b-4210-972e-6a316cde36bf","Type":"ContainerStarted","Data":"25809c5cf4bc6bfcbfe142f435400f2e3ab7ef63bcf0ae8c5b358301d2a481bf"} Mar 20 08:32:02 crc kubenswrapper[4971]: I0320 08:32:02.986964 4971 generic.go:334] "Generic (PLEG): container finished" podID="6d17325d-784b-4210-972e-6a316cde36bf" containerID="5911248b91fe8356aa7640d293e826b285aa76c1ea5012003162f4cb1e9e79ff" exitCode=0 Mar 20 08:32:02 crc kubenswrapper[4971]: I0320 08:32:02.987048 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566592-krlb2" event={"ID":"6d17325d-784b-4210-972e-6a316cde36bf","Type":"ContainerDied","Data":"5911248b91fe8356aa7640d293e826b285aa76c1ea5012003162f4cb1e9e79ff"} Mar 20 08:32:04 crc kubenswrapper[4971]: I0320 08:32:04.351334 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566592-krlb2" Mar 20 08:32:04 crc kubenswrapper[4971]: I0320 08:32:04.410565 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w486c\" (UniqueName: \"kubernetes.io/projected/6d17325d-784b-4210-972e-6a316cde36bf-kube-api-access-w486c\") pod \"6d17325d-784b-4210-972e-6a316cde36bf\" (UID: \"6d17325d-784b-4210-972e-6a316cde36bf\") " Mar 20 08:32:04 crc kubenswrapper[4971]: I0320 08:32:04.415633 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d17325d-784b-4210-972e-6a316cde36bf-kube-api-access-w486c" (OuterVolumeSpecName: "kube-api-access-w486c") pod "6d17325d-784b-4210-972e-6a316cde36bf" (UID: "6d17325d-784b-4210-972e-6a316cde36bf"). InnerVolumeSpecName "kube-api-access-w486c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:32:04 crc kubenswrapper[4971]: I0320 08:32:04.512355 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w486c\" (UniqueName: \"kubernetes.io/projected/6d17325d-784b-4210-972e-6a316cde36bf-kube-api-access-w486c\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:05 crc kubenswrapper[4971]: I0320 08:32:05.008216 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566592-krlb2" event={"ID":"6d17325d-784b-4210-972e-6a316cde36bf","Type":"ContainerDied","Data":"25809c5cf4bc6bfcbfe142f435400f2e3ab7ef63bcf0ae8c5b358301d2a481bf"} Mar 20 08:32:05 crc kubenswrapper[4971]: I0320 08:32:05.008253 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566592-krlb2" Mar 20 08:32:05 crc kubenswrapper[4971]: I0320 08:32:05.008256 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25809c5cf4bc6bfcbfe142f435400f2e3ab7ef63bcf0ae8c5b358301d2a481bf" Mar 20 08:32:05 crc kubenswrapper[4971]: I0320 08:32:05.456324 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566586-rl6nz"] Mar 20 08:32:05 crc kubenswrapper[4971]: I0320 08:32:05.468633 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566586-rl6nz"] Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.222424 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b5c84b9cc-8pkhg"] Mar 20 08:32:06 crc kubenswrapper[4971]: E0320 08:32:06.222880 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d17325d-784b-4210-972e-6a316cde36bf" containerName="oc" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.222910 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d17325d-784b-4210-972e-6a316cde36bf" containerName="oc" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.223104 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d17325d-784b-4210-972e-6a316cde36bf" containerName="oc" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.224195 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.239458 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-dns-svc\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.239513 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbskr\" (UniqueName: \"kubernetes.io/projected/a18dea62-5df6-4d36-b9c2-f9bdf096c784-kube-api-access-jbskr\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.239569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-config\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.287990 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b5c84b9cc-8pkhg"] Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.340185 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-dns-svc\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.340244 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbskr\" (UniqueName: 
\"kubernetes.io/projected/a18dea62-5df6-4d36-b9c2-f9bdf096c784-kube-api-access-jbskr\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.340292 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-config\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.341292 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-config\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.341305 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-dns-svc\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.366157 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbskr\" (UniqueName: \"kubernetes.io/projected/a18dea62-5df6-4d36-b9c2-f9bdf096c784-kube-api-access-jbskr\") pod \"dnsmasq-dns-5b5c84b9cc-8pkhg\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.547314 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.746372 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a719b13-476d-43b3-9767-91af911e2767" path="/var/lib/kubelet/pods/8a719b13-476d-43b3-9767-91af911e2767/volumes" Mar 20 08:32:06 crc kubenswrapper[4971]: I0320 08:32:06.974381 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b5c84b9cc-8pkhg"] Mar 20 08:32:07 crc kubenswrapper[4971]: I0320 08:32:07.029493 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" event={"ID":"a18dea62-5df6-4d36-b9c2-f9bdf096c784","Type":"ContainerStarted","Data":"abc960e14e140efbe8503dc76b40d3cdfef67c1365ffe48802bc23c7d97d393e"} Mar 20 08:32:07 crc kubenswrapper[4971]: I0320 08:32:07.307427 4971 scope.go:117] "RemoveContainer" containerID="21d8da55f5697fa8fd922ba33d1dbde448c1263efe8f165e85c24d1342ade57c" Mar 20 08:32:07 crc kubenswrapper[4971]: I0320 08:32:07.652999 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:32:07 crc kubenswrapper[4971]: I0320 08:32:07.748967 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:32:08 crc kubenswrapper[4971]: I0320 08:32:08.040284 4971 generic.go:334] "Generic (PLEG): container finished" podID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" containerID="dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396" exitCode=0 Mar 20 08:32:08 crc kubenswrapper[4971]: I0320 08:32:08.040334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" event={"ID":"a18dea62-5df6-4d36-b9c2-f9bdf096c784","Type":"ContainerDied","Data":"dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396"} Mar 20 08:32:09 crc kubenswrapper[4971]: I0320 08:32:09.049472 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" event={"ID":"a18dea62-5df6-4d36-b9c2-f9bdf096c784","Type":"ContainerStarted","Data":"23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4"} Mar 20 08:32:09 crc kubenswrapper[4971]: I0320 08:32:09.049812 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:09 crc kubenswrapper[4971]: I0320 08:32:09.075931 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" podStartSLOduration=3.075907307 podStartE2EDuration="3.075907307s" podCreationTimestamp="2026-03-20 08:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:32:09.070465495 +0000 UTC m=+6151.050339653" watchObservedRunningTime="2026-03-20 08:32:09.075907307 +0000 UTC m=+6151.055781465" Mar 20 08:32:09 crc kubenswrapper[4971]: I0320 08:32:09.396546 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ec11b39f-d056-4dc3-a449-612cfac34c54" containerName="rabbitmq" containerID="cri-o://4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d" gracePeriod=604799 Mar 20 08:32:09 crc kubenswrapper[4971]: I0320 08:32:09.445770 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5fe79aa1-d952-4250-8c28-171206f10655" containerName="rabbitmq" containerID="cri-o://67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e" gracePeriod=604799 Mar 20 08:32:09 crc kubenswrapper[4971]: I0320 08:32:09.552890 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5fe79aa1-d952-4250-8c28-171206f10655" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.43:5672: connect: connection refused" Mar 20 08:32:16 crc kubenswrapper[4971]: 
I0320 08:32:16.077042 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.082151 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.092855 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec11b39f-d056-4dc3-a449-612cfac34c54-pod-info\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.092925 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5l4f\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-kube-api-access-h5l4f\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.092971 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-plugins\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093020 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-plugins-conf\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093058 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frzk2\" (UniqueName: 
\"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-kube-api-access-frzk2\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093326 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093377 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-erlang-cookie\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093409 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-confd\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093454 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: "5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093462 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-confd\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093520 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-server-conf\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093598 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093687 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-plugins-conf\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093750 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-plugins\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093784 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec11b39f-d056-4dc3-a449-612cfac34c54-erlang-cookie-secret\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093819 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-erlang-cookie\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093842 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5fe79aa1-d952-4250-8c28-171206f10655-pod-info\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093867 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-server-conf\") pod \"ec11b39f-d056-4dc3-a449-612cfac34c54\" (UID: \"ec11b39f-d056-4dc3-a449-612cfac34c54\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.093886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5fe79aa1-d952-4250-8c28-171206f10655-erlang-cookie-secret\") pod \"5fe79aa1-d952-4250-8c28-171206f10655\" (UID: \"5fe79aa1-d952-4250-8c28-171206f10655\") " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.094013 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: 
"5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.094260 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.094444 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.094461 4971 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.094476 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.095061 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.095553 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: "5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.096104 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.101751 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-kube-api-access-h5l4f" (OuterVolumeSpecName: "kube-api-access-h5l4f") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: "5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "kube-api-access-h5l4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.104166 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ec11b39f-d056-4dc3-a449-612cfac34c54-pod-info" (OuterVolumeSpecName: "pod-info") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.104676 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec11b39f-d056-4dc3-a449-612cfac34c54-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.111794 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5fe79aa1-d952-4250-8c28-171206f10655-pod-info" (OuterVolumeSpecName: "pod-info") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: "5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.111939 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe79aa1-d952-4250-8c28-171206f10655-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: "5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.112274 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b" (OuterVolumeSpecName: "persistence") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: "5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "pvc-caa78e20-148d-4ea1-bb40-9915bf39399b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.121010 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-kube-api-access-frzk2" (OuterVolumeSpecName: "kube-api-access-frzk2") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "kube-api-access-frzk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.123443 4971 generic.go:334] "Generic (PLEG): container finished" podID="ec11b39f-d056-4dc3-a449-612cfac34c54" containerID="4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d" exitCode=0 Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.123692 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec11b39f-d056-4dc3-a449-612cfac34c54","Type":"ContainerDied","Data":"4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d"} Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.123819 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec11b39f-d056-4dc3-a449-612cfac34c54","Type":"ContainerDied","Data":"c51a176782ef8896011627f13991891395b1d9a214aa0f65170d34ffabc7b360"} Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.123910 4971 scope.go:117] "RemoveContainer" containerID="4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.124124 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.135363 4971 generic.go:334] "Generic (PLEG): container finished" podID="5fe79aa1-d952-4250-8c28-171206f10655" containerID="67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e" exitCode=0 Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.135411 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5fe79aa1-d952-4250-8c28-171206f10655","Type":"ContainerDied","Data":"67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e"} Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.135442 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5fe79aa1-d952-4250-8c28-171206f10655","Type":"ContainerDied","Data":"1768e0b9697d17c877bb05e84b59d6d2c123bce09a35271e6041f9411a14eacd"} Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.135472 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.136186 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d" (OuterVolumeSpecName: "persistence") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.144654 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-server-conf" (OuterVolumeSpecName: "server-conf") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: "5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.154451 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-server-conf" (OuterVolumeSpecName: "server-conf") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.167960 4971 scope.go:117] "RemoveContainer" containerID="c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.185254 4971 scope.go:117] "RemoveContainer" containerID="4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d" Mar 20 08:32:16 crc kubenswrapper[4971]: E0320 08:32:16.185641 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d\": container with ID starting with 4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d not found: ID does not exist" containerID="4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.185692 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d"} err="failed to get container status \"4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d\": rpc error: code = NotFound desc = could not find container \"4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d\": container with ID starting with 4d66387b247054006d2dae55f16125e43aac064959775d4ec95d6a9987d25b9d not found: ID does not exist" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.185717 4971 scope.go:117] 
"RemoveContainer" containerID="c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36" Mar 20 08:32:16 crc kubenswrapper[4971]: E0320 08:32:16.186103 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36\": container with ID starting with c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36 not found: ID does not exist" containerID="c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.186135 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36"} err="failed to get container status \"c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36\": rpc error: code = NotFound desc = could not find container \"c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36\": container with ID starting with c7b06ca27f810ab66264f2f8d009d19d9e2002bf3731d96bf666b9df25cd0f36 not found: ID does not exist" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.186155 4971 scope.go:117] "RemoveContainer" containerID="67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195823 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5l4f\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-kube-api-access-h5l4f\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195857 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frzk2\" (UniqueName: \"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-kube-api-access-frzk2\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195900 4971 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") on node \"crc\" " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195918 4971 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195941 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") on node \"crc\" " Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195955 4971 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5fe79aa1-d952-4250-8c28-171206f10655-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195967 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195979 4971 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec11b39f-d056-4dc3-a449-612cfac34c54-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.195992 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.196003 4971 reconciler_common.go:293] 
"Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5fe79aa1-d952-4250-8c28-171206f10655-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.196013 4971 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec11b39f-d056-4dc3-a449-612cfac34c54-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.196026 4971 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5fe79aa1-d952-4250-8c28-171206f10655-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.196037 4971 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec11b39f-d056-4dc3-a449-612cfac34c54-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.201406 4971 scope.go:117] "RemoveContainer" containerID="91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.207284 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5fe79aa1-d952-4250-8c28-171206f10655" (UID: "5fe79aa1-d952-4250-8c28-171206f10655"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.211951 4971 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.212099 4971 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d") on node "crc" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.222957 4971 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.223136 4971 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-caa78e20-148d-4ea1-bb40-9915bf39399b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b") on node "crc" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.225616 4971 scope.go:117] "RemoveContainer" containerID="67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e" Mar 20 08:32:16 crc kubenswrapper[4971]: E0320 08:32:16.225983 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e\": container with ID starting with 67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e not found: ID does not exist" containerID="67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.226015 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e"} err="failed to get container status \"67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e\": rpc error: code = NotFound desc = could not find container \"67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e\": container with ID starting with 67e5397fb41abdd0a0ac3956eaf5df8da6dabe56ac3e80ff0cd992802af0da8e 
not found: ID does not exist" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.226034 4971 scope.go:117] "RemoveContainer" containerID="91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d" Mar 20 08:32:16 crc kubenswrapper[4971]: E0320 08:32:16.228829 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d\": container with ID starting with 91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d not found: ID does not exist" containerID="91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.228866 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d"} err="failed to get container status \"91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d\": rpc error: code = NotFound desc = could not find container \"91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d\": container with ID starting with 91cd546efdaee92d89ab7721073785de3139500fa4088d4917d923224924d45d not found: ID does not exist" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.247709 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ec11b39f-d056-4dc3-a449-612cfac34c54" (UID: "ec11b39f-d056-4dc3-a449-612cfac34c54"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.297790 4971 reconciler_common.go:293] "Volume detached for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.297827 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec11b39f-d056-4dc3-a449-612cfac34c54-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.297837 4971 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5fe79aa1-d952-4250-8c28-171206f10655-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.297847 4971 reconciler_common.go:293] "Volume detached for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.451042 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.456892 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.473009 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.483450 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:32:16 crc kubenswrapper[4971]: E0320 08:32:16.484027 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5fe79aa1-d952-4250-8c28-171206f10655" containerName="setup-container" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.484107 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe79aa1-d952-4250-8c28-171206f10655" containerName="setup-container" Mar 20 08:32:16 crc kubenswrapper[4971]: E0320 08:32:16.484179 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe79aa1-d952-4250-8c28-171206f10655" containerName="rabbitmq" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.484230 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe79aa1-d952-4250-8c28-171206f10655" containerName="rabbitmq" Mar 20 08:32:16 crc kubenswrapper[4971]: E0320 08:32:16.484289 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec11b39f-d056-4dc3-a449-612cfac34c54" containerName="setup-container" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.484339 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec11b39f-d056-4dc3-a449-612cfac34c54" containerName="setup-container" Mar 20 08:32:16 crc kubenswrapper[4971]: E0320 08:32:16.484406 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec11b39f-d056-4dc3-a449-612cfac34c54" containerName="rabbitmq" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.484461 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec11b39f-d056-4dc3-a449-612cfac34c54" containerName="rabbitmq" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.484670 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec11b39f-d056-4dc3-a449-612cfac34c54" containerName="rabbitmq" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.484735 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe79aa1-d952-4250-8c28-171206f10655" containerName="rabbitmq" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.485589 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.488102 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.488636 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.489859 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5zq6x" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.489980 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.490279 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.490409 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.496733 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.524645 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.526386 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.531519 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.531829 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5952g" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.531878 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.531835 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.532027 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.543454 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.549817 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600313 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600388 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600459 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600494 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566zs\" (UniqueName: \"kubernetes.io/projected/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-kube-api-access-566zs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600548 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600654 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600686 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.600756 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.611271 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685785d49f-bbgzq"] Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.611505 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" podUID="329e0d78-d60d-4745-b5c0-0e3f699b3216" containerName="dnsmasq-dns" containerID="cri-o://5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670" gracePeriod=10 Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.702089 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.702141 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.702166 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.703872 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.703927 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.703998 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/afb0da72-3952-4b60-8cd1-00b036d211d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704071 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704107 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/afb0da72-3952-4b60-8cd1-00b036d211d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704173 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/afb0da72-3952-4b60-8cd1-00b036d211d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704225 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/afb0da72-3952-4b60-8cd1-00b036d211d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704265 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704322 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855g6\" (UniqueName: 
\"kubernetes.io/projected/afb0da72-3952-4b60-8cd1-00b036d211d4-kube-api-access-855g6\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704433 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704486 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704571 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704641 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704691 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-566zs\" (UniqueName: 
\"kubernetes.io/projected/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-kube-api-access-566zs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.704749 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.705155 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.705193 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.705256 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.705984 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.706585 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.708126 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.708162 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/419840093af8e828c53552d4afc15e982c99c820b589ef57a0e61c257053cbd7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.708127 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.708354 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.731673 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-566zs\" (UniqueName: \"kubernetes.io/projected/6e020080-54ed-44d0-ad96-8da6f1aa9bf0-kube-api-access-566zs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.738072 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77eee80e-450c-4861-a0d5-e0a93cc87c8d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e020080-54ed-44d0-ad96-8da6f1aa9bf0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.747378 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe79aa1-d952-4250-8c28-171206f10655" path="/var/lib/kubelet/pods/5fe79aa1-d952-4250-8c28-171206f10655/volumes" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.748302 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec11b39f-d056-4dc3-a449-612cfac34c54" path="/var/lib/kubelet/pods/ec11b39f-d056-4dc3-a449-612cfac34c54/volumes" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.804766 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.805468 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.805557 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.805591 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.805631 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/afb0da72-3952-4b60-8cd1-00b036d211d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.805665 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/afb0da72-3952-4b60-8cd1-00b036d211d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc 
kubenswrapper[4971]: I0320 08:32:16.805698 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/afb0da72-3952-4b60-8cd1-00b036d211d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.805723 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/afb0da72-3952-4b60-8cd1-00b036d211d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.805747 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.805772 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855g6\" (UniqueName: \"kubernetes.io/projected/afb0da72-3952-4b60-8cd1-00b036d211d4-kube-api-access-855g6\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.806760 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/afb0da72-3952-4b60-8cd1-00b036d211d4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.806870 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/afb0da72-3952-4b60-8cd1-00b036d211d4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.807752 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.808017 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.809553 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/afb0da72-3952-4b60-8cd1-00b036d211d4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.810906 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/afb0da72-3952-4b60-8cd1-00b036d211d4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.810958 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.810982 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/26823bc6c9a8531b3afbaac866627afd8118e6588c53f9fde37dcff0bcf28a4f/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.811355 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/afb0da72-3952-4b60-8cd1-00b036d211d4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.838455 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855g6\" (UniqueName: \"kubernetes.io/projected/afb0da72-3952-4b60-8cd1-00b036d211d4-kube-api-access-855g6\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:16 crc kubenswrapper[4971]: I0320 08:32:16.913528 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-caa78e20-148d-4ea1-bb40-9915bf39399b\") pod \"rabbitmq-server-0\" (UID: \"afb0da72-3952-4b60-8cd1-00b036d211d4\") " pod="openstack/rabbitmq-server-0" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.053762 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.142725 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.156138 4971 generic.go:334] "Generic (PLEG): container finished" podID="329e0d78-d60d-4745-b5c0-0e3f699b3216" containerID="5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670" exitCode=0 Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.156181 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" event={"ID":"329e0d78-d60d-4745-b5c0-0e3f699b3216","Type":"ContainerDied","Data":"5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670"} Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.156205 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" event={"ID":"329e0d78-d60d-4745-b5c0-0e3f699b3216","Type":"ContainerDied","Data":"57a4213a6b8d519af0a34b03a3fa578d6b8c764a7b8ac58ee7ee56b4fb0acaea"} Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.156221 4971 scope.go:117] "RemoveContainer" containerID="5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.156196 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685785d49f-bbgzq" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.174793 4971 scope.go:117] "RemoveContainer" containerID="b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.195268 4971 scope.go:117] "RemoveContainer" containerID="5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670" Mar 20 08:32:17 crc kubenswrapper[4971]: E0320 08:32:17.195882 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670\": container with ID starting with 5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670 not found: ID does not exist" containerID="5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.195928 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670"} err="failed to get container status \"5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670\": rpc error: code = NotFound desc = could not find container \"5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670\": container with ID starting with 5e811cd0c6e28533e2b19b9171ce7f2af8ce9aa1821baf7d743666e5a4bc8670 not found: ID does not exist" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.195955 4971 scope.go:117] "RemoveContainer" containerID="b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564" Mar 20 08:32:17 crc kubenswrapper[4971]: E0320 08:32:17.196362 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564\": container with ID starting with 
b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564 not found: ID does not exist" containerID="b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.196402 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564"} err="failed to get container status \"b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564\": rpc error: code = NotFound desc = could not find container \"b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564\": container with ID starting with b236ac3b0da61470c679a78bb72285db4dcb0fdbd80ff88535024648680f5564 not found: ID does not exist" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.212310 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-dns-svc\") pod \"329e0d78-d60d-4745-b5c0-0e3f699b3216\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.212397 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-config\") pod \"329e0d78-d60d-4745-b5c0-0e3f699b3216\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.212474 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktqmg\" (UniqueName: \"kubernetes.io/projected/329e0d78-d60d-4745-b5c0-0e3f699b3216-kube-api-access-ktqmg\") pod \"329e0d78-d60d-4745-b5c0-0e3f699b3216\" (UID: \"329e0d78-d60d-4745-b5c0-0e3f699b3216\") " Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.222800 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/329e0d78-d60d-4745-b5c0-0e3f699b3216-kube-api-access-ktqmg" (OuterVolumeSpecName: "kube-api-access-ktqmg") pod "329e0d78-d60d-4745-b5c0-0e3f699b3216" (UID: "329e0d78-d60d-4745-b5c0-0e3f699b3216"). InnerVolumeSpecName "kube-api-access-ktqmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.260368 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-config" (OuterVolumeSpecName: "config") pod "329e0d78-d60d-4745-b5c0-0e3f699b3216" (UID: "329e0d78-d60d-4745-b5c0-0e3f699b3216"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.260485 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "329e0d78-d60d-4745-b5c0-0e3f699b3216" (UID: "329e0d78-d60d-4745-b5c0-0e3f699b3216"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.287570 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.313917 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.313952 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktqmg\" (UniqueName: \"kubernetes.io/projected/329e0d78-d60d-4745-b5c0-0e3f699b3216-kube-api-access-ktqmg\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.313964 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/329e0d78-d60d-4745-b5c0-0e3f699b3216-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.494966 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685785d49f-bbgzq"] Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.505308 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685785d49f-bbgzq"] Mar 20 08:32:17 crc kubenswrapper[4971]: W0320 08:32:17.586119 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb0da72_3952_4b60_8cd1_00b036d211d4.slice/crio-e18ee16afe967f9fb7685eae1320fb024e2918ab1101d52de920228653a27839 WatchSource:0}: Error finding container e18ee16afe967f9fb7685eae1320fb024e2918ab1101d52de920228653a27839: Status 404 returned error can't find the container with id e18ee16afe967f9fb7685eae1320fb024e2918ab1101d52de920228653a27839 Mar 20 08:32:17 crc kubenswrapper[4971]: I0320 08:32:17.587979 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-0"] Mar 20 08:32:18 crc kubenswrapper[4971]: I0320 08:32:18.174978 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afb0da72-3952-4b60-8cd1-00b036d211d4","Type":"ContainerStarted","Data":"e18ee16afe967f9fb7685eae1320fb024e2918ab1101d52de920228653a27839"} Mar 20 08:32:18 crc kubenswrapper[4971]: I0320 08:32:18.176357 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e020080-54ed-44d0-ad96-8da6f1aa9bf0","Type":"ContainerStarted","Data":"1962b1eed82b4219a8ccc0e02cc117b07d10a1dde6aa89fb206b0f128fc72431"} Mar 20 08:32:18 crc kubenswrapper[4971]: I0320 08:32:18.748803 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329e0d78-d60d-4745-b5c0-0e3f699b3216" path="/var/lib/kubelet/pods/329e0d78-d60d-4745-b5c0-0e3f699b3216/volumes" Mar 20 08:32:19 crc kubenswrapper[4971]: I0320 08:32:19.186181 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afb0da72-3952-4b60-8cd1-00b036d211d4","Type":"ContainerStarted","Data":"373780e6a01e60a75068cfc00be18cf1a0a02a6a9546600150f080fb298d521d"} Mar 20 08:32:19 crc kubenswrapper[4971]: I0320 08:32:19.187809 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e020080-54ed-44d0-ad96-8da6f1aa9bf0","Type":"ContainerStarted","Data":"82b6b8b29e31762284d43ee42a806674d2b04f7c552b2c83f38e5e86817ba2c7"} Mar 20 08:32:20 crc kubenswrapper[4971]: I0320 08:32:20.161794 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:32:20 crc kubenswrapper[4971]: I0320 08:32:20.161858 4971 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:32:20 crc kubenswrapper[4971]: I0320 08:32:20.161911 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:32:20 crc kubenswrapper[4971]: I0320 08:32:20.162881 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5db1e00d6090d5bd7995513994dc958aa01f39fa2d0f251bb90827207208e2ff"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:32:20 crc kubenswrapper[4971]: I0320 08:32:20.162951 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://5db1e00d6090d5bd7995513994dc958aa01f39fa2d0f251bb90827207208e2ff" gracePeriod=600 Mar 20 08:32:21 crc kubenswrapper[4971]: I0320 08:32:21.207857 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="5db1e00d6090d5bd7995513994dc958aa01f39fa2d0f251bb90827207208e2ff" exitCode=0 Mar 20 08:32:21 crc kubenswrapper[4971]: I0320 08:32:21.207963 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"5db1e00d6090d5bd7995513994dc958aa01f39fa2d0f251bb90827207208e2ff"} Mar 20 08:32:21 crc kubenswrapper[4971]: I0320 08:32:21.209336 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a"} Mar 20 08:32:21 crc kubenswrapper[4971]: I0320 08:32:21.209458 4971 scope.go:117] "RemoveContainer" containerID="52eff19a5271a1aa4b1f0d67866346931e99423b21667041f8e58856f311aa9a" Mar 20 08:32:51 crc kubenswrapper[4971]: I0320 08:32:51.507173 4971 generic.go:334] "Generic (PLEG): container finished" podID="afb0da72-3952-4b60-8cd1-00b036d211d4" containerID="373780e6a01e60a75068cfc00be18cf1a0a02a6a9546600150f080fb298d521d" exitCode=0 Mar 20 08:32:51 crc kubenswrapper[4971]: I0320 08:32:51.507351 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afb0da72-3952-4b60-8cd1-00b036d211d4","Type":"ContainerDied","Data":"373780e6a01e60a75068cfc00be18cf1a0a02a6a9546600150f080fb298d521d"} Mar 20 08:32:51 crc kubenswrapper[4971]: I0320 08:32:51.513272 4971 generic.go:334] "Generic (PLEG): container finished" podID="6e020080-54ed-44d0-ad96-8da6f1aa9bf0" containerID="82b6b8b29e31762284d43ee42a806674d2b04f7c552b2c83f38e5e86817ba2c7" exitCode=0 Mar 20 08:32:51 crc kubenswrapper[4971]: I0320 08:32:51.513321 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e020080-54ed-44d0-ad96-8da6f1aa9bf0","Type":"ContainerDied","Data":"82b6b8b29e31762284d43ee42a806674d2b04f7c552b2c83f38e5e86817ba2c7"} Mar 20 08:32:52 crc kubenswrapper[4971]: I0320 08:32:52.525639 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"afb0da72-3952-4b60-8cd1-00b036d211d4","Type":"ContainerStarted","Data":"97da347c9d1edecc7f9eccc9dda1abfb2d21f79c9f94b24c2324cb87808fd70a"} Mar 20 08:32:52 crc kubenswrapper[4971]: I0320 08:32:52.526333 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Mar 20 08:32:52 crc kubenswrapper[4971]: I0320 08:32:52.529052 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e020080-54ed-44d0-ad96-8da6f1aa9bf0","Type":"ContainerStarted","Data":"5a72d85e39b47eccbcf0cdccef5024572c3d0c2dde021742a54f97c640b9b9e7"} Mar 20 08:32:52 crc kubenswrapper[4971]: I0320 08:32:52.529274 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:32:52 crc kubenswrapper[4971]: I0320 08:32:52.559014 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.558982999 podStartE2EDuration="36.558982999s" podCreationTimestamp="2026-03-20 08:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:32:52.54557478 +0000 UTC m=+6194.525448938" watchObservedRunningTime="2026-03-20 08:32:52.558982999 +0000 UTC m=+6194.538857187" Mar 20 08:32:52 crc kubenswrapper[4971]: I0320 08:32:52.566226 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.566204117 podStartE2EDuration="36.566204117s" podCreationTimestamp="2026-03-20 08:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:32:52.562600344 +0000 UTC m=+6194.542474492" watchObservedRunningTime="2026-03-20 08:32:52.566204117 +0000 UTC m=+6194.546078265" Mar 20 08:33:06 crc kubenswrapper[4971]: I0320 08:33:06.811886 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:33:07 crc kubenswrapper[4971]: I0320 08:33:07.145827 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 
08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.122733 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dnmnb"] Mar 20 08:33:13 crc kubenswrapper[4971]: E0320 08:33:13.123793 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329e0d78-d60d-4745-b5c0-0e3f699b3216" containerName="init" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.123815 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="329e0d78-d60d-4745-b5c0-0e3f699b3216" containerName="init" Mar 20 08:33:13 crc kubenswrapper[4971]: E0320 08:33:13.123853 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329e0d78-d60d-4745-b5c0-0e3f699b3216" containerName="dnsmasq-dns" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.123865 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="329e0d78-d60d-4745-b5c0-0e3f699b3216" containerName="dnsmasq-dns" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.124099 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="329e0d78-d60d-4745-b5c0-0e3f699b3216" containerName="dnsmasq-dns" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.125911 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.129882 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnmnb"] Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.209766 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6ms\" (UniqueName: \"kubernetes.io/projected/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-kube-api-access-qt6ms\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.209886 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-catalog-content\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.210031 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-utilities\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.311352 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-utilities\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.311405 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qt6ms\" (UniqueName: \"kubernetes.io/projected/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-kube-api-access-qt6ms\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.311441 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-catalog-content\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.311970 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-catalog-content\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.312050 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-utilities\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.329930 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6ms\" (UniqueName: \"kubernetes.io/projected/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-kube-api-access-qt6ms\") pod \"redhat-operators-dnmnb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.452049 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:13 crc kubenswrapper[4971]: I0320 08:33:13.889428 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnmnb"] Mar 20 08:33:13 crc kubenswrapper[4971]: W0320 08:33:13.895685 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9db3e69_eabf_4ef2_b443_9c4e7cc13dfb.slice/crio-751a3930125696396da91fe46e0af66d29711fdcc5b90f0fa3434db8d4f275a9 WatchSource:0}: Error finding container 751a3930125696396da91fe46e0af66d29711fdcc5b90f0fa3434db8d4f275a9: Status 404 returned error can't find the container with id 751a3930125696396da91fe46e0af66d29711fdcc5b90f0fa3434db8d4f275a9 Mar 20 08:33:14 crc kubenswrapper[4971]: I0320 08:33:14.776576 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerID="0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34" exitCode=0 Mar 20 08:33:14 crc kubenswrapper[4971]: I0320 08:33:14.776696 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnmnb" event={"ID":"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb","Type":"ContainerDied","Data":"0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34"} Mar 20 08:33:14 crc kubenswrapper[4971]: I0320 08:33:14.777044 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnmnb" event={"ID":"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb","Type":"ContainerStarted","Data":"751a3930125696396da91fe46e0af66d29711fdcc5b90f0fa3434db8d4f275a9"} Mar 20 08:33:14 crc kubenswrapper[4971]: I0320 08:33:14.779519 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:33:15 crc kubenswrapper[4971]: I0320 08:33:15.787678 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dnmnb" event={"ID":"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb","Type":"ContainerStarted","Data":"3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9"} Mar 20 08:33:16 crc kubenswrapper[4971]: I0320 08:33:16.802518 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerID="3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9" exitCode=0 Mar 20 08:33:16 crc kubenswrapper[4971]: I0320 08:33:16.802679 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnmnb" event={"ID":"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb","Type":"ContainerDied","Data":"3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9"} Mar 20 08:33:17 crc kubenswrapper[4971]: I0320 08:33:17.830975 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnmnb" event={"ID":"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb","Type":"ContainerStarted","Data":"e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293"} Mar 20 08:33:17 crc kubenswrapper[4971]: I0320 08:33:17.870816 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dnmnb" podStartSLOduration=2.416761689 podStartE2EDuration="4.870783908s" podCreationTimestamp="2026-03-20 08:33:13 +0000 UTC" firstStartedPulling="2026-03-20 08:33:14.779264377 +0000 UTC m=+6216.759138515" lastFinishedPulling="2026-03-20 08:33:17.233286556 +0000 UTC m=+6219.213160734" observedRunningTime="2026-03-20 08:33:17.865632584 +0000 UTC m=+6219.845506812" watchObservedRunningTime="2026-03-20 08:33:17.870783908 +0000 UTC m=+6219.850658086" Mar 20 08:33:19 crc kubenswrapper[4971]: I0320 08:33:19.507575 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 20 08:33:19 crc kubenswrapper[4971]: I0320 08:33:19.508461 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:33:19 crc kubenswrapper[4971]: I0320 08:33:19.511411 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cjphw" Mar 20 08:33:19 crc kubenswrapper[4971]: I0320 08:33:19.528217 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:33:19 crc kubenswrapper[4971]: I0320 08:33:19.653549 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghmzq\" (UniqueName: \"kubernetes.io/projected/fa2eb27f-18a2-4ca1-b2ab-5774ec861b60-kube-api-access-ghmzq\") pod \"mariadb-client\" (UID: \"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60\") " pod="openstack/mariadb-client" Mar 20 08:33:19 crc kubenswrapper[4971]: I0320 08:33:19.754957 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghmzq\" (UniqueName: \"kubernetes.io/projected/fa2eb27f-18a2-4ca1-b2ab-5774ec861b60-kube-api-access-ghmzq\") pod \"mariadb-client\" (UID: \"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60\") " pod="openstack/mariadb-client" Mar 20 08:33:19 crc kubenswrapper[4971]: I0320 08:33:19.773100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghmzq\" (UniqueName: \"kubernetes.io/projected/fa2eb27f-18a2-4ca1-b2ab-5774ec861b60-kube-api-access-ghmzq\") pod \"mariadb-client\" (UID: \"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60\") " pod="openstack/mariadb-client" Mar 20 08:33:19 crc kubenswrapper[4971]: I0320 08:33:19.883069 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:33:20 crc kubenswrapper[4971]: W0320 08:33:20.401523 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa2eb27f_18a2_4ca1_b2ab_5774ec861b60.slice/crio-49fa790f76abf2ed14107ccfb98302840d35acc23ec08d1c4b6134c83f4ddeab WatchSource:0}: Error finding container 49fa790f76abf2ed14107ccfb98302840d35acc23ec08d1c4b6134c83f4ddeab: Status 404 returned error can't find the container with id 49fa790f76abf2ed14107ccfb98302840d35acc23ec08d1c4b6134c83f4ddeab Mar 20 08:33:20 crc kubenswrapper[4971]: I0320 08:33:20.407176 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:33:20 crc kubenswrapper[4971]: I0320 08:33:20.859396 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60","Type":"ContainerStarted","Data":"49fa790f76abf2ed14107ccfb98302840d35acc23ec08d1c4b6134c83f4ddeab"} Mar 20 08:33:23 crc kubenswrapper[4971]: I0320 08:33:23.453126 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:23 crc kubenswrapper[4971]: I0320 08:33:23.453851 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:24 crc kubenswrapper[4971]: I0320 08:33:24.513490 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnmnb" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="registry-server" probeResult="failure" output=< Mar 20 08:33:24 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 08:33:24 crc kubenswrapper[4971]: > Mar 20 08:33:25 crc kubenswrapper[4971]: I0320 08:33:25.909782 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60","Type":"ContainerStarted","Data":"42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768"} Mar 20 08:33:25 crc kubenswrapper[4971]: I0320 08:33:25.946453 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.3775943330000002 podStartE2EDuration="6.946427467s" podCreationTimestamp="2026-03-20 08:33:19 +0000 UTC" firstStartedPulling="2026-03-20 08:33:20.40751714 +0000 UTC m=+6222.387391278" lastFinishedPulling="2026-03-20 08:33:24.976350274 +0000 UTC m=+6226.956224412" observedRunningTime="2026-03-20 08:33:25.930579254 +0000 UTC m=+6227.910453422" watchObservedRunningTime="2026-03-20 08:33:25.946427467 +0000 UTC m=+6227.926301635" Mar 20 08:33:33 crc kubenswrapper[4971]: I0320 08:33:33.508327 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:33 crc kubenswrapper[4971]: I0320 08:33:33.568123 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:33 crc kubenswrapper[4971]: I0320 08:33:33.753179 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnmnb"] Mar 20 08:33:34 crc kubenswrapper[4971]: I0320 08:33:34.988131 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dnmnb" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="registry-server" containerID="cri-o://e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293" gracePeriod=2 Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.444733 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.544473 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-catalog-content\") pod \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.544549 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-utilities\") pod \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.544678 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6ms\" (UniqueName: \"kubernetes.io/projected/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-kube-api-access-qt6ms\") pod \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\" (UID: \"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb\") " Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.545945 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-utilities" (OuterVolumeSpecName: "utilities") pod "b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" (UID: "b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.550933 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-kube-api-access-qt6ms" (OuterVolumeSpecName: "kube-api-access-qt6ms") pod "b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" (UID: "b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb"). InnerVolumeSpecName "kube-api-access-qt6ms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.646841 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.646868 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt6ms\" (UniqueName: \"kubernetes.io/projected/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-kube-api-access-qt6ms\") on node \"crc\" DevicePath \"\"" Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.689186 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" (UID: "b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:33:35 crc kubenswrapper[4971]: I0320 08:33:35.748179 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.000948 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerID="e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293" exitCode=0 Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.001023 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnmnb" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.001037 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnmnb" event={"ID":"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb","Type":"ContainerDied","Data":"e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293"} Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.001098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnmnb" event={"ID":"b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb","Type":"ContainerDied","Data":"751a3930125696396da91fe46e0af66d29711fdcc5b90f0fa3434db8d4f275a9"} Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.001137 4971 scope.go:117] "RemoveContainer" containerID="e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.032633 4971 scope.go:117] "RemoveContainer" containerID="3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.054971 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnmnb"] Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.079369 4971 scope.go:117] "RemoveContainer" containerID="0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.090050 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dnmnb"] Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.102495 4971 scope.go:117] "RemoveContainer" containerID="e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293" Mar 20 08:33:36 crc kubenswrapper[4971]: E0320 08:33:36.103100 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293\": container with ID starting with e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293 not found: ID does not exist" containerID="e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.103143 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293"} err="failed to get container status \"e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293\": rpc error: code = NotFound desc = could not find container \"e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293\": container with ID starting with e6d1a46bb257469e33738c64b7ec2ba5e89c583ba715b8efbdff484e71a5b293 not found: ID does not exist" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.103173 4971 scope.go:117] "RemoveContainer" containerID="3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9" Mar 20 08:33:36 crc kubenswrapper[4971]: E0320 08:33:36.103420 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9\": container with ID starting with 3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9 not found: ID does not exist" containerID="3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.103451 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9"} err="failed to get container status \"3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9\": rpc error: code = NotFound desc = could not find container \"3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9\": container with ID 
starting with 3636f4ea152a874743313338ce4a2da332d4f2113fa8fcfb62a0c0e7713fb2b9 not found: ID does not exist" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.103470 4971 scope.go:117] "RemoveContainer" containerID="0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34" Mar 20 08:33:36 crc kubenswrapper[4971]: E0320 08:33:36.103815 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34\": container with ID starting with 0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34 not found: ID does not exist" containerID="0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.103849 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34"} err="failed to get container status \"0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34\": rpc error: code = NotFound desc = could not find container \"0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34\": container with ID starting with 0724aa1ac5d0435571afaa2a83ebf6fbd3bbd1e454e294ec99cf660006389c34 not found: ID does not exist" Mar 20 08:33:36 crc kubenswrapper[4971]: I0320 08:33:36.752114 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" path="/var/lib/kubelet/pods/b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb/volumes" Mar 20 08:33:38 crc kubenswrapper[4971]: I0320 08:33:38.318360 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:33:38 crc kubenswrapper[4971]: I0320 08:33:38.318729 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="fa2eb27f-18a2-4ca1-b2ab-5774ec861b60" containerName="mariadb-client" 
containerID="cri-o://42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768" gracePeriod=30 Mar 20 08:33:38 crc kubenswrapper[4971]: I0320 08:33:38.800795 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:33:38 crc kubenswrapper[4971]: I0320 08:33:38.906897 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghmzq\" (UniqueName: \"kubernetes.io/projected/fa2eb27f-18a2-4ca1-b2ab-5774ec861b60-kube-api-access-ghmzq\") pod \"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60\" (UID: \"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60\") " Mar 20 08:33:38 crc kubenswrapper[4971]: I0320 08:33:38.914538 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2eb27f-18a2-4ca1-b2ab-5774ec861b60-kube-api-access-ghmzq" (OuterVolumeSpecName: "kube-api-access-ghmzq") pod "fa2eb27f-18a2-4ca1-b2ab-5774ec861b60" (UID: "fa2eb27f-18a2-4ca1-b2ab-5774ec861b60"). InnerVolumeSpecName "kube-api-access-ghmzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.008767 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghmzq\" (UniqueName: \"kubernetes.io/projected/fa2eb27f-18a2-4ca1-b2ab-5774ec861b60-kube-api-access-ghmzq\") on node \"crc\" DevicePath \"\"" Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.032373 4971 generic.go:334] "Generic (PLEG): container finished" podID="fa2eb27f-18a2-4ca1-b2ab-5774ec861b60" containerID="42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768" exitCode=143 Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.032451 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.032439 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60","Type":"ContainerDied","Data":"42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768"} Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.032525 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fa2eb27f-18a2-4ca1-b2ab-5774ec861b60","Type":"ContainerDied","Data":"49fa790f76abf2ed14107ccfb98302840d35acc23ec08d1c4b6134c83f4ddeab"} Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.032556 4971 scope.go:117] "RemoveContainer" containerID="42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768" Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.058694 4971 scope.go:117] "RemoveContainer" containerID="42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768" Mar 20 08:33:39 crc kubenswrapper[4971]: E0320 08:33:39.059593 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768\": container with ID starting with 42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768 not found: ID does not exist" containerID="42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768" Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.059667 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768"} err="failed to get container status \"42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768\": rpc error: code = NotFound desc = could not find container \"42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768\": container with ID starting with 
42e9089e02e173fb2f876dcfa4412c485e37d12c660fac36b609f229c3f8c768 not found: ID does not exist" Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.090853 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:33:39 crc kubenswrapper[4971]: I0320 08:33:39.100942 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:33:40 crc kubenswrapper[4971]: I0320 08:33:40.747854 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2eb27f-18a2-4ca1-b2ab-5774ec861b60" path="/var/lib/kubelet/pods/fa2eb27f-18a2-4ca1-b2ab-5774ec861b60/volumes" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.195998 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566594-vngj2"] Mar 20 08:34:00 crc kubenswrapper[4971]: E0320 08:34:00.196948 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="extract-content" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.196963 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="extract-content" Mar 20 08:34:00 crc kubenswrapper[4971]: E0320 08:34:00.196972 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="extract-utilities" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.196979 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="extract-utilities" Mar 20 08:34:00 crc kubenswrapper[4971]: E0320 08:34:00.196996 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2eb27f-18a2-4ca1-b2ab-5774ec861b60" containerName="mariadb-client" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.197002 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2eb27f-18a2-4ca1-b2ab-5774ec861b60" 
containerName="mariadb-client" Mar 20 08:34:00 crc kubenswrapper[4971]: E0320 08:34:00.197010 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="registry-server" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.197015 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="registry-server" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.197159 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9db3e69-eabf-4ef2-b443-9c4e7cc13dfb" containerName="registry-server" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.197172 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2eb27f-18a2-4ca1-b2ab-5774ec861b60" containerName="mariadb-client" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.197738 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566594-vngj2" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.199804 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.200229 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.200723 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.209935 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566594-vngj2"] Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.278300 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfvvh\" (UniqueName: 
\"kubernetes.io/projected/1a73b98e-da63-4329-8f91-4d738e5db372-kube-api-access-jfvvh\") pod \"auto-csr-approver-29566594-vngj2\" (UID: \"1a73b98e-da63-4329-8f91-4d738e5db372\") " pod="openshift-infra/auto-csr-approver-29566594-vngj2" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.380868 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfvvh\" (UniqueName: \"kubernetes.io/projected/1a73b98e-da63-4329-8f91-4d738e5db372-kube-api-access-jfvvh\") pod \"auto-csr-approver-29566594-vngj2\" (UID: \"1a73b98e-da63-4329-8f91-4d738e5db372\") " pod="openshift-infra/auto-csr-approver-29566594-vngj2" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.409150 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfvvh\" (UniqueName: \"kubernetes.io/projected/1a73b98e-da63-4329-8f91-4d738e5db372-kube-api-access-jfvvh\") pod \"auto-csr-approver-29566594-vngj2\" (UID: \"1a73b98e-da63-4329-8f91-4d738e5db372\") " pod="openshift-infra/auto-csr-approver-29566594-vngj2" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.523668 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566594-vngj2" Mar 20 08:34:00 crc kubenswrapper[4971]: I0320 08:34:00.788295 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566594-vngj2"] Mar 20 08:34:01 crc kubenswrapper[4971]: I0320 08:34:01.273624 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566594-vngj2" event={"ID":"1a73b98e-da63-4329-8f91-4d738e5db372","Type":"ContainerStarted","Data":"5967cc54467da7c33905ef100ca27daa9f693eab39b1ed0d265e612b23379e76"} Mar 20 08:34:02 crc kubenswrapper[4971]: I0320 08:34:02.308543 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566594-vngj2" event={"ID":"1a73b98e-da63-4329-8f91-4d738e5db372","Type":"ContainerStarted","Data":"3ad228a363a2b47097004f14163ed12acc3431b2ab00dd03f596237bcf95f9b8"} Mar 20 08:34:03 crc kubenswrapper[4971]: I0320 08:34:03.319578 4971 generic.go:334] "Generic (PLEG): container finished" podID="1a73b98e-da63-4329-8f91-4d738e5db372" containerID="3ad228a363a2b47097004f14163ed12acc3431b2ab00dd03f596237bcf95f9b8" exitCode=0 Mar 20 08:34:03 crc kubenswrapper[4971]: I0320 08:34:03.319711 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566594-vngj2" event={"ID":"1a73b98e-da63-4329-8f91-4d738e5db372","Type":"ContainerDied","Data":"3ad228a363a2b47097004f14163ed12acc3431b2ab00dd03f596237bcf95f9b8"} Mar 20 08:34:03 crc kubenswrapper[4971]: I0320 08:34:03.631619 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566594-vngj2" Mar 20 08:34:03 crc kubenswrapper[4971]: I0320 08:34:03.740285 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfvvh\" (UniqueName: \"kubernetes.io/projected/1a73b98e-da63-4329-8f91-4d738e5db372-kube-api-access-jfvvh\") pod \"1a73b98e-da63-4329-8f91-4d738e5db372\" (UID: \"1a73b98e-da63-4329-8f91-4d738e5db372\") " Mar 20 08:34:03 crc kubenswrapper[4971]: I0320 08:34:03.746510 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a73b98e-da63-4329-8f91-4d738e5db372-kube-api-access-jfvvh" (OuterVolumeSpecName: "kube-api-access-jfvvh") pod "1a73b98e-da63-4329-8f91-4d738e5db372" (UID: "1a73b98e-da63-4329-8f91-4d738e5db372"). InnerVolumeSpecName "kube-api-access-jfvvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:34:03 crc kubenswrapper[4971]: I0320 08:34:03.843825 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfvvh\" (UniqueName: \"kubernetes.io/projected/1a73b98e-da63-4329-8f91-4d738e5db372-kube-api-access-jfvvh\") on node \"crc\" DevicePath \"\"" Mar 20 08:34:04 crc kubenswrapper[4971]: I0320 08:34:04.329517 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566594-vngj2" event={"ID":"1a73b98e-da63-4329-8f91-4d738e5db372","Type":"ContainerDied","Data":"5967cc54467da7c33905ef100ca27daa9f693eab39b1ed0d265e612b23379e76"} Mar 20 08:34:04 crc kubenswrapper[4971]: I0320 08:34:04.329573 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5967cc54467da7c33905ef100ca27daa9f693eab39b1ed0d265e612b23379e76" Mar 20 08:34:04 crc kubenswrapper[4971]: I0320 08:34:04.329585 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566594-vngj2" Mar 20 08:34:04 crc kubenswrapper[4971]: I0320 08:34:04.689092 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566588-7lwms"] Mar 20 08:34:04 crc kubenswrapper[4971]: I0320 08:34:04.694131 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566588-7lwms"] Mar 20 08:34:04 crc kubenswrapper[4971]: I0320 08:34:04.741903 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3778330e-8385-4e2a-b656-4732211092e9" path="/var/lib/kubelet/pods/3778330e-8385-4e2a-b656-4732211092e9/volumes" Mar 20 08:34:07 crc kubenswrapper[4971]: I0320 08:34:07.480760 4971 scope.go:117] "RemoveContainer" containerID="280ee8e5cedbf543b0b1daeb5a987937280ff54c27acd7d14971531accde0228" Mar 20 08:34:20 crc kubenswrapper[4971]: I0320 08:34:20.162364 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:34:20 crc kubenswrapper[4971]: I0320 08:34:20.163091 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:34:50 crc kubenswrapper[4971]: I0320 08:34:50.162828 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:34:50 crc kubenswrapper[4971]: 
I0320 08:34:50.163857 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:35:07 crc kubenswrapper[4971]: I0320 08:35:07.620429 4971 scope.go:117] "RemoveContainer" containerID="c954218d1c7c7dd2ebaeabfdaac186297c2cf4fb71004111ff4aedf09d9c2c92" Mar 20 08:35:20 crc kubenswrapper[4971]: I0320 08:35:20.162867 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:35:20 crc kubenswrapper[4971]: I0320 08:35:20.163805 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:35:20 crc kubenswrapper[4971]: I0320 08:35:20.163903 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:35:20 crc kubenswrapper[4971]: I0320 08:35:20.165152 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:35:20 crc kubenswrapper[4971]: I0320 08:35:20.165273 4971 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" gracePeriod=600 Mar 20 08:35:20 crc kubenswrapper[4971]: E0320 08:35:20.314053 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:35:21 crc kubenswrapper[4971]: I0320 08:35:21.143898 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" exitCode=0 Mar 20 08:35:21 crc kubenswrapper[4971]: I0320 08:35:21.143981 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a"} Mar 20 08:35:21 crc kubenswrapper[4971]: I0320 08:35:21.144058 4971 scope.go:117] "RemoveContainer" containerID="5db1e00d6090d5bd7995513994dc958aa01f39fa2d0f251bb90827207208e2ff" Mar 20 08:35:21 crc kubenswrapper[4971]: I0320 08:35:21.145285 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:35:21 crc kubenswrapper[4971]: E0320 08:35:21.146030 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:35:32 crc kubenswrapper[4971]: I0320 08:35:32.952505 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxr76"] Mar 20 08:35:32 crc kubenswrapper[4971]: E0320 08:35:32.953395 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a73b98e-da63-4329-8f91-4d738e5db372" containerName="oc" Mar 20 08:35:32 crc kubenswrapper[4971]: I0320 08:35:32.953411 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a73b98e-da63-4329-8f91-4d738e5db372" containerName="oc" Mar 20 08:35:32 crc kubenswrapper[4971]: I0320 08:35:32.953577 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a73b98e-da63-4329-8f91-4d738e5db372" containerName="oc" Mar 20 08:35:32 crc kubenswrapper[4971]: I0320 08:35:32.955588 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:32 crc kubenswrapper[4971]: I0320 08:35:32.968000 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxr76"] Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.070357 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-utilities\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.070752 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cxg\" (UniqueName: \"kubernetes.io/projected/365aedcd-c028-4346-b780-6d4dc0083b16-kube-api-access-25cxg\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.070894 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-catalog-content\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.172209 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-catalog-content\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.172281 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-utilities\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.172368 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cxg\" (UniqueName: \"kubernetes.io/projected/365aedcd-c028-4346-b780-6d4dc0083b16-kube-api-access-25cxg\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.172744 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-catalog-content\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.172904 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-utilities\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.191122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cxg\" (UniqueName: \"kubernetes.io/projected/365aedcd-c028-4346-b780-6d4dc0083b16-kube-api-access-25cxg\") pod \"certified-operators-sxr76\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.293710 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.732028 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:35:33 crc kubenswrapper[4971]: E0320 08:35:33.732564 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:35:33 crc kubenswrapper[4971]: I0320 08:35:33.755367 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxr76"] Mar 20 08:35:34 crc kubenswrapper[4971]: I0320 08:35:34.282757 4971 generic.go:334] "Generic (PLEG): container finished" podID="365aedcd-c028-4346-b780-6d4dc0083b16" containerID="664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7" exitCode=0 Mar 20 08:35:34 crc kubenswrapper[4971]: I0320 08:35:34.283074 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxr76" event={"ID":"365aedcd-c028-4346-b780-6d4dc0083b16","Type":"ContainerDied","Data":"664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7"} Mar 20 08:35:34 crc kubenswrapper[4971]: I0320 08:35:34.283173 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxr76" event={"ID":"365aedcd-c028-4346-b780-6d4dc0083b16","Type":"ContainerStarted","Data":"1dbcbede40fcc22a3f74bd1a4b98a047cdad0510ede486a1a376a3c4e55609fc"} Mar 20 08:35:35 crc kubenswrapper[4971]: I0320 08:35:35.293654 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxr76" 
event={"ID":"365aedcd-c028-4346-b780-6d4dc0083b16","Type":"ContainerStarted","Data":"2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f"} Mar 20 08:35:36 crc kubenswrapper[4971]: I0320 08:35:36.309095 4971 generic.go:334] "Generic (PLEG): container finished" podID="365aedcd-c028-4346-b780-6d4dc0083b16" containerID="2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f" exitCode=0 Mar 20 08:35:36 crc kubenswrapper[4971]: I0320 08:35:36.309164 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxr76" event={"ID":"365aedcd-c028-4346-b780-6d4dc0083b16","Type":"ContainerDied","Data":"2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f"} Mar 20 08:35:37 crc kubenswrapper[4971]: I0320 08:35:37.321577 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxr76" event={"ID":"365aedcd-c028-4346-b780-6d4dc0083b16","Type":"ContainerStarted","Data":"e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55"} Mar 20 08:35:37 crc kubenswrapper[4971]: I0320 08:35:37.355373 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sxr76" podStartSLOduration=2.928538275 podStartE2EDuration="5.355352061s" podCreationTimestamp="2026-03-20 08:35:32 +0000 UTC" firstStartedPulling="2026-03-20 08:35:34.284471714 +0000 UTC m=+6356.264345852" lastFinishedPulling="2026-03-20 08:35:36.71128547 +0000 UTC m=+6358.691159638" observedRunningTime="2026-03-20 08:35:37.350090394 +0000 UTC m=+6359.329964582" watchObservedRunningTime="2026-03-20 08:35:37.355352061 +0000 UTC m=+6359.335226209" Mar 20 08:35:43 crc kubenswrapper[4971]: I0320 08:35:43.294863 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:43 crc kubenswrapper[4971]: I0320 08:35:43.295326 4971 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:43 crc kubenswrapper[4971]: I0320 08:35:43.335742 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:43 crc kubenswrapper[4971]: I0320 08:35:43.409206 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:44 crc kubenswrapper[4971]: I0320 08:35:44.709834 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxr76"] Mar 20 08:35:45 crc kubenswrapper[4971]: I0320 08:35:45.394099 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxr76" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" containerName="registry-server" containerID="cri-o://e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55" gracePeriod=2 Mar 20 08:35:45 crc kubenswrapper[4971]: I0320 08:35:45.841369 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:45 crc kubenswrapper[4971]: I0320 08:35:45.972627 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25cxg\" (UniqueName: \"kubernetes.io/projected/365aedcd-c028-4346-b780-6d4dc0083b16-kube-api-access-25cxg\") pod \"365aedcd-c028-4346-b780-6d4dc0083b16\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " Mar 20 08:35:45 crc kubenswrapper[4971]: I0320 08:35:45.972723 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-utilities\") pod \"365aedcd-c028-4346-b780-6d4dc0083b16\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " Mar 20 08:35:45 crc kubenswrapper[4971]: I0320 08:35:45.972760 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-catalog-content\") pod \"365aedcd-c028-4346-b780-6d4dc0083b16\" (UID: \"365aedcd-c028-4346-b780-6d4dc0083b16\") " Mar 20 08:35:45 crc kubenswrapper[4971]: I0320 08:35:45.978434 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-utilities" (OuterVolumeSpecName: "utilities") pod "365aedcd-c028-4346-b780-6d4dc0083b16" (UID: "365aedcd-c028-4346-b780-6d4dc0083b16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:35:45 crc kubenswrapper[4971]: I0320 08:35:45.981579 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365aedcd-c028-4346-b780-6d4dc0083b16-kube-api-access-25cxg" (OuterVolumeSpecName: "kube-api-access-25cxg") pod "365aedcd-c028-4346-b780-6d4dc0083b16" (UID: "365aedcd-c028-4346-b780-6d4dc0083b16"). InnerVolumeSpecName "kube-api-access-25cxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.074412 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25cxg\" (UniqueName: \"kubernetes.io/projected/365aedcd-c028-4346-b780-6d4dc0083b16-kube-api-access-25cxg\") on node \"crc\" DevicePath \"\"" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.074444 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.403278 4971 generic.go:334] "Generic (PLEG): container finished" podID="365aedcd-c028-4346-b780-6d4dc0083b16" containerID="e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55" exitCode=0 Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.403369 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxr76" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.403387 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxr76" event={"ID":"365aedcd-c028-4346-b780-6d4dc0083b16","Type":"ContainerDied","Data":"e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55"} Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.403435 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxr76" event={"ID":"365aedcd-c028-4346-b780-6d4dc0083b16","Type":"ContainerDied","Data":"1dbcbede40fcc22a3f74bd1a4b98a047cdad0510ede486a1a376a3c4e55609fc"} Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.403456 4971 scope.go:117] "RemoveContainer" containerID="e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.425737 4971 scope.go:117] "RemoveContainer" 
containerID="2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.448475 4971 scope.go:117] "RemoveContainer" containerID="664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.475913 4971 scope.go:117] "RemoveContainer" containerID="e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55" Mar 20 08:35:46 crc kubenswrapper[4971]: E0320 08:35:46.476473 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55\": container with ID starting with e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55 not found: ID does not exist" containerID="e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.476539 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55"} err="failed to get container status \"e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55\": rpc error: code = NotFound desc = could not find container \"e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55\": container with ID starting with e12006db9f8812c2822434483b620d9ebd767534ac451a5725e81f48aef29f55 not found: ID does not exist" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.476574 4971 scope.go:117] "RemoveContainer" containerID="2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f" Mar 20 08:35:46 crc kubenswrapper[4971]: E0320 08:35:46.477157 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f\": container with ID starting with 
2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f not found: ID does not exist" containerID="2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.477210 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f"} err="failed to get container status \"2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f\": rpc error: code = NotFound desc = could not find container \"2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f\": container with ID starting with 2c9657ebc4d821b5c720e0546bf7516b893f561f5360bb97f5bef5d5e187b59f not found: ID does not exist" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.477249 4971 scope.go:117] "RemoveContainer" containerID="664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7" Mar 20 08:35:46 crc kubenswrapper[4971]: E0320 08:35:46.477739 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7\": container with ID starting with 664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7 not found: ID does not exist" containerID="664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.477783 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7"} err="failed to get container status \"664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7\": rpc error: code = NotFound desc = could not find container \"664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7\": container with ID starting with 664b21bfa65bb8c99785d5faeaea3f39a67b556ced85052ae242643705b345f7 not found: ID does not 
exist" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.531869 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "365aedcd-c028-4346-b780-6d4dc0083b16" (UID: "365aedcd-c028-4346-b780-6d4dc0083b16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.581506 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/365aedcd-c028-4346-b780-6d4dc0083b16-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.749965 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxr76"] Mar 20 08:35:46 crc kubenswrapper[4971]: I0320 08:35:46.754632 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxr76"] Mar 20 08:35:48 crc kubenswrapper[4971]: I0320 08:35:48.742003 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:35:48 crc kubenswrapper[4971]: E0320 08:35:48.742967 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:35:48 crc kubenswrapper[4971]: I0320 08:35:48.748508 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" path="/var/lib/kubelet/pods/365aedcd-c028-4346-b780-6d4dc0083b16/volumes" Mar 20 
08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.153910 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566596-lmvzn"] Mar 20 08:36:00 crc kubenswrapper[4971]: E0320 08:36:00.155055 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" containerName="extract-content" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.155077 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" containerName="extract-content" Mar 20 08:36:00 crc kubenswrapper[4971]: E0320 08:36:00.155094 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" containerName="registry-server" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.155104 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" containerName="registry-server" Mar 20 08:36:00 crc kubenswrapper[4971]: E0320 08:36:00.155152 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" containerName="extract-utilities" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.155162 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" containerName="extract-utilities" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.155382 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="365aedcd-c028-4346-b780-6d4dc0083b16" containerName="registry-server" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.156250 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566596-lmvzn" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.160620 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.160851 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.165872 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.172438 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566596-lmvzn"] Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.317311 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/fb47c47c-73d5-457d-8245-5de3fa7be20a-kube-api-access-vdpwj\") pod \"auto-csr-approver-29566596-lmvzn\" (UID: \"fb47c47c-73d5-457d-8245-5de3fa7be20a\") " pod="openshift-infra/auto-csr-approver-29566596-lmvzn" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.419309 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/fb47c47c-73d5-457d-8245-5de3fa7be20a-kube-api-access-vdpwj\") pod \"auto-csr-approver-29566596-lmvzn\" (UID: \"fb47c47c-73d5-457d-8245-5de3fa7be20a\") " pod="openshift-infra/auto-csr-approver-29566596-lmvzn" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.441582 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/fb47c47c-73d5-457d-8245-5de3fa7be20a-kube-api-access-vdpwj\") pod \"auto-csr-approver-29566596-lmvzn\" (UID: \"fb47c47c-73d5-457d-8245-5de3fa7be20a\") " 
pod="openshift-infra/auto-csr-approver-29566596-lmvzn" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.497467 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566596-lmvzn" Mar 20 08:36:00 crc kubenswrapper[4971]: I0320 08:36:00.732722 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:36:00 crc kubenswrapper[4971]: E0320 08:36:00.733219 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:36:01 crc kubenswrapper[4971]: I0320 08:36:01.001368 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566596-lmvzn"] Mar 20 08:36:01 crc kubenswrapper[4971]: W0320 08:36:01.009937 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb47c47c_73d5_457d_8245_5de3fa7be20a.slice/crio-aa3781e14fe6b29d55f2b3c4c5e4f96e98003a17b81babb5b1b3f9ff6540e927 WatchSource:0}: Error finding container aa3781e14fe6b29d55f2b3c4c5e4f96e98003a17b81babb5b1b3f9ff6540e927: Status 404 returned error can't find the container with id aa3781e14fe6b29d55f2b3c4c5e4f96e98003a17b81babb5b1b3f9ff6540e927 Mar 20 08:36:01 crc kubenswrapper[4971]: I0320 08:36:01.551355 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566596-lmvzn" event={"ID":"fb47c47c-73d5-457d-8245-5de3fa7be20a","Type":"ContainerStarted","Data":"aa3781e14fe6b29d55f2b3c4c5e4f96e98003a17b81babb5b1b3f9ff6540e927"} Mar 20 08:36:02 crc 
kubenswrapper[4971]: I0320 08:36:02.563352 4971 generic.go:334] "Generic (PLEG): container finished" podID="fb47c47c-73d5-457d-8245-5de3fa7be20a" containerID="e6fb634d45ca95abff9c227bdf3b54764512fa09969316ca92b53a0fe71949af" exitCode=0 Mar 20 08:36:02 crc kubenswrapper[4971]: I0320 08:36:02.563502 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566596-lmvzn" event={"ID":"fb47c47c-73d5-457d-8245-5de3fa7be20a","Type":"ContainerDied","Data":"e6fb634d45ca95abff9c227bdf3b54764512fa09969316ca92b53a0fe71949af"} Mar 20 08:36:04 crc kubenswrapper[4971]: I0320 08:36:04.042056 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566596-lmvzn" Mar 20 08:36:04 crc kubenswrapper[4971]: I0320 08:36:04.210108 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/fb47c47c-73d5-457d-8245-5de3fa7be20a-kube-api-access-vdpwj\") pod \"fb47c47c-73d5-457d-8245-5de3fa7be20a\" (UID: \"fb47c47c-73d5-457d-8245-5de3fa7be20a\") " Mar 20 08:36:04 crc kubenswrapper[4971]: I0320 08:36:04.216770 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb47c47c-73d5-457d-8245-5de3fa7be20a-kube-api-access-vdpwj" (OuterVolumeSpecName: "kube-api-access-vdpwj") pod "fb47c47c-73d5-457d-8245-5de3fa7be20a" (UID: "fb47c47c-73d5-457d-8245-5de3fa7be20a"). InnerVolumeSpecName "kube-api-access-vdpwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:36:04 crc kubenswrapper[4971]: I0320 08:36:04.311995 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/fb47c47c-73d5-457d-8245-5de3fa7be20a-kube-api-access-vdpwj\") on node \"crc\" DevicePath \"\"" Mar 20 08:36:04 crc kubenswrapper[4971]: I0320 08:36:04.592640 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566596-lmvzn" event={"ID":"fb47c47c-73d5-457d-8245-5de3fa7be20a","Type":"ContainerDied","Data":"aa3781e14fe6b29d55f2b3c4c5e4f96e98003a17b81babb5b1b3f9ff6540e927"} Mar 20 08:36:04 crc kubenswrapper[4971]: I0320 08:36:04.592696 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa3781e14fe6b29d55f2b3c4c5e4f96e98003a17b81babb5b1b3f9ff6540e927" Mar 20 08:36:04 crc kubenswrapper[4971]: I0320 08:36:04.592799 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566596-lmvzn" Mar 20 08:36:05 crc kubenswrapper[4971]: I0320 08:36:05.118535 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566590-c2225"] Mar 20 08:36:05 crc kubenswrapper[4971]: I0320 08:36:05.123885 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566590-c2225"] Mar 20 08:36:06 crc kubenswrapper[4971]: I0320 08:36:06.749127 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c80702-40ac-409b-bf0c-3c5aa898b765" path="/var/lib/kubelet/pods/99c80702-40ac-409b-bf0c-3c5aa898b765/volumes" Mar 20 08:36:07 crc kubenswrapper[4971]: I0320 08:36:07.675096 4971 scope.go:117] "RemoveContainer" containerID="c978236199e76b4c5286d3e39c75dfefda4150e144cbcff70f8628f6d431a40e" Mar 20 08:36:15 crc kubenswrapper[4971]: I0320 08:36:15.732714 4971 scope.go:117] "RemoveContainer" 
containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:36:15 crc kubenswrapper[4971]: E0320 08:36:15.733801 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:36:28 crc kubenswrapper[4971]: I0320 08:36:28.736642 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:36:28 crc kubenswrapper[4971]: E0320 08:36:28.739382 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:36:42 crc kubenswrapper[4971]: I0320 08:36:42.732487 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:36:42 crc kubenswrapper[4971]: E0320 08:36:42.733423 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:36:57 crc kubenswrapper[4971]: I0320 08:36:57.732327 4971 scope.go:117] 
"RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:36:57 crc kubenswrapper[4971]: E0320 08:36:57.735154 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:37:08 crc kubenswrapper[4971]: I0320 08:37:08.738883 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:37:08 crc kubenswrapper[4971]: E0320 08:37:08.739809 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:37:22 crc kubenswrapper[4971]: I0320 08:37:22.732462 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:37:22 crc kubenswrapper[4971]: E0320 08:37:22.733261 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:37:34 crc kubenswrapper[4971]: I0320 08:37:34.732977 
4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:37:34 crc kubenswrapper[4971]: E0320 08:37:34.733821 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:37:47 crc kubenswrapper[4971]: I0320 08:37:47.732963 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:37:47 crc kubenswrapper[4971]: E0320 08:37:47.734035 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.160326 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566598-78wqb"] Mar 20 08:38:00 crc kubenswrapper[4971]: E0320 08:38:00.162119 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb47c47c-73d5-457d-8245-5de3fa7be20a" containerName="oc" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.162155 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb47c47c-73d5-457d-8245-5de3fa7be20a" containerName="oc" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.162471 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb47c47c-73d5-457d-8245-5de3fa7be20a" 
containerName="oc" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.163592 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566598-78wqb" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.166354 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.169091 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.169092 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.176421 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566598-78wqb"] Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.251295 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vs7g\" (UniqueName: \"kubernetes.io/projected/3285aa80-885b-4a96-9fbf-0170c06603fb-kube-api-access-8vs7g\") pod \"auto-csr-approver-29566598-78wqb\" (UID: \"3285aa80-885b-4a96-9fbf-0170c06603fb\") " pod="openshift-infra/auto-csr-approver-29566598-78wqb" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.354283 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vs7g\" (UniqueName: \"kubernetes.io/projected/3285aa80-885b-4a96-9fbf-0170c06603fb-kube-api-access-8vs7g\") pod \"auto-csr-approver-29566598-78wqb\" (UID: \"3285aa80-885b-4a96-9fbf-0170c06603fb\") " pod="openshift-infra/auto-csr-approver-29566598-78wqb" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.384013 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vs7g\" (UniqueName: 
\"kubernetes.io/projected/3285aa80-885b-4a96-9fbf-0170c06603fb-kube-api-access-8vs7g\") pod \"auto-csr-approver-29566598-78wqb\" (UID: \"3285aa80-885b-4a96-9fbf-0170c06603fb\") " pod="openshift-infra/auto-csr-approver-29566598-78wqb" Mar 20 08:38:00 crc kubenswrapper[4971]: I0320 08:38:00.486215 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566598-78wqb" Mar 20 08:38:01 crc kubenswrapper[4971]: I0320 08:38:01.032642 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566598-78wqb"] Mar 20 08:38:01 crc kubenswrapper[4971]: I0320 08:38:01.728287 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566598-78wqb" event={"ID":"3285aa80-885b-4a96-9fbf-0170c06603fb","Type":"ContainerStarted","Data":"89a059f96bc9238db5f21ca60f952351aba01c9bbe1f5575236ca9c3930c158f"} Mar 20 08:38:02 crc kubenswrapper[4971]: I0320 08:38:02.738984 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:38:02 crc kubenswrapper[4971]: E0320 08:38:02.740558 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:38:02 crc kubenswrapper[4971]: I0320 08:38:02.749803 4971 generic.go:334] "Generic (PLEG): container finished" podID="3285aa80-885b-4a96-9fbf-0170c06603fb" containerID="5c83360796f0a8705c327dab6ab3d4dd597ad4f696432c85ced93266b6b43ed0" exitCode=0 Mar 20 08:38:02 crc kubenswrapper[4971]: I0320 08:38:02.778244 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566598-78wqb" event={"ID":"3285aa80-885b-4a96-9fbf-0170c06603fb","Type":"ContainerDied","Data":"5c83360796f0a8705c327dab6ab3d4dd597ad4f696432c85ced93266b6b43ed0"} Mar 20 08:38:04 crc kubenswrapper[4971]: I0320 08:38:04.094593 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566598-78wqb" Mar 20 08:38:04 crc kubenswrapper[4971]: I0320 08:38:04.146656 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vs7g\" (UniqueName: \"kubernetes.io/projected/3285aa80-885b-4a96-9fbf-0170c06603fb-kube-api-access-8vs7g\") pod \"3285aa80-885b-4a96-9fbf-0170c06603fb\" (UID: \"3285aa80-885b-4a96-9fbf-0170c06603fb\") " Mar 20 08:38:04 crc kubenswrapper[4971]: I0320 08:38:04.154121 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3285aa80-885b-4a96-9fbf-0170c06603fb-kube-api-access-8vs7g" (OuterVolumeSpecName: "kube-api-access-8vs7g") pod "3285aa80-885b-4a96-9fbf-0170c06603fb" (UID: "3285aa80-885b-4a96-9fbf-0170c06603fb"). InnerVolumeSpecName "kube-api-access-8vs7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:38:04 crc kubenswrapper[4971]: I0320 08:38:04.249496 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vs7g\" (UniqueName: \"kubernetes.io/projected/3285aa80-885b-4a96-9fbf-0170c06603fb-kube-api-access-8vs7g\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:04 crc kubenswrapper[4971]: I0320 08:38:04.771715 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566598-78wqb" event={"ID":"3285aa80-885b-4a96-9fbf-0170c06603fb","Type":"ContainerDied","Data":"89a059f96bc9238db5f21ca60f952351aba01c9bbe1f5575236ca9c3930c158f"} Mar 20 08:38:04 crc kubenswrapper[4971]: I0320 08:38:04.772084 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89a059f96bc9238db5f21ca60f952351aba01c9bbe1f5575236ca9c3930c158f" Mar 20 08:38:04 crc kubenswrapper[4971]: I0320 08:38:04.771816 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566598-78wqb" Mar 20 08:38:05 crc kubenswrapper[4971]: I0320 08:38:05.197248 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566592-krlb2"] Mar 20 08:38:05 crc kubenswrapper[4971]: I0320 08:38:05.205721 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566592-krlb2"] Mar 20 08:38:06 crc kubenswrapper[4971]: I0320 08:38:06.750096 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d17325d-784b-4210-972e-6a316cde36bf" path="/var/lib/kubelet/pods/6d17325d-784b-4210-972e-6a316cde36bf/volumes" Mar 20 08:38:07 crc kubenswrapper[4971]: I0320 08:38:07.784499 4971 scope.go:117] "RemoveContainer" containerID="5b7a13b93e9dd5097aa677624986d0c9699612613c96286245b0b1109ea7ea33" Mar 20 08:38:07 crc kubenswrapper[4971]: I0320 08:38:07.817076 4971 scope.go:117] "RemoveContainer" 
containerID="8c197e06198fba4a8cecb639ffba5bfc7134b40f91c810531590446812db478d" Mar 20 08:38:07 crc kubenswrapper[4971]: I0320 08:38:07.870811 4971 scope.go:117] "RemoveContainer" containerID="5911248b91fe8356aa7640d293e826b285aa76c1ea5012003162f4cb1e9e79ff" Mar 20 08:38:07 crc kubenswrapper[4971]: I0320 08:38:07.925596 4971 scope.go:117] "RemoveContainer" containerID="5ba84c28ac39a925135f8568223e99dd097788b82af19c0b707fe26aea468adb" Mar 20 08:38:07 crc kubenswrapper[4971]: I0320 08:38:07.955386 4971 scope.go:117] "RemoveContainer" containerID="a3317d43c773393c90034edfe3cd67415c27e14fe9be62a0da9c45a67d69d291" Mar 20 08:38:15 crc kubenswrapper[4971]: I0320 08:38:15.731937 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:38:15 crc kubenswrapper[4971]: E0320 08:38:15.732661 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:38:30 crc kubenswrapper[4971]: I0320 08:38:30.733783 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:38:30 crc kubenswrapper[4971]: E0320 08:38:30.734433 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:38:42 crc 
kubenswrapper[4971]: I0320 08:38:42.733960 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:38:42 crc kubenswrapper[4971]: E0320 08:38:42.735539 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:38:56 crc kubenswrapper[4971]: I0320 08:38:56.733032 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:38:56 crc kubenswrapper[4971]: E0320 08:38:56.734032 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:39:07 crc kubenswrapper[4971]: I0320 08:39:07.732815 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:39:07 crc kubenswrapper[4971]: E0320 08:39:07.733336 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 
20 08:39:20 crc kubenswrapper[4971]: I0320 08:39:20.732237 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:39:20 crc kubenswrapper[4971]: E0320 08:39:20.733339 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:39:31 crc kubenswrapper[4971]: I0320 08:39:31.733070 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:39:31 crc kubenswrapper[4971]: E0320 08:39:31.733860 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:39:42 crc kubenswrapper[4971]: I0320 08:39:42.732924 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:39:42 crc kubenswrapper[4971]: E0320 08:39:42.733757 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:39:53 crc kubenswrapper[4971]: I0320 08:39:53.732963 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a"
Mar 20 08:39:53 crc kubenswrapper[4971]: E0320 08:39:53.734023 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.161918 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566600-qznhs"]
Mar 20 08:40:00 crc kubenswrapper[4971]: E0320 08:40:00.163375 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3285aa80-885b-4a96-9fbf-0170c06603fb" containerName="oc"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.163410 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3285aa80-885b-4a96-9fbf-0170c06603fb" containerName="oc"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.163962 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3285aa80-885b-4a96-9fbf-0170c06603fb" containerName="oc"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.164920 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566600-qznhs"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.168917 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.169750 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.170098 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.187476 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566600-qznhs"]
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.313097 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f8nf\" (UniqueName: \"kubernetes.io/projected/c063d1e0-0975-4b8c-8253-e781c924a051-kube-api-access-4f8nf\") pod \"auto-csr-approver-29566600-qznhs\" (UID: \"c063d1e0-0975-4b8c-8253-e781c924a051\") " pod="openshift-infra/auto-csr-approver-29566600-qznhs"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.415230 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f8nf\" (UniqueName: \"kubernetes.io/projected/c063d1e0-0975-4b8c-8253-e781c924a051-kube-api-access-4f8nf\") pod \"auto-csr-approver-29566600-qznhs\" (UID: \"c063d1e0-0975-4b8c-8253-e781c924a051\") " pod="openshift-infra/auto-csr-approver-29566600-qznhs"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.449273 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f8nf\" (UniqueName: \"kubernetes.io/projected/c063d1e0-0975-4b8c-8253-e781c924a051-kube-api-access-4f8nf\") pod \"auto-csr-approver-29566600-qznhs\" (UID: \"c063d1e0-0975-4b8c-8253-e781c924a051\") " pod="openshift-infra/auto-csr-approver-29566600-qznhs"
Mar 20 08:40:00 crc kubenswrapper[4971]: I0320 08:40:00.499339 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566600-qznhs"
Mar 20 08:40:01 crc kubenswrapper[4971]: I0320 08:40:01.008026 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566600-qznhs"]
Mar 20 08:40:01 crc kubenswrapper[4971]: I0320 08:40:01.023422 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 08:40:01 crc kubenswrapper[4971]: I0320 08:40:01.875545 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566600-qznhs" event={"ID":"c063d1e0-0975-4b8c-8253-e781c924a051","Type":"ContainerStarted","Data":"2d15db796137293cf7f18995b9641ac13d3a260ec1e3dd56bc6241d82de2e8d6"}
Mar 20 08:40:02 crc kubenswrapper[4971]: I0320 08:40:02.883056 4971 generic.go:334] "Generic (PLEG): container finished" podID="c063d1e0-0975-4b8c-8253-e781c924a051" containerID="f1010db1b06c8c861d7f18dbdb2b3b614e3337b4fb463f609c45de2fd9a5e7a7" exitCode=0
Mar 20 08:40:02 crc kubenswrapper[4971]: I0320 08:40:02.883131 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566600-qznhs" event={"ID":"c063d1e0-0975-4b8c-8253-e781c924a051","Type":"ContainerDied","Data":"f1010db1b06c8c861d7f18dbdb2b3b614e3337b4fb463f609c45de2fd9a5e7a7"}
Mar 20 08:40:04 crc kubenswrapper[4971]: I0320 08:40:04.257917 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566600-qznhs"
Mar 20 08:40:04 crc kubenswrapper[4971]: I0320 08:40:04.374783 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f8nf\" (UniqueName: \"kubernetes.io/projected/c063d1e0-0975-4b8c-8253-e781c924a051-kube-api-access-4f8nf\") pod \"c063d1e0-0975-4b8c-8253-e781c924a051\" (UID: \"c063d1e0-0975-4b8c-8253-e781c924a051\") "
Mar 20 08:40:04 crc kubenswrapper[4971]: I0320 08:40:04.383895 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c063d1e0-0975-4b8c-8253-e781c924a051-kube-api-access-4f8nf" (OuterVolumeSpecName: "kube-api-access-4f8nf") pod "c063d1e0-0975-4b8c-8253-e781c924a051" (UID: "c063d1e0-0975-4b8c-8253-e781c924a051"). InnerVolumeSpecName "kube-api-access-4f8nf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:40:04 crc kubenswrapper[4971]: I0320 08:40:04.477120 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f8nf\" (UniqueName: \"kubernetes.io/projected/c063d1e0-0975-4b8c-8253-e781c924a051-kube-api-access-4f8nf\") on node \"crc\" DevicePath \"\""
Mar 20 08:40:04 crc kubenswrapper[4971]: I0320 08:40:04.732376 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a"
Mar 20 08:40:04 crc kubenswrapper[4971]: E0320 08:40:04.733013 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:40:04 crc kubenswrapper[4971]: I0320 08:40:04.907992 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566600-qznhs" event={"ID":"c063d1e0-0975-4b8c-8253-e781c924a051","Type":"ContainerDied","Data":"2d15db796137293cf7f18995b9641ac13d3a260ec1e3dd56bc6241d82de2e8d6"}
Mar 20 08:40:04 crc kubenswrapper[4971]: I0320 08:40:04.908039 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d15db796137293cf7f18995b9641ac13d3a260ec1e3dd56bc6241d82de2e8d6"
Mar 20 08:40:04 crc kubenswrapper[4971]: I0320 08:40:04.908061 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566600-qznhs"
Mar 20 08:40:05 crc kubenswrapper[4971]: I0320 08:40:05.342900 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566594-vngj2"]
Mar 20 08:40:05 crc kubenswrapper[4971]: I0320 08:40:05.349831 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566594-vngj2"]
Mar 20 08:40:06 crc kubenswrapper[4971]: I0320 08:40:06.743238 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a73b98e-da63-4329-8f91-4d738e5db372" path="/var/lib/kubelet/pods/1a73b98e-da63-4329-8f91-4d738e5db372/volumes"
Mar 20 08:40:08 crc kubenswrapper[4971]: I0320 08:40:08.054354 4971 scope.go:117] "RemoveContainer" containerID="3ad228a363a2b47097004f14163ed12acc3431b2ab00dd03f596237bcf95f9b8"
Mar 20 08:40:17 crc kubenswrapper[4971]: I0320 08:40:17.732438 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a"
Mar 20 08:40:17 crc kubenswrapper[4971]: E0320 08:40:17.733098 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:40:30 crc kubenswrapper[4971]: I0320 08:40:30.732289 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a"
Mar 20 08:40:31 crc kubenswrapper[4971]: I0320 08:40:31.135146 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"7a173279568ba15985897ecc9daca378d7d89e3775153dfc334adc490c1852dc"}
Mar 20 08:40:50 crc kubenswrapper[4971]: I0320 08:40:50.894691 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6cvxn"]
Mar 20 08:40:50 crc kubenswrapper[4971]: E0320 08:40:50.896021 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c063d1e0-0975-4b8c-8253-e781c924a051" containerName="oc"
Mar 20 08:40:50 crc kubenswrapper[4971]: I0320 08:40:50.896039 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c063d1e0-0975-4b8c-8253-e781c924a051" containerName="oc"
Mar 20 08:40:50 crc kubenswrapper[4971]: I0320 08:40:50.896421 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c063d1e0-0975-4b8c-8253-e781c924a051" containerName="oc"
Mar 20 08:40:50 crc kubenswrapper[4971]: I0320 08:40:50.899115 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:50 crc kubenswrapper[4971]: I0320 08:40:50.921033 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cvxn"]
Mar 20 08:40:50 crc kubenswrapper[4971]: I0320 08:40:50.998291 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-utilities\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:50 crc kubenswrapper[4971]: I0320 08:40:50.998659 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-catalog-content\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:50 crc kubenswrapper[4971]: I0320 08:40:50.998697 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l442r\" (UniqueName: \"kubernetes.io/projected/cd43bc66-414f-4664-8a56-97812fa4ad17-kube-api-access-l442r\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:51 crc kubenswrapper[4971]: I0320 08:40:51.100269 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-catalog-content\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:51 crc kubenswrapper[4971]: I0320 08:40:51.100330 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l442r\" (UniqueName: \"kubernetes.io/projected/cd43bc66-414f-4664-8a56-97812fa4ad17-kube-api-access-l442r\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:51 crc kubenswrapper[4971]: I0320 08:40:51.100392 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-utilities\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:51 crc kubenswrapper[4971]: I0320 08:40:51.100876 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-catalog-content\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:51 crc kubenswrapper[4971]: I0320 08:40:51.100958 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-utilities\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:51 crc kubenswrapper[4971]: I0320 08:40:51.119636 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l442r\" (UniqueName: \"kubernetes.io/projected/cd43bc66-414f-4664-8a56-97812fa4ad17-kube-api-access-l442r\") pod \"community-operators-6cvxn\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") " pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:51 crc kubenswrapper[4971]: I0320 08:40:51.234547 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:40:51 crc kubenswrapper[4971]: I0320 08:40:51.734066 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cvxn"]
Mar 20 08:40:52 crc kubenswrapper[4971]: I0320 08:40:52.323824 4971 generic.go:334] "Generic (PLEG): container finished" podID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerID="d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d" exitCode=0
Mar 20 08:40:52 crc kubenswrapper[4971]: I0320 08:40:52.323939 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvxn" event={"ID":"cd43bc66-414f-4664-8a56-97812fa4ad17","Type":"ContainerDied","Data":"d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d"}
Mar 20 08:40:52 crc kubenswrapper[4971]: I0320 08:40:52.324131 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvxn" event={"ID":"cd43bc66-414f-4664-8a56-97812fa4ad17","Type":"ContainerStarted","Data":"bd94088f8d482ddb86cef71771d1e195bb02f72c25600c5c6516302b893e129e"}
Mar 20 08:40:53 crc kubenswrapper[4971]: I0320 08:40:53.337470 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvxn" event={"ID":"cd43bc66-414f-4664-8a56-97812fa4ad17","Type":"ContainerStarted","Data":"b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16"}
Mar 20 08:40:54 crc kubenswrapper[4971]: I0320 08:40:54.348931 4971 generic.go:334] "Generic (PLEG): container finished" podID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerID="b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16" exitCode=0
Mar 20 08:40:54 crc kubenswrapper[4971]: I0320 08:40:54.348978 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvxn" event={"ID":"cd43bc66-414f-4664-8a56-97812fa4ad17","Type":"ContainerDied","Data":"b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16"}
Mar 20 08:40:55 crc kubenswrapper[4971]: I0320 08:40:55.361027 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvxn" event={"ID":"cd43bc66-414f-4664-8a56-97812fa4ad17","Type":"ContainerStarted","Data":"9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720"}
Mar 20 08:40:55 crc kubenswrapper[4971]: I0320 08:40:55.412521 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6cvxn" podStartSLOduration=2.9451897540000003 podStartE2EDuration="5.41249241s" podCreationTimestamp="2026-03-20 08:40:50 +0000 UTC" firstStartedPulling="2026-03-20 08:40:52.326571649 +0000 UTC m=+6674.306445787" lastFinishedPulling="2026-03-20 08:40:54.793874305 +0000 UTC m=+6676.773748443" observedRunningTime="2026-03-20 08:40:55.407913291 +0000 UTC m=+6677.387787469" watchObservedRunningTime="2026-03-20 08:40:55.41249241 +0000 UTC m=+6677.392366578"
Mar 20 08:41:01 crc kubenswrapper[4971]: I0320 08:41:01.234740 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:41:01 crc kubenswrapper[4971]: I0320 08:41:01.235847 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:41:01 crc kubenswrapper[4971]: I0320 08:41:01.276627 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:41:01 crc kubenswrapper[4971]: I0320 08:41:01.453377 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:41:01 crc kubenswrapper[4971]: I0320 08:41:01.510843 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6cvxn"]
Mar 20 08:41:03 crc kubenswrapper[4971]: I0320 08:41:03.429279 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6cvxn" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerName="registry-server" containerID="cri-o://9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720" gracePeriod=2
Mar 20 08:41:03 crc kubenswrapper[4971]: I0320 08:41:03.867055 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.023804 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-catalog-content\") pod \"cd43bc66-414f-4664-8a56-97812fa4ad17\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") "
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.023886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-utilities\") pod \"cd43bc66-414f-4664-8a56-97812fa4ad17\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") "
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.024067 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l442r\" (UniqueName: \"kubernetes.io/projected/cd43bc66-414f-4664-8a56-97812fa4ad17-kube-api-access-l442r\") pod \"cd43bc66-414f-4664-8a56-97812fa4ad17\" (UID: \"cd43bc66-414f-4664-8a56-97812fa4ad17\") "
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.025552 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-utilities" (OuterVolumeSpecName: "utilities") pod "cd43bc66-414f-4664-8a56-97812fa4ad17" (UID: "cd43bc66-414f-4664-8a56-97812fa4ad17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.034256 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd43bc66-414f-4664-8a56-97812fa4ad17-kube-api-access-l442r" (OuterVolumeSpecName: "kube-api-access-l442r") pod "cd43bc66-414f-4664-8a56-97812fa4ad17" (UID: "cd43bc66-414f-4664-8a56-97812fa4ad17"). InnerVolumeSpecName "kube-api-access-l442r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.092827 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd43bc66-414f-4664-8a56-97812fa4ad17" (UID: "cd43bc66-414f-4664-8a56-97812fa4ad17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.127889 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.127947 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l442r\" (UniqueName: \"kubernetes.io/projected/cd43bc66-414f-4664-8a56-97812fa4ad17-kube-api-access-l442r\") on node \"crc\" DevicePath \"\""
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.127967 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd43bc66-414f-4664-8a56-97812fa4ad17-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.457502 4971 generic.go:334] "Generic (PLEG): container finished" podID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerID="9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720" exitCode=0
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.457574 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvxn" event={"ID":"cd43bc66-414f-4664-8a56-97812fa4ad17","Type":"ContainerDied","Data":"9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720"}
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.457656 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cvxn"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.457680 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cvxn" event={"ID":"cd43bc66-414f-4664-8a56-97812fa4ad17","Type":"ContainerDied","Data":"bd94088f8d482ddb86cef71771d1e195bb02f72c25600c5c6516302b893e129e"}
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.457712 4971 scope.go:117] "RemoveContainer" containerID="9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.493562 4971 scope.go:117] "RemoveContainer" containerID="b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.515901 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6cvxn"]
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.526287 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6cvxn"]
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.538718 4971 scope.go:117] "RemoveContainer" containerID="d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.573419 4971 scope.go:117] "RemoveContainer" containerID="9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720"
Mar 20 08:41:04 crc kubenswrapper[4971]: E0320 08:41:04.573878 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720\": container with ID starting with 9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720 not found: ID does not exist" containerID="9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.573924 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720"} err="failed to get container status \"9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720\": rpc error: code = NotFound desc = could not find container \"9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720\": container with ID starting with 9f0f20f8f5d5d2d48eaf5d328f6fb11741be04729bc488eb26dd7f31526fe720 not found: ID does not exist"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.573954 4971 scope.go:117] "RemoveContainer" containerID="b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16"
Mar 20 08:41:04 crc kubenswrapper[4971]: E0320 08:41:04.574279 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16\": container with ID starting with b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16 not found: ID does not exist" containerID="b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.574305 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16"} err="failed to get container status \"b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16\": rpc error: code = NotFound desc = could not find container \"b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16\": container with ID starting with b1753b8e1ef78e838679a1f5d87e7834e5b4d764fe8886abfd2346d726645f16 not found: ID does not exist"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.574327 4971 scope.go:117] "RemoveContainer" containerID="d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d"
Mar 20 08:41:04 crc kubenswrapper[4971]: E0320 08:41:04.574914 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d\": container with ID starting with d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d not found: ID does not exist" containerID="d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.574968 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d"} err="failed to get container status \"d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d\": rpc error: code = NotFound desc = could not find container \"d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d\": container with ID starting with d6c725ccb89984e6c53e885edfb675693d52555974a8738bae0b24642956375d not found: ID does not exist"
Mar 20 08:41:04 crc kubenswrapper[4971]: I0320 08:41:04.747796 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" path="/var/lib/kubelet/pods/cd43bc66-414f-4664-8a56-97812fa4ad17/volumes"
Mar 20 08:41:44 crc kubenswrapper[4971]: I0320 08:41:44.078405 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rm8gn"]
Mar 20 08:41:44 crc kubenswrapper[4971]: I0320 08:41:44.085705 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rm8gn"]
Mar 20 08:41:44 crc kubenswrapper[4971]: I0320 08:41:44.743209 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8095ddfe-bc8a-4d45-aaa3-431ade832c0f" path="/var/lib/kubelet/pods/8095ddfe-bc8a-4d45-aaa3-431ade832c0f/volumes"
Mar 20 08:41:50 crc kubenswrapper[4971]: I0320 08:41:50.989459 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bb25l"]
Mar 20 08:41:50 crc kubenswrapper[4971]: E0320 08:41:50.990286 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerName="registry-server"
Mar 20 08:41:50 crc kubenswrapper[4971]: I0320 08:41:50.990306 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerName="registry-server"
Mar 20 08:41:50 crc kubenswrapper[4971]: E0320 08:41:50.990343 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerName="extract-utilities"
Mar 20 08:41:50 crc kubenswrapper[4971]: I0320 08:41:50.990351 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerName="extract-utilities"
Mar 20 08:41:50 crc kubenswrapper[4971]: E0320 08:41:50.990370 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerName="extract-content"
Mar 20 08:41:50 crc kubenswrapper[4971]: I0320 08:41:50.990378 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerName="extract-content"
Mar 20 08:41:50 crc kubenswrapper[4971]: I0320 08:41:50.990550 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd43bc66-414f-4664-8a56-97812fa4ad17" containerName="registry-server"
Mar 20 08:41:50 crc kubenswrapper[4971]: I0320 08:41:50.992024 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.002730 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb25l"]
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.051343 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfpfd\" (UniqueName: \"kubernetes.io/projected/52e00eae-d49f-442b-be26-317eaa68bcf3-kube-api-access-zfpfd\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.051400 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-catalog-content\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.051451 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-utilities\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.153646 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfpfd\" (UniqueName: \"kubernetes.io/projected/52e00eae-d49f-442b-be26-317eaa68bcf3-kube-api-access-zfpfd\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.153699 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-catalog-content\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.153753 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-utilities\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.154329 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-catalog-content\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.154389 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-utilities\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.176594 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfpfd\" (UniqueName: \"kubernetes.io/projected/52e00eae-d49f-442b-be26-317eaa68bcf3-kube-api-access-zfpfd\") pod \"redhat-marketplace-bb25l\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.319440 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bb25l"
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.749043 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb25l"]
Mar 20 08:41:51 crc kubenswrapper[4971]: I0320 08:41:51.887949 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb25l" event={"ID":"52e00eae-d49f-442b-be26-317eaa68bcf3","Type":"ContainerStarted","Data":"a052e5037b5a6a17979428e62095cdbd308b355ff4cf28992183c3cc68986cbc"}
Mar 20 08:41:52 crc kubenswrapper[4971]: I0320 08:41:52.897536 4971 generic.go:334] "Generic (PLEG): container finished" podID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerID="9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740" exitCode=0
Mar 20 08:41:52 crc kubenswrapper[4971]: I0320 08:41:52.897638 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb25l" event={"ID":"52e00eae-d49f-442b-be26-317eaa68bcf3","Type":"ContainerDied","Data":"9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740"}
Mar 20 08:41:53 crc kubenswrapper[4971]: I0320 08:41:53.906943 4971 generic.go:334] "Generic (PLEG): container finished" podID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerID="228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d" exitCode=0
Mar 20 08:41:53 crc kubenswrapper[4971]: I0320 08:41:53.906978 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb25l" event={"ID":"52e00eae-d49f-442b-be26-317eaa68bcf3","Type":"ContainerDied","Data":"228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d"}
Mar 20 08:41:54 crc kubenswrapper[4971]: I0320 08:41:54.920064 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb25l" event={"ID":"52e00eae-d49f-442b-be26-317eaa68bcf3","Type":"ContainerStarted","Data":"5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde"}
Mar 20 08:41:54 crc kubenswrapper[4971]: I0320 08:41:54.955191 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bb25l" podStartSLOduration=3.4884075660000002 podStartE2EDuration="4.955159208s" podCreationTimestamp="2026-03-20 08:41:50 +0000 UTC" firstStartedPulling="2026-03-20 08:41:52.899440285 +0000 UTC m=+6734.879314433" lastFinishedPulling="2026-03-20 08:41:54.366191897 +0000 UTC m=+6736.346066075" observedRunningTime="2026-03-20 08:41:54.945160196 +0000 UTC m=+6736.925034334" watchObservedRunningTime="2026-03-20 08:41:54.955159208 +0000 UTC m=+6736.935033386"
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.172214 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566602-27sdp"]
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.175070 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566602-27sdp"
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.178860 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.179392 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.183076 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.186803 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566602-27sdp"]
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.307686 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvr59\" (UniqueName: \"kubernetes.io/projected/b023bc7b-9694-43ec-a1e0-9b14fa80c9f0-kube-api-access-vvr59\") pod \"auto-csr-approver-29566602-27sdp\" (UID: \"b023bc7b-9694-43ec-a1e0-9b14fa80c9f0\") " pod="openshift-infra/auto-csr-approver-29566602-27sdp"
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.409120 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvr59\" (UniqueName: \"kubernetes.io/projected/b023bc7b-9694-43ec-a1e0-9b14fa80c9f0-kube-api-access-vvr59\") pod \"auto-csr-approver-29566602-27sdp\" (UID: \"b023bc7b-9694-43ec-a1e0-9b14fa80c9f0\") " pod="openshift-infra/auto-csr-approver-29566602-27sdp"
Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.447416 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvr59\" (UniqueName: \"kubernetes.io/projected/b023bc7b-9694-43ec-a1e0-9b14fa80c9f0-kube-api-access-vvr59\") pod \"auto-csr-approver-29566602-27sdp\" (UID: \"b023bc7b-9694-43ec-a1e0-9b14fa80c9f0\") "
pod="openshift-infra/auto-csr-approver-29566602-27sdp" Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.504777 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566602-27sdp" Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.815053 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566602-27sdp"] Mar 20 08:42:00 crc kubenswrapper[4971]: I0320 08:42:00.980865 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566602-27sdp" event={"ID":"b023bc7b-9694-43ec-a1e0-9b14fa80c9f0","Type":"ContainerStarted","Data":"cc89b13566ceaee7ed0b4e5ceb54eb12c9b20c2f08aae7eafcf04c8776f9dd5e"} Mar 20 08:42:01 crc kubenswrapper[4971]: I0320 08:42:01.320046 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bb25l" Mar 20 08:42:01 crc kubenswrapper[4971]: I0320 08:42:01.320176 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bb25l" Mar 20 08:42:01 crc kubenswrapper[4971]: I0320 08:42:01.383375 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bb25l" Mar 20 08:42:02 crc kubenswrapper[4971]: I0320 08:42:02.048188 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bb25l" Mar 20 08:42:02 crc kubenswrapper[4971]: I0320 08:42:02.104193 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb25l"] Mar 20 08:42:03 crc kubenswrapper[4971]: I0320 08:42:03.003260 4971 generic.go:334] "Generic (PLEG): container finished" podID="b023bc7b-9694-43ec-a1e0-9b14fa80c9f0" containerID="22b69961abb997f506af552eebfe41f579ce5b873527cb69be50cc4de260c0c4" exitCode=0 Mar 20 08:42:03 crc kubenswrapper[4971]: I0320 08:42:03.003459 
4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566602-27sdp" event={"ID":"b023bc7b-9694-43ec-a1e0-9b14fa80c9f0","Type":"ContainerDied","Data":"22b69961abb997f506af552eebfe41f579ce5b873527cb69be50cc4de260c0c4"} Mar 20 08:42:04 crc kubenswrapper[4971]: I0320 08:42:04.015051 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bb25l" podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerName="registry-server" containerID="cri-o://5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde" gracePeriod=2 Mar 20 08:42:04 crc kubenswrapper[4971]: I0320 08:42:04.364455 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566602-27sdp" Mar 20 08:42:04 crc kubenswrapper[4971]: I0320 08:42:04.480371 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvr59\" (UniqueName: \"kubernetes.io/projected/b023bc7b-9694-43ec-a1e0-9b14fa80c9f0-kube-api-access-vvr59\") pod \"b023bc7b-9694-43ec-a1e0-9b14fa80c9f0\" (UID: \"b023bc7b-9694-43ec-a1e0-9b14fa80c9f0\") " Mar 20 08:42:04 crc kubenswrapper[4971]: I0320 08:42:04.488084 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b023bc7b-9694-43ec-a1e0-9b14fa80c9f0-kube-api-access-vvr59" (OuterVolumeSpecName: "kube-api-access-vvr59") pod "b023bc7b-9694-43ec-a1e0-9b14fa80c9f0" (UID: "b023bc7b-9694-43ec-a1e0-9b14fa80c9f0"). InnerVolumeSpecName "kube-api-access-vvr59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:42:04 crc kubenswrapper[4971]: I0320 08:42:04.583003 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvr59\" (UniqueName: \"kubernetes.io/projected/b023bc7b-9694-43ec-a1e0-9b14fa80c9f0-kube-api-access-vvr59\") on node \"crc\" DevicePath \"\"" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.010475 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bb25l" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.031779 4971 generic.go:334] "Generic (PLEG): container finished" podID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerID="5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde" exitCode=0 Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.031871 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb25l" event={"ID":"52e00eae-d49f-442b-be26-317eaa68bcf3","Type":"ContainerDied","Data":"5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde"} Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.031898 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb25l" event={"ID":"52e00eae-d49f-442b-be26-317eaa68bcf3","Type":"ContainerDied","Data":"a052e5037b5a6a17979428e62095cdbd308b355ff4cf28992183c3cc68986cbc"} Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.032563 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bb25l" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.032654 4971 scope.go:117] "RemoveContainer" containerID="5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.037856 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566602-27sdp" event={"ID":"b023bc7b-9694-43ec-a1e0-9b14fa80c9f0","Type":"ContainerDied","Data":"cc89b13566ceaee7ed0b4e5ceb54eb12c9b20c2f08aae7eafcf04c8776f9dd5e"} Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.037889 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc89b13566ceaee7ed0b4e5ceb54eb12c9b20c2f08aae7eafcf04c8776f9dd5e" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.037951 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566602-27sdp" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.054090 4971 scope.go:117] "RemoveContainer" containerID="228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.069756 4971 scope.go:117] "RemoveContainer" containerID="9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.083441 4971 scope.go:117] "RemoveContainer" containerID="5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde" Mar 20 08:42:05 crc kubenswrapper[4971]: E0320 08:42:05.083791 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde\": container with ID starting with 5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde not found: ID does not exist" containerID="5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde" 
Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.083821 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde"} err="failed to get container status \"5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde\": rpc error: code = NotFound desc = could not find container \"5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde\": container with ID starting with 5663515acb9a70ac3bc17adce8afcd5c0ac23773da8697976f81acf300ff2fde not found: ID does not exist" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.083842 4971 scope.go:117] "RemoveContainer" containerID="228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d" Mar 20 08:42:05 crc kubenswrapper[4971]: E0320 08:42:05.084082 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d\": container with ID starting with 228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d not found: ID does not exist" containerID="228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.084105 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d"} err="failed to get container status \"228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d\": rpc error: code = NotFound desc = could not find container \"228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d\": container with ID starting with 228eca0b3287de4cb511ba28b0405e29177dcfec970fd4ef7e051ecaf7b0227d not found: ID does not exist" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.084120 4971 scope.go:117] "RemoveContainer" 
containerID="9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740" Mar 20 08:42:05 crc kubenswrapper[4971]: E0320 08:42:05.084353 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740\": container with ID starting with 9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740 not found: ID does not exist" containerID="9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.084369 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740"} err="failed to get container status \"9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740\": rpc error: code = NotFound desc = could not find container \"9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740\": container with ID starting with 9a00c111ffb2c605a419fc77f62b00bb04a252219a9af0a3454d9c4134188740 not found: ID does not exist" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.201094 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-utilities\") pod \"52e00eae-d49f-442b-be26-317eaa68bcf3\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.201158 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-catalog-content\") pod \"52e00eae-d49f-442b-be26-317eaa68bcf3\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.201267 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zfpfd\" (UniqueName: \"kubernetes.io/projected/52e00eae-d49f-442b-be26-317eaa68bcf3-kube-api-access-zfpfd\") pod \"52e00eae-d49f-442b-be26-317eaa68bcf3\" (UID: \"52e00eae-d49f-442b-be26-317eaa68bcf3\") " Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.202590 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-utilities" (OuterVolumeSpecName: "utilities") pod "52e00eae-d49f-442b-be26-317eaa68bcf3" (UID: "52e00eae-d49f-442b-be26-317eaa68bcf3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.208941 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e00eae-d49f-442b-be26-317eaa68bcf3-kube-api-access-zfpfd" (OuterVolumeSpecName: "kube-api-access-zfpfd") pod "52e00eae-d49f-442b-be26-317eaa68bcf3" (UID: "52e00eae-d49f-442b-be26-317eaa68bcf3"). InnerVolumeSpecName "kube-api-access-zfpfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.238060 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52e00eae-d49f-442b-be26-317eaa68bcf3" (UID: "52e00eae-d49f-442b-be26-317eaa68bcf3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.304406 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfpfd\" (UniqueName: \"kubernetes.io/projected/52e00eae-d49f-442b-be26-317eaa68bcf3-kube-api-access-zfpfd\") on node \"crc\" DevicePath \"\"" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.304447 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.304463 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52e00eae-d49f-442b-be26-317eaa68bcf3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.374659 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb25l"] Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.382653 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb25l"] Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.436937 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566596-lmvzn"] Mar 20 08:42:05 crc kubenswrapper[4971]: I0320 08:42:05.444623 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566596-lmvzn"] Mar 20 08:42:06 crc kubenswrapper[4971]: I0320 08:42:06.744171 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" path="/var/lib/kubelet/pods/52e00eae-d49f-442b-be26-317eaa68bcf3/volumes" Mar 20 08:42:06 crc kubenswrapper[4971]: I0320 08:42:06.745826 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb47c47c-73d5-457d-8245-5de3fa7be20a" 
path="/var/lib/kubelet/pods/fb47c47c-73d5-457d-8245-5de3fa7be20a/volumes" Mar 20 08:42:08 crc kubenswrapper[4971]: I0320 08:42:08.201246 4971 scope.go:117] "RemoveContainer" containerID="e6fb634d45ca95abff9c227bdf3b54764512fa09969316ca92b53a0fe71949af" Mar 20 08:42:08 crc kubenswrapper[4971]: I0320 08:42:08.252105 4971 scope.go:117] "RemoveContainer" containerID="409f257bdfc44c3a17cce22d85603115082b3572de83288a02998d605af2af75" Mar 20 08:42:50 crc kubenswrapper[4971]: I0320 08:42:50.162721 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:42:50 crc kubenswrapper[4971]: I0320 08:42:50.163830 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:43:20 crc kubenswrapper[4971]: I0320 08:43:20.162908 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:43:20 crc kubenswrapper[4971]: I0320 08:43:20.163511 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.189436 4971 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 20 08:43:43 crc kubenswrapper[4971]: E0320 08:43:43.190379 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b023bc7b-9694-43ec-a1e0-9b14fa80c9f0" containerName="oc" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.190397 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b023bc7b-9694-43ec-a1e0-9b14fa80c9f0" containerName="oc" Mar 20 08:43:43 crc kubenswrapper[4971]: E0320 08:43:43.190415 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerName="registry-server" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.190423 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerName="registry-server" Mar 20 08:43:43 crc kubenswrapper[4971]: E0320 08:43:43.190435 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerName="extract-content" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.190444 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerName="extract-content" Mar 20 08:43:43 crc kubenswrapper[4971]: E0320 08:43:43.190468 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerName="extract-utilities" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.190476 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerName="extract-utilities" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.190645 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b023bc7b-9694-43ec-a1e0-9b14fa80c9f0" containerName="oc" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.190671 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="52e00eae-d49f-442b-be26-317eaa68bcf3" containerName="registry-server" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.191411 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.193985 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cjphw" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.206960 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.362730 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\") pod \"mariadb-copy-data\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") " pod="openstack/mariadb-copy-data" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.362912 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmgd\" (UniqueName: \"kubernetes.io/projected/e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12-kube-api-access-2cmgd\") pod \"mariadb-copy-data\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") " pod="openstack/mariadb-copy-data" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.464796 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\") pod \"mariadb-copy-data\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") " pod="openstack/mariadb-copy-data" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.464883 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmgd\" (UniqueName: 
\"kubernetes.io/projected/e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12-kube-api-access-2cmgd\") pod \"mariadb-copy-data\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") " pod="openstack/mariadb-copy-data" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.467823 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.467946 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\") pod \"mariadb-copy-data\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1d358dc64f518510f03f77302b695510d2d5ae51d4c4621baa56d6577c214de/globalmount\"" pod="openstack/mariadb-copy-data" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.491361 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmgd\" (UniqueName: \"kubernetes.io/projected/e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12-kube-api-access-2cmgd\") pod \"mariadb-copy-data\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") " pod="openstack/mariadb-copy-data" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.501844 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\") pod \"mariadb-copy-data\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") " pod="openstack/mariadb-copy-data" Mar 20 08:43:43 crc kubenswrapper[4971]: I0320 08:43:43.527732 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 20 08:43:44 crc kubenswrapper[4971]: I0320 08:43:44.082693 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 20 08:43:45 crc kubenswrapper[4971]: I0320 08:43:45.047423 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12","Type":"ContainerStarted","Data":"1789c6a8f27abcd140c930426134a0b6bab5c49b07acc356ef45a8cb0e9bbd8c"} Mar 20 08:43:45 crc kubenswrapper[4971]: I0320 08:43:45.047774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12","Type":"ContainerStarted","Data":"d1361c475a265b7fd97a29c9aff58896821a47fe3df136e06084025426a40e75"} Mar 20 08:43:45 crc kubenswrapper[4971]: I0320 08:43:45.069891 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.069869223 podStartE2EDuration="3.069869223s" podCreationTimestamp="2026-03-20 08:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:45.062573722 +0000 UTC m=+6847.042447880" watchObservedRunningTime="2026-03-20 08:43:45.069869223 +0000 UTC m=+6847.049743381" Mar 20 08:43:46 crc kubenswrapper[4971]: E0320 08:43:46.653879 4971 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.119:43556->38.102.83.119:38499: read tcp 38.102.83.119:43556->38.102.83.119:38499: read: connection reset by peer Mar 20 08:43:48 crc kubenswrapper[4971]: I0320 08:43:48.284187 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:48 crc kubenswrapper[4971]: I0320 08:43:48.285381 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:43:48 crc kubenswrapper[4971]: I0320 08:43:48.297857 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:48 crc kubenswrapper[4971]: I0320 08:43:48.346572 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gclxw\" (UniqueName: \"kubernetes.io/projected/ec51406a-4caa-4ca6-a86d-3bad08477500-kube-api-access-gclxw\") pod \"mariadb-client\" (UID: \"ec51406a-4caa-4ca6-a86d-3bad08477500\") " pod="openstack/mariadb-client" Mar 20 08:43:48 crc kubenswrapper[4971]: I0320 08:43:48.448236 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gclxw\" (UniqueName: \"kubernetes.io/projected/ec51406a-4caa-4ca6-a86d-3bad08477500-kube-api-access-gclxw\") pod \"mariadb-client\" (UID: \"ec51406a-4caa-4ca6-a86d-3bad08477500\") " pod="openstack/mariadb-client" Mar 20 08:43:48 crc kubenswrapper[4971]: I0320 08:43:48.469449 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gclxw\" (UniqueName: \"kubernetes.io/projected/ec51406a-4caa-4ca6-a86d-3bad08477500-kube-api-access-gclxw\") pod \"mariadb-client\" (UID: \"ec51406a-4caa-4ca6-a86d-3bad08477500\") " pod="openstack/mariadb-client" Mar 20 08:43:48 crc kubenswrapper[4971]: I0320 08:43:48.605325 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:43:49 crc kubenswrapper[4971]: I0320 08:43:49.085842 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:49 crc kubenswrapper[4971]: W0320 08:43:49.091203 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec51406a_4caa_4ca6_a86d_3bad08477500.slice/crio-9e646d8d20c638c6660bf9d58d58e6c7e797b2be3f7e7445787d4c17c5147dfd WatchSource:0}: Error finding container 9e646d8d20c638c6660bf9d58d58e6c7e797b2be3f7e7445787d4c17c5147dfd: Status 404 returned error can't find the container with id 9e646d8d20c638c6660bf9d58d58e6c7e797b2be3f7e7445787d4c17c5147dfd Mar 20 08:43:50 crc kubenswrapper[4971]: I0320 08:43:50.096158 4971 generic.go:334] "Generic (PLEG): container finished" podID="ec51406a-4caa-4ca6-a86d-3bad08477500" containerID="cecc7433831be4400ecbdbc9420ea265d86f412f043c3c3876cf89f9de3d316a" exitCode=0 Mar 20 08:43:50 crc kubenswrapper[4971]: I0320 08:43:50.096261 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ec51406a-4caa-4ca6-a86d-3bad08477500","Type":"ContainerDied","Data":"cecc7433831be4400ecbdbc9420ea265d86f412f043c3c3876cf89f9de3d316a"} Mar 20 08:43:50 crc kubenswrapper[4971]: I0320 08:43:50.096349 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ec51406a-4caa-4ca6-a86d-3bad08477500","Type":"ContainerStarted","Data":"9e646d8d20c638c6660bf9d58d58e6c7e797b2be3f7e7445787d4c17c5147dfd"} Mar 20 08:43:50 crc kubenswrapper[4971]: I0320 08:43:50.162970 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:43:50 crc kubenswrapper[4971]: 
I0320 08:43:50.163535 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:43:50 crc kubenswrapper[4971]: I0320 08:43:50.163669 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:43:50 crc kubenswrapper[4971]: I0320 08:43:50.164566 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a173279568ba15985897ecc9daca378d7d89e3775153dfc334adc490c1852dc"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:43:50 crc kubenswrapper[4971]: I0320 08:43:50.164715 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://7a173279568ba15985897ecc9daca378d7d89e3775153dfc334adc490c1852dc" gracePeriod=600 Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.108286 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="7a173279568ba15985897ecc9daca378d7d89e3775153dfc334adc490c1852dc" exitCode=0 Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.108346 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"7a173279568ba15985897ecc9daca378d7d89e3775153dfc334adc490c1852dc"} Mar 20 08:43:51 crc 
kubenswrapper[4971]: I0320 08:43:51.108900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"} Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.108932 4971 scope.go:117] "RemoveContainer" containerID="66c1fcf2276534ad864a4ee931635b14a07c0330af3a57e8ed8300895949b30a" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.556599 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.580623 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ec51406a-4caa-4ca6-a86d-3bad08477500/mariadb-client/0.log" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.605945 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.613660 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.706444 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gclxw\" (UniqueName: \"kubernetes.io/projected/ec51406a-4caa-4ca6-a86d-3bad08477500-kube-api-access-gclxw\") pod \"ec51406a-4caa-4ca6-a86d-3bad08477500\" (UID: \"ec51406a-4caa-4ca6-a86d-3bad08477500\") " Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.719195 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.720122 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec51406a-4caa-4ca6-a86d-3bad08477500-kube-api-access-gclxw" (OuterVolumeSpecName: "kube-api-access-gclxw") pod 
"ec51406a-4caa-4ca6-a86d-3bad08477500" (UID: "ec51406a-4caa-4ca6-a86d-3bad08477500"). InnerVolumeSpecName "kube-api-access-gclxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:43:51 crc kubenswrapper[4971]: E0320 08:43:51.720403 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec51406a-4caa-4ca6-a86d-3bad08477500" containerName="mariadb-client" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.720429 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec51406a-4caa-4ca6-a86d-3bad08477500" containerName="mariadb-client" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.720790 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec51406a-4caa-4ca6-a86d-3bad08477500" containerName="mariadb-client" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.721512 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.729553 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.809794 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbhwp\" (UniqueName: \"kubernetes.io/projected/e020b86a-b868-434d-a44c-6b74c8b51ab6-kube-api-access-rbhwp\") pod \"mariadb-client\" (UID: \"e020b86a-b868-434d-a44c-6b74c8b51ab6\") " pod="openstack/mariadb-client" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.809952 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gclxw\" (UniqueName: \"kubernetes.io/projected/ec51406a-4caa-4ca6-a86d-3bad08477500-kube-api-access-gclxw\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.911108 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbhwp\" (UniqueName: 
\"kubernetes.io/projected/e020b86a-b868-434d-a44c-6b74c8b51ab6-kube-api-access-rbhwp\") pod \"mariadb-client\" (UID: \"e020b86a-b868-434d-a44c-6b74c8b51ab6\") " pod="openstack/mariadb-client" Mar 20 08:43:51 crc kubenswrapper[4971]: I0320 08:43:51.944425 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbhwp\" (UniqueName: \"kubernetes.io/projected/e020b86a-b868-434d-a44c-6b74c8b51ab6-kube-api-access-rbhwp\") pod \"mariadb-client\" (UID: \"e020b86a-b868-434d-a44c-6b74c8b51ab6\") " pod="openstack/mariadb-client" Mar 20 08:43:52 crc kubenswrapper[4971]: I0320 08:43:52.083197 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:43:52 crc kubenswrapper[4971]: I0320 08:43:52.127188 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e646d8d20c638c6660bf9d58d58e6c7e797b2be3f7e7445787d4c17c5147dfd" Mar 20 08:43:52 crc kubenswrapper[4971]: I0320 08:43:52.128164 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:43:52 crc kubenswrapper[4971]: I0320 08:43:52.167311 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="ec51406a-4caa-4ca6-a86d-3bad08477500" podUID="e020b86a-b868-434d-a44c-6b74c8b51ab6" Mar 20 08:43:52 crc kubenswrapper[4971]: I0320 08:43:52.417851 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:52 crc kubenswrapper[4971]: W0320 08:43:52.428142 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode020b86a_b868_434d_a44c_6b74c8b51ab6.slice/crio-ad494c0cbb311fb1e805a9792d8fb4299f63a41932138dbbdda4f5216ac211be WatchSource:0}: Error finding container ad494c0cbb311fb1e805a9792d8fb4299f63a41932138dbbdda4f5216ac211be: Status 404 returned error can't find the container with id ad494c0cbb311fb1e805a9792d8fb4299f63a41932138dbbdda4f5216ac211be Mar 20 08:43:52 crc kubenswrapper[4971]: I0320 08:43:52.746338 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec51406a-4caa-4ca6-a86d-3bad08477500" path="/var/lib/kubelet/pods/ec51406a-4caa-4ca6-a86d-3bad08477500/volumes" Mar 20 08:43:53 crc kubenswrapper[4971]: I0320 08:43:53.139253 4971 generic.go:334] "Generic (PLEG): container finished" podID="e020b86a-b868-434d-a44c-6b74c8b51ab6" containerID="587616bdfc047e1abd5d495d9adebd496e6f56c12f28bd9b44b9eadc2b8591bf" exitCode=0 Mar 20 08:43:53 crc kubenswrapper[4971]: I0320 08:43:53.139306 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e020b86a-b868-434d-a44c-6b74c8b51ab6","Type":"ContainerDied","Data":"587616bdfc047e1abd5d495d9adebd496e6f56c12f28bd9b44b9eadc2b8591bf"} Mar 20 08:43:53 crc kubenswrapper[4971]: I0320 08:43:53.139337 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"e020b86a-b868-434d-a44c-6b74c8b51ab6","Type":"ContainerStarted","Data":"ad494c0cbb311fb1e805a9792d8fb4299f63a41932138dbbdda4f5216ac211be"} Mar 20 08:43:54 crc kubenswrapper[4971]: I0320 08:43:54.506245 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:43:54 crc kubenswrapper[4971]: I0320 08:43:54.526708 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_e020b86a-b868-434d-a44c-6b74c8b51ab6/mariadb-client/0.log" Mar 20 08:43:54 crc kubenswrapper[4971]: I0320 08:43:54.552452 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:54 crc kubenswrapper[4971]: I0320 08:43:54.556182 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbhwp\" (UniqueName: \"kubernetes.io/projected/e020b86a-b868-434d-a44c-6b74c8b51ab6-kube-api-access-rbhwp\") pod \"e020b86a-b868-434d-a44c-6b74c8b51ab6\" (UID: \"e020b86a-b868-434d-a44c-6b74c8b51ab6\") " Mar 20 08:43:54 crc kubenswrapper[4971]: I0320 08:43:54.559418 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 20 08:43:54 crc kubenswrapper[4971]: I0320 08:43:54.572005 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e020b86a-b868-434d-a44c-6b74c8b51ab6-kube-api-access-rbhwp" (OuterVolumeSpecName: "kube-api-access-rbhwp") pod "e020b86a-b868-434d-a44c-6b74c8b51ab6" (UID: "e020b86a-b868-434d-a44c-6b74c8b51ab6"). InnerVolumeSpecName "kube-api-access-rbhwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:43:54 crc kubenswrapper[4971]: I0320 08:43:54.658727 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbhwp\" (UniqueName: \"kubernetes.io/projected/e020b86a-b868-434d-a44c-6b74c8b51ab6-kube-api-access-rbhwp\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:54 crc kubenswrapper[4971]: I0320 08:43:54.746386 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e020b86a-b868-434d-a44c-6b74c8b51ab6" path="/var/lib/kubelet/pods/e020b86a-b868-434d-a44c-6b74c8b51ab6/volumes" Mar 20 08:43:55 crc kubenswrapper[4971]: I0320 08:43:55.160053 4971 scope.go:117] "RemoveContainer" containerID="587616bdfc047e1abd5d495d9adebd496e6f56c12f28bd9b44b9eadc2b8591bf" Mar 20 08:43:55 crc kubenswrapper[4971]: I0320 08:43:55.160059 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.161439 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566604-kq4wg"] Mar 20 08:44:00 crc kubenswrapper[4971]: E0320 08:44:00.162902 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e020b86a-b868-434d-a44c-6b74c8b51ab6" containerName="mariadb-client" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.162938 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e020b86a-b868-434d-a44c-6b74c8b51ab6" containerName="mariadb-client" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.163295 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e020b86a-b868-434d-a44c-6b74c8b51ab6" containerName="mariadb-client" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.164247 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566604-kq4wg" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.168418 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.168760 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.172641 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566604-kq4wg"] Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.173147 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.265196 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbdg\" (UniqueName: \"kubernetes.io/projected/c63a4c1d-aa44-4180-a5cc-4261d1694fc9-kube-api-access-xfbdg\") pod \"auto-csr-approver-29566604-kq4wg\" (UID: \"c63a4c1d-aa44-4180-a5cc-4261d1694fc9\") " pod="openshift-infra/auto-csr-approver-29566604-kq4wg" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.367352 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbdg\" (UniqueName: \"kubernetes.io/projected/c63a4c1d-aa44-4180-a5cc-4261d1694fc9-kube-api-access-xfbdg\") pod \"auto-csr-approver-29566604-kq4wg\" (UID: \"c63a4c1d-aa44-4180-a5cc-4261d1694fc9\") " pod="openshift-infra/auto-csr-approver-29566604-kq4wg" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.412535 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbdg\" (UniqueName: \"kubernetes.io/projected/c63a4c1d-aa44-4180-a5cc-4261d1694fc9-kube-api-access-xfbdg\") pod \"auto-csr-approver-29566604-kq4wg\" (UID: \"c63a4c1d-aa44-4180-a5cc-4261d1694fc9\") " 
pod="openshift-infra/auto-csr-approver-29566604-kq4wg" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.521644 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566604-kq4wg" Mar 20 08:44:00 crc kubenswrapper[4971]: I0320 08:44:00.944973 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566604-kq4wg"] Mar 20 08:44:01 crc kubenswrapper[4971]: I0320 08:44:01.242816 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566604-kq4wg" event={"ID":"c63a4c1d-aa44-4180-a5cc-4261d1694fc9","Type":"ContainerStarted","Data":"715a3940f6ddd7c86bce759145fb52274da4168f1dc77fe2d5cb50c92f2ab6fb"} Mar 20 08:44:03 crc kubenswrapper[4971]: I0320 08:44:03.258681 4971 generic.go:334] "Generic (PLEG): container finished" podID="c63a4c1d-aa44-4180-a5cc-4261d1694fc9" containerID="15b57bb6774001b7f0c5f76c93be2bfa8facfe9e68486f422b42901b1ea3a0a6" exitCode=0 Mar 20 08:44:03 crc kubenswrapper[4971]: I0320 08:44:03.258782 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566604-kq4wg" event={"ID":"c63a4c1d-aa44-4180-a5cc-4261d1694fc9","Type":"ContainerDied","Data":"15b57bb6774001b7f0c5f76c93be2bfa8facfe9e68486f422b42901b1ea3a0a6"} Mar 20 08:44:04 crc kubenswrapper[4971]: I0320 08:44:04.581793 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566604-kq4wg" Mar 20 08:44:04 crc kubenswrapper[4971]: I0320 08:44:04.642578 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbdg\" (UniqueName: \"kubernetes.io/projected/c63a4c1d-aa44-4180-a5cc-4261d1694fc9-kube-api-access-xfbdg\") pod \"c63a4c1d-aa44-4180-a5cc-4261d1694fc9\" (UID: \"c63a4c1d-aa44-4180-a5cc-4261d1694fc9\") " Mar 20 08:44:04 crc kubenswrapper[4971]: I0320 08:44:04.650282 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63a4c1d-aa44-4180-a5cc-4261d1694fc9-kube-api-access-xfbdg" (OuterVolumeSpecName: "kube-api-access-xfbdg") pod "c63a4c1d-aa44-4180-a5cc-4261d1694fc9" (UID: "c63a4c1d-aa44-4180-a5cc-4261d1694fc9"). InnerVolumeSpecName "kube-api-access-xfbdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:04 crc kubenswrapper[4971]: I0320 08:44:04.743960 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbdg\" (UniqueName: \"kubernetes.io/projected/c63a4c1d-aa44-4180-a5cc-4261d1694fc9-kube-api-access-xfbdg\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:05 crc kubenswrapper[4971]: I0320 08:44:05.278150 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566604-kq4wg" event={"ID":"c63a4c1d-aa44-4180-a5cc-4261d1694fc9","Type":"ContainerDied","Data":"715a3940f6ddd7c86bce759145fb52274da4168f1dc77fe2d5cb50c92f2ab6fb"} Mar 20 08:44:05 crc kubenswrapper[4971]: I0320 08:44:05.278505 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="715a3940f6ddd7c86bce759145fb52274da4168f1dc77fe2d5cb50c92f2ab6fb" Mar 20 08:44:05 crc kubenswrapper[4971]: I0320 08:44:05.278235 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566604-kq4wg" Mar 20 08:44:05 crc kubenswrapper[4971]: I0320 08:44:05.682062 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566598-78wqb"] Mar 20 08:44:05 crc kubenswrapper[4971]: I0320 08:44:05.693945 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566598-78wqb"] Mar 20 08:44:06 crc kubenswrapper[4971]: I0320 08:44:06.742750 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3285aa80-885b-4a96-9fbf-0170c06603fb" path="/var/lib/kubelet/pods/3285aa80-885b-4a96-9fbf-0170c06603fb/volumes" Mar 20 08:44:08 crc kubenswrapper[4971]: I0320 08:44:08.355937 4971 scope.go:117] "RemoveContainer" containerID="5c83360796f0a8705c327dab6ab3d4dd597ad4f696432c85ced93266b6b43ed0" Mar 20 08:44:22 crc kubenswrapper[4971]: E0320 08:44:22.274459 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.119:44110->38.102.83.119:38499: write tcp 38.102.83.119:44110->38.102.83.119:38499: write: broken pipe Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.507176 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 08:44:24 crc kubenswrapper[4971]: E0320 08:44:24.507863 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c63a4c1d-aa44-4180-a5cc-4261d1694fc9" containerName="oc" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.507875 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63a4c1d-aa44-4180-a5cc-4261d1694fc9" containerName="oc" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.508036 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c63a4c1d-aa44-4180-a5cc-4261d1694fc9" containerName="oc" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.508793 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.514297 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-nc9p9" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.514566 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.516677 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.525195 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.526346 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.540445 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.551224 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.553094 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.562705 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.607684 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.615516 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69ef94eb-93b4-42b8-8f8b-4ad69843586d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.615573 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ef94eb-93b4-42b8-8f8b-4ad69843586d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.615597 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-64572e22-9539-4361-b126-69fd919889de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64572e22-9539-4361-b126-69fd919889de\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.615636 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69ef94eb-93b4-42b8-8f8b-4ad69843586d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.615680 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ef94eb-93b4-42b8-8f8b-4ad69843586d-config\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.615854 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv5f2\" (UniqueName: \"kubernetes.io/projected/69ef94eb-93b4-42b8-8f8b-4ad69843586d-kube-api-access-gv5f2\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.615896 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2bebfec-671e-453b-855a-0a1c4b6a3109-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.615942 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26x8\" (UniqueName: \"kubernetes.io/projected/e2bebfec-671e-453b-855a-0a1c4b6a3109-kube-api-access-s26x8\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.616005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bebfec-671e-453b-855a-0a1c4b6a3109-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.616061 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bebfec-671e-453b-855a-0a1c4b6a3109-config\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.616168 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f5766958-4935-4676-8bbe-8d7ee68035de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5766958-4935-4676-8bbe-8d7ee68035de\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.616197 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2bebfec-671e-453b-855a-0a1c4b6a3109-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.704817 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.705950 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.709982 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-x2q5n" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.712807 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.712866 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717421 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ef94eb-93b4-42b8-8f8b-4ad69843586d-config\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717481 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv5f2\" (UniqueName: \"kubernetes.io/projected/69ef94eb-93b4-42b8-8f8b-4ad69843586d-kube-api-access-gv5f2\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717501 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2bebfec-671e-453b-855a-0a1c4b6a3109-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717530 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tqdm\" (UniqueName: \"kubernetes.io/projected/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-kube-api-access-5tqdm\") pod \"ovsdbserver-nb-2\" (UID: 
\"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717557 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26x8\" (UniqueName: \"kubernetes.io/projected/e2bebfec-671e-453b-855a-0a1c4b6a3109-kube-api-access-s26x8\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717583 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-config\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717612 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bebfec-671e-453b-855a-0a1c4b6a3109-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717654 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bebfec-671e-453b-855a-0a1c4b6a3109-config\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717686 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717725 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717780 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f5766958-4935-4676-8bbe-8d7ee68035de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5766958-4935-4676-8bbe-8d7ee68035de\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717807 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2bebfec-671e-453b-855a-0a1c4b6a3109-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717834 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69ef94eb-93b4-42b8-8f8b-4ad69843586d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717865 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ef94eb-93b4-42b8-8f8b-4ad69843586d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.717891 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-64572e22-9539-4361-b126-69fd919889de\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64572e22-9539-4361-b126-69fd919889de\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.718076 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69ef94eb-93b4-42b8-8f8b-4ad69843586d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.718104 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.718138 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ead2e559-1406-4436-9c76-604e13ffafe2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead2e559-1406-4436-9c76-604e13ffafe2\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.718871 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2bebfec-671e-453b-855a-0a1c4b6a3109-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.719903 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ef94eb-93b4-42b8-8f8b-4ad69843586d-config\") pod 
\"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.723957 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69ef94eb-93b4-42b8-8f8b-4ad69843586d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.724359 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69ef94eb-93b4-42b8-8f8b-4ad69843586d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.724948 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.726276 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bebfec-671e-453b-855a-0a1c4b6a3109-config\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.727442 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bebfec-671e-453b-855a-0a1c4b6a3109-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.727647 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ef94eb-93b4-42b8-8f8b-4ad69843586d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " 
pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.729390 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.729425 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-64572e22-9539-4361-b126-69fd919889de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64572e22-9539-4361-b126-69fd919889de\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/91c8e52732d5ee88636c2765a0613046f42b37fdf33cab317a33e934050ef45f/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.730753 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2bebfec-671e-453b-855a-0a1c4b6a3109-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.730959 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.730984 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f5766958-4935-4676-8bbe-8d7ee68035de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5766958-4935-4676-8bbe-8d7ee68035de\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/05ac7742a1a31c5f010e9a183aedce4f1dcdc38fdbf9dbc4e3348bd359616774/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.743666 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv5f2\" (UniqueName: \"kubernetes.io/projected/69ef94eb-93b4-42b8-8f8b-4ad69843586d-kube-api-access-gv5f2\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.747243 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.749739 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.750069 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.751455 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.755235 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.770902 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.779655 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26x8\" (UniqueName: \"kubernetes.io/projected/e2bebfec-671e-453b-855a-0a1c4b6a3109-kube-api-access-s26x8\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.785516 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-64572e22-9539-4361-b126-69fd919889de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64572e22-9539-4361-b126-69fd919889de\") pod \"ovsdbserver-nb-1\" (UID: \"69ef94eb-93b4-42b8-8f8b-4ad69843586d\") " pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.819513 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f5766958-4935-4676-8bbe-8d7ee68035de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5766958-4935-4676-8bbe-8d7ee68035de\") pod \"ovsdbserver-nb-0\" (UID: \"e2bebfec-671e-453b-855a-0a1c4b6a3109\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.819816 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19df11e-015e-4c86-8369-c2bafe67a087-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.820011 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.820124 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f19df11e-015e-4c86-8369-c2bafe67a087-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.820256 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.820393 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f19df11e-015e-4c86-8369-c2bafe67a087-config\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.820535 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.820659 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wftwt\" (UniqueName: 
\"kubernetes.io/projected/f19df11e-015e-4c86-8369-c2bafe67a087-kube-api-access-wftwt\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.820771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ead2e559-1406-4436-9c76-604e13ffafe2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead2e559-1406-4436-9c76-604e13ffafe2\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.820892 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a2b19c4f-c3bf-4d64-9d09-a1afa3453b98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a2b19c4f-c3bf-4d64-9d09-a1afa3453b98\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.821207 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.821315 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.822930 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f19df11e-015e-4c86-8369-c2bafe67a087-scripts\") pod 
\"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.823124 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tqdm\" (UniqueName: \"kubernetes.io/projected/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-kube-api-access-5tqdm\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.823502 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-config\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.824350 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.824381 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ead2e559-1406-4436-9c76-604e13ffafe2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead2e559-1406-4436-9c76-604e13ffafe2\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e467fb75eb0e014c8e9e801277a8b663eb64046cf4b74d57fd3a844fdc915337/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.824398 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-config\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.825116 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.831717 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.843501 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tqdm\" (UniqueName: \"kubernetes.io/projected/66a65a08-59f0-4d4f-bde1-db66ae04d2ff-kube-api-access-5tqdm\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.847989 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ead2e559-1406-4436-9c76-604e13ffafe2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead2e559-1406-4436-9c76-604e13ffafe2\") pod \"ovsdbserver-nb-2\" (UID: \"66a65a08-59f0-4d4f-bde1-db66ae04d2ff\") " pod="openstack/ovsdbserver-nb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.888361 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925068 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a2b19c4f-c3bf-4d64-9d09-a1afa3453b98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a2b19c4f-c3bf-4d64-9d09-a1afa3453b98\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925131 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-53021351-e6c4-472b-8bef-2f7f0634f17b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53021351-e6c4-472b-8bef-2f7f0634f17b\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925160 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/f19df11e-015e-4c86-8369-c2bafe67a087-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925188 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925222 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1ccca278-7fdd-455b-a31d-02c7412d79b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ccca278-7fdd-455b-a31d-02c7412d79b3\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925247 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19df11e-015e-4c86-8369-c2bafe67a087-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925267 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925284 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/f19df11e-015e-4c86-8369-c2bafe67a087-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925304 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de027a38-7366-4724-b961-dd311e4cfd46-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925320 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de027a38-7366-4724-b961-dd311e4cfd46-config\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925344 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-config\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925361 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de027a38-7366-4724-b961-dd311e4cfd46-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925384 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dbkp\" (UniqueName: \"kubernetes.io/projected/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-kube-api-access-2dbkp\") pod 
\"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925408 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f19df11e-015e-4c86-8369-c2bafe67a087-config\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925426 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de027a38-7366-4724-b961-dd311e4cfd46-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.925448 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.926370 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wftwt\" (UniqueName: \"kubernetes.io/projected/f19df11e-015e-4c86-8369-c2bafe67a087-kube-api-access-wftwt\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.926407 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjld\" (UniqueName: \"kubernetes.io/projected/de027a38-7366-4724-b961-dd311e4cfd46-kube-api-access-5kjld\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:24 crc 
kubenswrapper[4971]: I0320 08:44:24.926702 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f19df11e-015e-4c86-8369-c2bafe67a087-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.927282 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.927302 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a2b19c4f-c3bf-4d64-9d09-a1afa3453b98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a2b19c4f-c3bf-4d64-9d09-a1afa3453b98\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3de3bcd879f229f5b8018dafe1e05438247d82397e644ad3374158092d0d8223/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.927309 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f19df11e-015e-4c86-8369-c2bafe67a087-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.933198 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19df11e-015e-4c86-8369-c2bafe67a087-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.933437 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f19df11e-015e-4c86-8369-c2bafe67a087-config\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.948147 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wftwt\" (UniqueName: \"kubernetes.io/projected/f19df11e-015e-4c86-8369-c2bafe67a087-kube-api-access-wftwt\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:24 crc kubenswrapper[4971]: I0320 08:44:24.961685 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a2b19c4f-c3bf-4d64-9d09-a1afa3453b98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a2b19c4f-c3bf-4d64-9d09-a1afa3453b98\") pod \"ovsdbserver-sb-0\" (UID: \"f19df11e-015e-4c86-8369-c2bafe67a087\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.028194 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1ccca278-7fdd-455b-a31d-02c7412d79b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ccca278-7fdd-455b-a31d-02c7412d79b3\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.028250 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.028274 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de027a38-7366-4724-b961-dd311e4cfd46-ovsdb-rundir\") pod 
\"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.028812 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de027a38-7366-4724-b961-dd311e4cfd46-config\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.028846 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-config\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.028883 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de027a38-7366-4724-b961-dd311e4cfd46-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.028906 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dbkp\" (UniqueName: \"kubernetes.io/projected/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-kube-api-access-2dbkp\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.029031 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de027a38-7366-4724-b961-dd311e4cfd46-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.029080 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de027a38-7366-4724-b961-dd311e4cfd46-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.029601 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de027a38-7366-4724-b961-dd311e4cfd46-config\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.029779 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.030082 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjld\" (UniqueName: \"kubernetes.io/projected/de027a38-7366-4724-b961-dd311e4cfd46-kube-api-access-5kjld\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.030098 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-config\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.030196 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-53021351-e6c4-472b-8bef-2f7f0634f17b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53021351-e6c4-472b-8bef-2f7f0634f17b\") pod \"ovsdbserver-sb-1\" (UID: 
\"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.030273 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.030785 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.030953 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1" Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.032177 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.032203 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1ccca278-7fdd-455b-a31d-02c7412d79b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ccca278-7fdd-455b-a31d-02c7412d79b3\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a5d390389551cca99795addc7b3867a18084a3edbd7a6b6a8ba753d758312f0e/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.032767 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.032789 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-53021351-e6c4-472b-8bef-2f7f0634f17b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53021351-e6c4-472b-8bef-2f7f0634f17b\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ea98f9e4e0efa38c70dffcd13651955230f7f80031749b072674840d4b8f7047/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.033769 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de027a38-7366-4724-b961-dd311e4cfd46-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.034800 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.036482 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de027a38-7366-4724-b961-dd311e4cfd46-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.051847 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjld\" (UniqueName: \"kubernetes.io/projected/de027a38-7366-4724-b961-dd311e4cfd46-kube-api-access-5kjld\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.052926 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dbkp\" (UniqueName: \"kubernetes.io/projected/10014eeb-477f-4f79-ad5d-6412a0ba2dfd-kube-api-access-2dbkp\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.059180 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1ccca278-7fdd-455b-a31d-02c7412d79b3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ccca278-7fdd-455b-a31d-02c7412d79b3\") pod \"ovsdbserver-sb-2\" (UID: \"de027a38-7366-4724-b961-dd311e4cfd46\") " pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.062105 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-53021351-e6c4-472b-8bef-2f7f0634f17b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53021351-e6c4-472b-8bef-2f7f0634f17b\") pod \"ovsdbserver-sb-1\" (UID: \"10014eeb-477f-4f79-ad5d-6412a0ba2dfd\") " pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.124155 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.146773 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.186129 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.193155 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.272009 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.377323 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.491327 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"69ef94eb-93b4-42b8-8f8b-4ad69843586d","Type":"ContainerStarted","Data":"24e58d02ee6c397695f23e89753e2ed720de99a2902dc7c20b66e4c3feb56f2f"}
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.492387 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e2bebfec-671e-453b-855a-0a1c4b6a3109","Type":"ContainerStarted","Data":"190880601ad129b8b742a4b4eea7f874fdd7e1b800d90e98f42fa444260134ad"}
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.633943 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 20 08:44:25 crc kubenswrapper[4971]: W0320 08:44:25.637164 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf19df11e_015e_4c86_8369_c2bafe67a087.slice/crio-cb7632810c4b27f1e190b169d51bf3c56cebd2dd4719f341737b63fa6e7cfa07 WatchSource:0}: Error finding container cb7632810c4b27f1e190b169d51bf3c56cebd2dd4719f341737b63fa6e7cfa07: Status 404 returned error can't find the container with id cb7632810c4b27f1e190b169d51bf3c56cebd2dd4719f341737b63fa6e7cfa07
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.811428 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 20 08:44:25 crc kubenswrapper[4971]: W0320 08:44:25.826969 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10014eeb_477f_4f79_ad5d_6412a0ba2dfd.slice/crio-eebb3ea0b75d81f5d127cc679ebca4fb170425f9b1b916938ef48c6bb803ae59 WatchSource:0}: Error finding container eebb3ea0b75d81f5d127cc679ebca4fb170425f9b1b916938ef48c6bb803ae59: Status 404 returned error can't find the container with id eebb3ea0b75d81f5d127cc679ebca4fb170425f9b1b916938ef48c6bb803ae59
Mar 20 08:44:25 crc kubenswrapper[4971]: I0320 08:44:25.888341 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Mar 20 08:44:25 crc kubenswrapper[4971]: W0320 08:44:25.894336 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66a65a08_59f0_4d4f_bde1_db66ae04d2ff.slice/crio-c72facc370e8eea3d0a17c960fbfad6b6137c55c7b8a45da4a55547874d336ff WatchSource:0}: Error finding container c72facc370e8eea3d0a17c960fbfad6b6137c55c7b8a45da4a55547874d336ff: Status 404 returned error can't find the container with id c72facc370e8eea3d0a17c960fbfad6b6137c55c7b8a45da4a55547874d336ff
Mar 20 08:44:26 crc kubenswrapper[4971]: I0320 08:44:26.505774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"10014eeb-477f-4f79-ad5d-6412a0ba2dfd","Type":"ContainerStarted","Data":"eebb3ea0b75d81f5d127cc679ebca4fb170425f9b1b916938ef48c6bb803ae59"}
Mar 20 08:44:26 crc kubenswrapper[4971]: I0320 08:44:26.508958 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"66a65a08-59f0-4d4f-bde1-db66ae04d2ff","Type":"ContainerStarted","Data":"c72facc370e8eea3d0a17c960fbfad6b6137c55c7b8a45da4a55547874d336ff"}
Mar 20 08:44:26 crc kubenswrapper[4971]: I0320 08:44:26.513640 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f19df11e-015e-4c86-8369-c2bafe67a087","Type":"ContainerStarted","Data":"cb7632810c4b27f1e190b169d51bf3c56cebd2dd4719f341737b63fa6e7cfa07"}
Mar 20 08:44:26 crc kubenswrapper[4971]: I0320 08:44:26.743510 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 20 08:44:26 crc kubenswrapper[4971]: W0320 08:44:26.756947 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde027a38_7366_4724_b961_dd311e4cfd46.slice/crio-f35adb5dd751944df3f9f3492fbc9871a0b07159526813385ed2a3e800d69cfb WatchSource:0}: Error finding container f35adb5dd751944df3f9f3492fbc9871a0b07159526813385ed2a3e800d69cfb: Status 404 returned error can't find the container with id f35adb5dd751944df3f9f3492fbc9871a0b07159526813385ed2a3e800d69cfb
Mar 20 08:44:27 crc kubenswrapper[4971]: I0320 08:44:27.523755 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"de027a38-7366-4724-b961-dd311e4cfd46","Type":"ContainerStarted","Data":"f35adb5dd751944df3f9f3492fbc9871a0b07159526813385ed2a3e800d69cfb"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.551077 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"69ef94eb-93b4-42b8-8f8b-4ad69843586d","Type":"ContainerStarted","Data":"17df0a570488756acebdacce2a181d4935d277b8bd43303276ac908a3b9a17ee"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.551333 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"69ef94eb-93b4-42b8-8f8b-4ad69843586d","Type":"ContainerStarted","Data":"e2677e6580ad45945e28ef32bcff1c8c7233122d3e7c4829ab41fbd8e847f5ac"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.553394 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e2bebfec-671e-453b-855a-0a1c4b6a3109","Type":"ContainerStarted","Data":"8501cac49ba90734036d8252adbda809fc9e011c2d4e5832781e74c6d7bb028e"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.553449 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e2bebfec-671e-453b-855a-0a1c4b6a3109","Type":"ContainerStarted","Data":"49796598b4edd48f8b30346936b37fa42fa331f10c0f9a3d969c28840efa59a9"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.556347 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f19df11e-015e-4c86-8369-c2bafe67a087","Type":"ContainerStarted","Data":"bb4e1198ac04ade69d655baa98f64db4cd6737c6698277b9e3436497380e6a92"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.563964 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"10014eeb-477f-4f79-ad5d-6412a0ba2dfd","Type":"ContainerStarted","Data":"31254cf0853917f1499e25a281e62f8034710ba93265fd8036e1b1194f15a024"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.564121 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"10014eeb-477f-4f79-ad5d-6412a0ba2dfd","Type":"ContainerStarted","Data":"cea32bdb9cbbb5e1460fef1ac4c1a1fdb6475a61969ca0396826444979c4f078"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.566511 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"66a65a08-59f0-4d4f-bde1-db66ae04d2ff","Type":"ContainerStarted","Data":"82a6647816cfdaffb87fc4c884ac1e5ba9a5ff165fd2443c78b6475e18d4c325"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.569338 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"de027a38-7366-4724-b961-dd311e4cfd46","Type":"ContainerStarted","Data":"f6c20d632e5f645323628260cdc36dde9b0fe7b2213c78c77e83fd062ec583ac"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.569383 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"de027a38-7366-4724-b961-dd311e4cfd46","Type":"ContainerStarted","Data":"84b9c9096a6bc2ee89ccfcf8252cf19df4d78c0c0cc8dc1cd286fb97fba54c5b"}
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.571489 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=2.996175961 podStartE2EDuration="7.571471216s" podCreationTimestamp="2026-03-20 08:44:23 +0000 UTC" firstStartedPulling="2026-03-20 08:44:25.299926695 +0000 UTC m=+6887.279800833" lastFinishedPulling="2026-03-20 08:44:29.87522195 +0000 UTC m=+6891.855096088" observedRunningTime="2026-03-20 08:44:30.567139313 +0000 UTC m=+6892.547013471" watchObservedRunningTime="2026-03-20 08:44:30.571471216 +0000 UTC m=+6892.551345354"
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.584585 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.018466963 podStartE2EDuration="7.584564708s" podCreationTimestamp="2026-03-20 08:44:23 +0000 UTC" firstStartedPulling="2026-03-20 08:44:25.38083113 +0000 UTC m=+6887.360705268" lastFinishedPulling="2026-03-20 08:44:29.946928875 +0000 UTC m=+6891.926803013" observedRunningTime="2026-03-20 08:44:30.580630946 +0000 UTC m=+6892.560505104" watchObservedRunningTime="2026-03-20 08:44:30.584564708 +0000 UTC m=+6892.564438836"
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.600363 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.596437016 podStartE2EDuration="7.600347851s" podCreationTimestamp="2026-03-20 08:44:23 +0000 UTC" firstStartedPulling="2026-03-20 08:44:25.896577916 +0000 UTC m=+6887.876452054" lastFinishedPulling="2026-03-20 08:44:29.900488741 +0000 UTC m=+6891.880362889" observedRunningTime="2026-03-20 08:44:30.595587897 +0000 UTC m=+6892.575462035" watchObservedRunningTime="2026-03-20 08:44:30.600347851 +0000 UTC m=+6892.580221989"
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.617811 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.572225813 podStartE2EDuration="7.617792277s" podCreationTimestamp="2026-03-20 08:44:23 +0000 UTC" firstStartedPulling="2026-03-20 08:44:25.830902309 +0000 UTC m=+6887.810776457" lastFinishedPulling="2026-03-20 08:44:29.876468783 +0000 UTC m=+6891.856342921" observedRunningTime="2026-03-20 08:44:30.610659091 +0000 UTC m=+6892.590533239" watchObservedRunningTime="2026-03-20 08:44:30.617792277 +0000 UTC m=+6892.597666415"
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.629831 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.394599709 podStartE2EDuration="7.629813232s" podCreationTimestamp="2026-03-20 08:44:23 +0000 UTC" firstStartedPulling="2026-03-20 08:44:25.640051678 +0000 UTC m=+6887.619925826" lastFinishedPulling="2026-03-20 08:44:29.875265211 +0000 UTC m=+6891.855139349" observedRunningTime="2026-03-20 08:44:30.627721377 +0000 UTC m=+6892.607595515" watchObservedRunningTime="2026-03-20 08:44:30.629813232 +0000 UTC m=+6892.609687370"
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.648030 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.440247131 podStartE2EDuration="7.648012898s" podCreationTimestamp="2026-03-20 08:44:23 +0000 UTC" firstStartedPulling="2026-03-20 08:44:26.766102113 +0000 UTC m=+6888.745976251" lastFinishedPulling="2026-03-20 08:44:29.97386788 +0000 UTC m=+6891.953742018" observedRunningTime="2026-03-20 08:44:30.64542211 +0000 UTC m=+6892.625296248" watchObservedRunningTime="2026-03-20 08:44:30.648012898 +0000 UTC m=+6892.627887036"
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.831872 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 20 08:44:30 crc kubenswrapper[4971]: I0320 08:44:30.889318 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Mar 20 08:44:31 crc kubenswrapper[4971]: I0320 08:44:31.124257 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 20 08:44:31 crc kubenswrapper[4971]: I0320 08:44:31.147281 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Mar 20 08:44:31 crc kubenswrapper[4971]: I0320 08:44:31.186249 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:31 crc kubenswrapper[4971]: I0320 08:44:31.193641 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:31 crc kubenswrapper[4971]: I0320 08:44:31.578692 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f19df11e-015e-4c86-8369-c2bafe67a087","Type":"ContainerStarted","Data":"7086ce0e46aa139d665076fc265e22f5df2fced6e874874e46e483bae7c1758f"}
Mar 20 08:44:31 crc kubenswrapper[4971]: I0320 08:44:31.580623 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"66a65a08-59f0-4d4f-bde1-db66ae04d2ff","Type":"ContainerStarted","Data":"8747f4c489a01b2c4bd7600cb1b77576a24c8c7fd087aca094cf53c3e9aba9ed"}
Mar 20 08:44:33 crc kubenswrapper[4971]: I0320 08:44:33.869769 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 20 08:44:33 crc kubenswrapper[4971]: I0320 08:44:33.870469 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 20 08:44:33 crc kubenswrapper[4971]: I0320 08:44:33.922354 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Mar 20 08:44:33 crc kubenswrapper[4971]: I0320 08:44:33.922750 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Mar 20 08:44:34 crc kubenswrapper[4971]: I0320 08:44:34.168945 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 20 08:44:34 crc kubenswrapper[4971]: I0320 08:44:34.169339 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 20 08:44:34 crc kubenswrapper[4971]: I0320 08:44:34.195376 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Mar 20 08:44:34 crc kubenswrapper[4971]: I0320 08:44:34.195847 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Mar 20 08:44:34 crc kubenswrapper[4971]: I0320 08:44:34.246969 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:34 crc kubenswrapper[4971]: I0320 08:44:34.247295 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:34 crc kubenswrapper[4971]: I0320 08:44:34.254400 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:34 crc kubenswrapper[4971]: I0320 08:44:34.254584 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.185218 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.188117 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.231370 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.241170 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.410952 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d5dc9599-tks8s"]
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.412710 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.415093 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.426408 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d5dc9599-tks8s"]
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.537592 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z65qb\" (UniqueName: \"kubernetes.io/projected/e86a37d9-5e82-4c79-8486-163e398ae582-kube-api-access-z65qb\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.537703 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-dns-svc\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.537755 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-ovsdbserver-sb\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.537796 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-config\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.550926 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d5dc9599-tks8s"]
Mar 20 08:44:35 crc kubenswrapper[4971]: E0320 08:44:35.552189 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-z65qb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-86d5dc9599-tks8s" podUID="e86a37d9-5e82-4c79-8486-163e398ae582"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.588301 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5db489d56c-jxfx5"]
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.589813 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.595480 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.598701 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db489d56c-jxfx5"]
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.624686 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.635819 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.639829 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z65qb\" (UniqueName: \"kubernetes.io/projected/e86a37d9-5e82-4c79-8486-163e398ae582-kube-api-access-z65qb\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.639916 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-dns-svc\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.639984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-ovsdbserver-sb\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.640035 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5qtv\" (UniqueName: \"kubernetes.io/projected/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-kube-api-access-c5qtv\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.640068 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-config\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.640128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.640156 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.640193 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-config\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.640229 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-dns-svc\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.642127 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-dns-svc\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.642151 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-ovsdbserver-sb\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.642729 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-config\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.671113 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z65qb\" (UniqueName: \"kubernetes.io/projected/e86a37d9-5e82-4c79-8486-163e398ae582-kube-api-access-z65qb\") pod \"dnsmasq-dns-86d5dc9599-tks8s\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") " pod="openstack/dnsmasq-dns-86d5dc9599-tks8s"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.741294 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-config\") pod \"e86a37d9-5e82-4c79-8486-163e398ae582\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") "
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.741705 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-ovsdbserver-sb\") pod \"e86a37d9-5e82-4c79-8486-163e398ae582\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") "
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.741822 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z65qb\" (UniqueName: \"kubernetes.io/projected/e86a37d9-5e82-4c79-8486-163e398ae582-kube-api-access-z65qb\") pod \"e86a37d9-5e82-4c79-8486-163e398ae582\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") "
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.741996 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-config" (OuterVolumeSpecName: "config") pod "e86a37d9-5e82-4c79-8486-163e398ae582" (UID: "e86a37d9-5e82-4c79-8486-163e398ae582"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.742077 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e86a37d9-5e82-4c79-8486-163e398ae582" (UID: "e86a37d9-5e82-4c79-8486-163e398ae582"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.742028 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-dns-svc\") pod \"e86a37d9-5e82-4c79-8486-163e398ae582\" (UID: \"e86a37d9-5e82-4c79-8486-163e398ae582\") "
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.742502 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.742564 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.742642 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-config\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.742711 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-dns-svc\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.742817 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e86a37d9-5e82-4c79-8486-163e398ae582" (UID: "e86a37d9-5e82-4c79-8486-163e398ae582"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.743005 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5qtv\" (UniqueName: \"kubernetes.io/projected/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-kube-api-access-c5qtv\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.743097 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.743118 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.743134 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86a37d9-5e82-4c79-8486-163e398ae582-config\") on node \"crc\" DevicePath \"\""
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.743803 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-config\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.744526 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-dns-svc\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.744822 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.744878 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.745440 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86a37d9-5e82-4c79-8486-163e398ae582-kube-api-access-z65qb" (OuterVolumeSpecName: "kube-api-access-z65qb") pod "e86a37d9-5e82-4c79-8486-163e398ae582" (UID: "e86a37d9-5e82-4c79-8486-163e398ae582"). InnerVolumeSpecName "kube-api-access-z65qb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.767243 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5qtv\" (UniqueName: \"kubernetes.io/projected/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-kube-api-access-c5qtv\") pod \"dnsmasq-dns-5db489d56c-jxfx5\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.844990 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z65qb\" (UniqueName: \"kubernetes.io/projected/e86a37d9-5e82-4c79-8486-163e398ae582-kube-api-access-z65qb\") on node \"crc\" DevicePath \"\""
Mar 20 08:44:35 crc kubenswrapper[4971]: I0320 08:44:35.912430 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5"
Mar 20 08:44:36 crc kubenswrapper[4971]: I0320 08:44:36.389228 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db489d56c-jxfx5"]
Mar 20 08:44:36 crc kubenswrapper[4971]: I0320 08:44:36.643008 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d5dc9599-tks8s" Mar 20 08:44:36 crc kubenswrapper[4971]: I0320 08:44:36.643194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" event={"ID":"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c","Type":"ContainerStarted","Data":"b1018ba4ff8caea1f09d4e8e2b9c8bde07989722fcd05800a6b46f11c3284932"} Mar 20 08:44:36 crc kubenswrapper[4971]: I0320 08:44:36.829315 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d5dc9599-tks8s"] Mar 20 08:44:36 crc kubenswrapper[4971]: I0320 08:44:36.836025 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d5dc9599-tks8s"] Mar 20 08:44:37 crc kubenswrapper[4971]: I0320 08:44:37.653277 4971 generic.go:334] "Generic (PLEG): container finished" podID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" containerID="7e6703c2cbb16d1ba5ede75f4f58d39149556f72188f5544d3c676d2e746b6ce" exitCode=0 Mar 20 08:44:37 crc kubenswrapper[4971]: I0320 08:44:37.653340 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" event={"ID":"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c","Type":"ContainerDied","Data":"7e6703c2cbb16d1ba5ede75f4f58d39149556f72188f5544d3c676d2e746b6ce"} Mar 20 08:44:38 crc kubenswrapper[4971]: I0320 08:44:38.671089 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" event={"ID":"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c","Type":"ContainerStarted","Data":"48e9c08e2cc299fd6646bdeae2420e3149dee56a25923ef2635e1d90643445c4"} Mar 20 08:44:38 crc kubenswrapper[4971]: I0320 08:44:38.671594 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" Mar 20 08:44:38 crc kubenswrapper[4971]: I0320 08:44:38.698373 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" podStartSLOduration=3.698352927 
podStartE2EDuration="3.698352927s" podCreationTimestamp="2026-03-20 08:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:38.689923117 +0000 UTC m=+6900.669797275" watchObservedRunningTime="2026-03-20 08:44:38.698352927 +0000 UTC m=+6900.678227055" Mar 20 08:44:38 crc kubenswrapper[4971]: I0320 08:44:38.747084 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86a37d9-5e82-4c79-8486-163e398ae582" path="/var/lib/kubelet/pods/e86a37d9-5e82-4c79-8486-163e398ae582/volumes" Mar 20 08:44:39 crc kubenswrapper[4971]: I0320 08:44:39.871641 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 08:44:39 crc kubenswrapper[4971]: I0320 08:44:39.945056 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 20 08:44:42 crc kubenswrapper[4971]: I0320 08:44:42.868800 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 20 08:44:42 crc kubenswrapper[4971]: I0320 08:44:42.870499 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 20 08:44:42 crc kubenswrapper[4971]: I0320 08:44:42.878493 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 20 08:44:42 crc kubenswrapper[4971]: I0320 08:44:42.902423 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 20 08:44:42 crc kubenswrapper[4971]: I0320 08:44:42.977970 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr7mg\" (UniqueName: \"kubernetes.io/projected/a20e6b72-7aa2-4468-b8e3-e13011048d27-kube-api-access-dr7mg\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:42 crc kubenswrapper[4971]: I0320 08:44:42.978266 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:42 crc kubenswrapper[4971]: I0320 08:44:42.978306 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a20e6b72-7aa2-4468-b8e3-e13011048d27-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.079358 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr7mg\" (UniqueName: \"kubernetes.io/projected/a20e6b72-7aa2-4468-b8e3-e13011048d27-kube-api-access-dr7mg\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.079722 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.079893 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a20e6b72-7aa2-4468-b8e3-e13011048d27-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.085555 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.085888 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/77a5644527f42843466c819918cb20319d85fd7ace9ea60aed635fe2804c292e/globalmount\"" pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.095665 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a20e6b72-7aa2-4468-b8e3-e13011048d27-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.108318 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr7mg\" (UniqueName: 
\"kubernetes.io/projected/a20e6b72-7aa2-4468-b8e3-e13011048d27-kube-api-access-dr7mg\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.131510 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\") pod \"ovn-copy-data\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.206759 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 20 08:44:43 crc kubenswrapper[4971]: I0320 08:44:43.528907 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 20 08:44:45 crc kubenswrapper[4971]: I0320 08:44:45.080017 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a20e6b72-7aa2-4468-b8e3-e13011048d27","Type":"ContainerStarted","Data":"e855ba31df1a2ff8e58e2e04c16bdf015d243ff41bb4e0cd8057a25fefd79739"} Mar 20 08:44:45 crc kubenswrapper[4971]: I0320 08:44:45.915824 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.015555 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b5c84b9cc-8pkhg"] Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.016154 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" podUID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" containerName="dnsmasq-dns" containerID="cri-o://23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4" gracePeriod=10 Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.089004 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a20e6b72-7aa2-4468-b8e3-e13011048d27","Type":"ContainerStarted","Data":"8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7"} Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.108036 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.489796592 podStartE2EDuration="5.108012446s" podCreationTimestamp="2026-03-20 08:44:41 +0000 UTC" firstStartedPulling="2026-03-20 08:44:43.542045911 +0000 UTC m=+6905.521920089" lastFinishedPulling="2026-03-20 08:44:45.160261815 +0000 UTC m=+6907.140135943" observedRunningTime="2026-03-20 08:44:46.106141317 +0000 UTC m=+6908.086015455" watchObservedRunningTime="2026-03-20 08:44:46.108012446 +0000 UTC m=+6908.087886584" Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.473117 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.669042 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-dns-svc\") pod \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.669236 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-config\") pod \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.669284 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbskr\" (UniqueName: \"kubernetes.io/projected/a18dea62-5df6-4d36-b9c2-f9bdf096c784-kube-api-access-jbskr\") pod 
\"a18dea62-5df6-4d36-b9c2-f9bdf096c784\" (UID: \"a18dea62-5df6-4d36-b9c2-f9bdf096c784\") " Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.681722 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18dea62-5df6-4d36-b9c2-f9bdf096c784-kube-api-access-jbskr" (OuterVolumeSpecName: "kube-api-access-jbskr") pod "a18dea62-5df6-4d36-b9c2-f9bdf096c784" (UID: "a18dea62-5df6-4d36-b9c2-f9bdf096c784"). InnerVolumeSpecName "kube-api-access-jbskr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.717338 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-config" (OuterVolumeSpecName: "config") pod "a18dea62-5df6-4d36-b9c2-f9bdf096c784" (UID: "a18dea62-5df6-4d36-b9c2-f9bdf096c784"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.724700 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a18dea62-5df6-4d36-b9c2-f9bdf096c784" (UID: "a18dea62-5df6-4d36-b9c2-f9bdf096c784"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.771013 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.771065 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbskr\" (UniqueName: \"kubernetes.io/projected/a18dea62-5df6-4d36-b9c2-f9bdf096c784-kube-api-access-jbskr\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:46 crc kubenswrapper[4971]: I0320 08:44:46.771085 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18dea62-5df6-4d36-b9c2-f9bdf096c784-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.098103 4971 generic.go:334] "Generic (PLEG): container finished" podID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" containerID="23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4" exitCode=0 Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.098170 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.098217 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" event={"ID":"a18dea62-5df6-4d36-b9c2-f9bdf096c784","Type":"ContainerDied","Data":"23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4"} Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.098250 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b5c84b9cc-8pkhg" event={"ID":"a18dea62-5df6-4d36-b9c2-f9bdf096c784","Type":"ContainerDied","Data":"abc960e14e140efbe8503dc76b40d3cdfef67c1365ffe48802bc23c7d97d393e"} Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.098271 4971 scope.go:117] "RemoveContainer" containerID="23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4" Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.125705 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b5c84b9cc-8pkhg"] Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.128721 4971 scope.go:117] "RemoveContainer" containerID="dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396" Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.129961 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b5c84b9cc-8pkhg"] Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.148835 4971 scope.go:117] "RemoveContainer" containerID="23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4" Mar 20 08:44:47 crc kubenswrapper[4971]: E0320 08:44:47.149659 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4\": container with ID starting with 23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4 not found: ID does not exist" 
containerID="23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4" Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.149819 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4"} err="failed to get container status \"23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4\": rpc error: code = NotFound desc = could not find container \"23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4\": container with ID starting with 23f4ca7608b466e4ebf73e557f57a795e566f94344921867720a8c451bd18af4 not found: ID does not exist" Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.149926 4971 scope.go:117] "RemoveContainer" containerID="dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396" Mar 20 08:44:47 crc kubenswrapper[4971]: E0320 08:44:47.150629 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396\": container with ID starting with dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396 not found: ID does not exist" containerID="dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396" Mar 20 08:44:47 crc kubenswrapper[4971]: I0320 08:44:47.150753 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396"} err="failed to get container status \"dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396\": rpc error: code = NotFound desc = could not find container \"dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396\": container with ID starting with dda263a3079b00c4a13ffc54a4850966eda926086be90d6f279ed86258c9b396 not found: ID does not exist" Mar 20 08:44:48 crc kubenswrapper[4971]: I0320 08:44:48.749267 4971 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" path="/var/lib/kubelet/pods/a18dea62-5df6-4d36-b9c2-f9bdf096c784/volumes" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.007469 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:44:54 crc kubenswrapper[4971]: E0320 08:44:54.008263 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" containerName="init" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.008277 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" containerName="init" Mar 20 08:44:54 crc kubenswrapper[4971]: E0320 08:44:54.008292 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" containerName="dnsmasq-dns" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.008300 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" containerName="dnsmasq-dns" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.008469 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18dea62-5df6-4d36-b9c2-f9bdf096c784" containerName="dnsmasq-dns" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.009431 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.012097 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.012541 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.013049 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-2cmbn" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.029793 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.094947 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrgq4\" (UniqueName: \"kubernetes.io/projected/331e05d9-b918-47df-94c6-ad8f16ad4f05-kube-api-access-wrgq4\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.095185 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/331e05d9-b918-47df-94c6-ad8f16ad4f05-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.095250 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331e05d9-b918-47df-94c6-ad8f16ad4f05-config\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.095333 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331e05d9-b918-47df-94c6-ad8f16ad4f05-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.095404 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/331e05d9-b918-47df-94c6-ad8f16ad4f05-scripts\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.197463 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331e05d9-b918-47df-94c6-ad8f16ad4f05-config\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.197566 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331e05d9-b918-47df-94c6-ad8f16ad4f05-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.197690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/331e05d9-b918-47df-94c6-ad8f16ad4f05-scripts\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.197744 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrgq4\" (UniqueName: \"kubernetes.io/projected/331e05d9-b918-47df-94c6-ad8f16ad4f05-kube-api-access-wrgq4\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" 
Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.197888 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/331e05d9-b918-47df-94c6-ad8f16ad4f05-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.198671 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/331e05d9-b918-47df-94c6-ad8f16ad4f05-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.198977 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/331e05d9-b918-47df-94c6-ad8f16ad4f05-scripts\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.199161 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331e05d9-b918-47df-94c6-ad8f16ad4f05-config\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.209285 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331e05d9-b918-47df-94c6-ad8f16ad4f05-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.220269 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrgq4\" (UniqueName: \"kubernetes.io/projected/331e05d9-b918-47df-94c6-ad8f16ad4f05-kube-api-access-wrgq4\") pod \"ovn-northd-0\" 
(UID: \"331e05d9-b918-47df-94c6-ad8f16ad4f05\") " pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.335885 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 08:44:54 crc kubenswrapper[4971]: W0320 08:44:54.618425 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod331e05d9_b918_47df_94c6_ad8f16ad4f05.slice/crio-e3727be73c411acd1db99f914e9d1975f129d7d3c23bddbbece9cc6bbd5aea15 WatchSource:0}: Error finding container e3727be73c411acd1db99f914e9d1975f129d7d3c23bddbbece9cc6bbd5aea15: Status 404 returned error can't find the container with id e3727be73c411acd1db99f914e9d1975f129d7d3c23bddbbece9cc6bbd5aea15 Mar 20 08:44:54 crc kubenswrapper[4971]: I0320 08:44:54.637800 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:44:55 crc kubenswrapper[4971]: I0320 08:44:55.169748 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"331e05d9-b918-47df-94c6-ad8f16ad4f05","Type":"ContainerStarted","Data":"e3727be73c411acd1db99f914e9d1975f129d7d3c23bddbbece9cc6bbd5aea15"} Mar 20 08:44:56 crc kubenswrapper[4971]: I0320 08:44:56.181308 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"331e05d9-b918-47df-94c6-ad8f16ad4f05","Type":"ContainerStarted","Data":"248e7875df72103267e9785b8e586e267984a943a44e58310d4e8532970b09b2"} Mar 20 08:44:56 crc kubenswrapper[4971]: I0320 08:44:56.181813 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"331e05d9-b918-47df-94c6-ad8f16ad4f05","Type":"ContainerStarted","Data":"207fe2c49ddd0961eb33e94eb851a642aff3564017fea371f5cf48ebc66747b3"} Mar 20 08:44:56 crc kubenswrapper[4971]: I0320 08:44:56.181848 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 
20 08:44:56 crc kubenswrapper[4971]: I0320 08:44:56.207862 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.572913476 podStartE2EDuration="3.207829868s" podCreationTimestamp="2026-03-20 08:44:53 +0000 UTC" firstStartedPulling="2026-03-20 08:44:54.62652843 +0000 UTC m=+6916.606402568" lastFinishedPulling="2026-03-20 08:44:55.261444792 +0000 UTC m=+6917.241318960" observedRunningTime="2026-03-20 08:44:56.199384877 +0000 UTC m=+6918.179259035" watchObservedRunningTime="2026-03-20 08:44:56.207829868 +0000 UTC m=+6918.187704026" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.146707 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl"] Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.151534 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.155561 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.156516 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.158965 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl"] Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.317291 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8846ab1-f8f1-4ed6-a554-fa0741899c59-secret-volume\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.317355 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxfqm\" (UniqueName: \"kubernetes.io/projected/a8846ab1-f8f1-4ed6-a554-fa0741899c59-kube-api-access-hxfqm\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.317722 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8846ab1-f8f1-4ed6-a554-fa0741899c59-config-volume\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.419258 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8846ab1-f8f1-4ed6-a554-fa0741899c59-config-volume\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.419336 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8846ab1-f8f1-4ed6-a554-fa0741899c59-secret-volume\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.419379 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxfqm\" (UniqueName: 
\"kubernetes.io/projected/a8846ab1-f8f1-4ed6-a554-fa0741899c59-kube-api-access-hxfqm\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.421678 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8846ab1-f8f1-4ed6-a554-fa0741899c59-config-volume\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.431444 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8846ab1-f8f1-4ed6-a554-fa0741899c59-secret-volume\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.454429 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxfqm\" (UniqueName: \"kubernetes.io/projected/a8846ab1-f8f1-4ed6-a554-fa0741899c59-kube-api-access-hxfqm\") pod \"collect-profiles-29566605-k5mzl\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.512077 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:00 crc kubenswrapper[4971]: I0320 08:45:00.970811 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl"] Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.256417 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" event={"ID":"a8846ab1-f8f1-4ed6-a554-fa0741899c59","Type":"ContainerStarted","Data":"90dce88a8b6f91ac1ba18dab79d45b75502eabace1ac3b1b5f8df2d30b2781f5"} Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.256869 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" event={"ID":"a8846ab1-f8f1-4ed6-a554-fa0741899c59","Type":"ContainerStarted","Data":"19830853c00e3e34fa60756beb200f23b090e84a0928a8c095a7cdbdc7d08bfc"} Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.287278 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" podStartSLOduration=1.287248937 podStartE2EDuration="1.287248937s" podCreationTimestamp="2026-03-20 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:01.274535154 +0000 UTC m=+6923.254409302" watchObservedRunningTime="2026-03-20 08:45:01.287248937 +0000 UTC m=+6923.267123095" Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.763515 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bx4dr"] Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.764592 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.810920 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bx4dr"] Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.848631 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a2c9c34-5d1e-4f21-8c05-860986eba258-operator-scripts\") pod \"keystone-db-create-bx4dr\" (UID: \"2a2c9c34-5d1e-4f21-8c05-860986eba258\") " pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.848676 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ggmg\" (UniqueName: \"kubernetes.io/projected/2a2c9c34-5d1e-4f21-8c05-860986eba258-kube-api-access-7ggmg\") pod \"keystone-db-create-bx4dr\" (UID: \"2a2c9c34-5d1e-4f21-8c05-860986eba258\") " pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.950192 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a2c9c34-5d1e-4f21-8c05-860986eba258-operator-scripts\") pod \"keystone-db-create-bx4dr\" (UID: \"2a2c9c34-5d1e-4f21-8c05-860986eba258\") " pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.950494 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ggmg\" (UniqueName: \"kubernetes.io/projected/2a2c9c34-5d1e-4f21-8c05-860986eba258-kube-api-access-7ggmg\") pod \"keystone-db-create-bx4dr\" (UID: \"2a2c9c34-5d1e-4f21-8c05-860986eba258\") " pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.951109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2a2c9c34-5d1e-4f21-8c05-860986eba258-operator-scripts\") pod \"keystone-db-create-bx4dr\" (UID: \"2a2c9c34-5d1e-4f21-8c05-860986eba258\") " pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.970422 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ggmg\" (UniqueName: \"kubernetes.io/projected/2a2c9c34-5d1e-4f21-8c05-860986eba258-kube-api-access-7ggmg\") pod \"keystone-db-create-bx4dr\" (UID: \"2a2c9c34-5d1e-4f21-8c05-860986eba258\") " pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.991933 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-67e9-account-create-update-pdh5p"] Mar 20 08:45:01 crc kubenswrapper[4971]: I0320 08:45:01.997885 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.002020 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.005097 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67e9-account-create-update-pdh5p"] Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.081320 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.154432 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1e1c936-56dd-4723-8473-f8231f4e8197-operator-scripts\") pod \"keystone-67e9-account-create-update-pdh5p\" (UID: \"c1e1c936-56dd-4723-8473-f8231f4e8197\") " pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.154747 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh7cz\" (UniqueName: \"kubernetes.io/projected/c1e1c936-56dd-4723-8473-f8231f4e8197-kube-api-access-nh7cz\") pod \"keystone-67e9-account-create-update-pdh5p\" (UID: \"c1e1c936-56dd-4723-8473-f8231f4e8197\") " pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.261161 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh7cz\" (UniqueName: \"kubernetes.io/projected/c1e1c936-56dd-4723-8473-f8231f4e8197-kube-api-access-nh7cz\") pod \"keystone-67e9-account-create-update-pdh5p\" (UID: \"c1e1c936-56dd-4723-8473-f8231f4e8197\") " pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.261251 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1e1c936-56dd-4723-8473-f8231f4e8197-operator-scripts\") pod \"keystone-67e9-account-create-update-pdh5p\" (UID: \"c1e1c936-56dd-4723-8473-f8231f4e8197\") " pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.262182 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c1e1c936-56dd-4723-8473-f8231f4e8197-operator-scripts\") pod \"keystone-67e9-account-create-update-pdh5p\" (UID: \"c1e1c936-56dd-4723-8473-f8231f4e8197\") " pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.274509 4971 generic.go:334] "Generic (PLEG): container finished" podID="a8846ab1-f8f1-4ed6-a554-fa0741899c59" containerID="90dce88a8b6f91ac1ba18dab79d45b75502eabace1ac3b1b5f8df2d30b2781f5" exitCode=0 Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.274928 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" event={"ID":"a8846ab1-f8f1-4ed6-a554-fa0741899c59","Type":"ContainerDied","Data":"90dce88a8b6f91ac1ba18dab79d45b75502eabace1ac3b1b5f8df2d30b2781f5"} Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.282524 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh7cz\" (UniqueName: \"kubernetes.io/projected/c1e1c936-56dd-4723-8473-f8231f4e8197-kube-api-access-nh7cz\") pod \"keystone-67e9-account-create-update-pdh5p\" (UID: \"c1e1c936-56dd-4723-8473-f8231f4e8197\") " pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.321069 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.553402 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bx4dr"] Mar 20 08:45:02 crc kubenswrapper[4971]: W0320 08:45:02.556762 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a2c9c34_5d1e_4f21_8c05_860986eba258.slice/crio-d5fd56c4ac0494d5d39a09292c5fc49920d69d6ddb31670ed2df27770a654cbb WatchSource:0}: Error finding container d5fd56c4ac0494d5d39a09292c5fc49920d69d6ddb31670ed2df27770a654cbb: Status 404 returned error can't find the container with id d5fd56c4ac0494d5d39a09292c5fc49920d69d6ddb31670ed2df27770a654cbb Mar 20 08:45:02 crc kubenswrapper[4971]: I0320 08:45:02.752097 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67e9-account-create-update-pdh5p"] Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.286329 4971 generic.go:334] "Generic (PLEG): container finished" podID="c1e1c936-56dd-4723-8473-f8231f4e8197" containerID="ef03885fd1d4938fa069ac2a963ba0fccac3cc24217902a1e698016ae0993951" exitCode=0 Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.286573 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67e9-account-create-update-pdh5p" event={"ID":"c1e1c936-56dd-4723-8473-f8231f4e8197","Type":"ContainerDied","Data":"ef03885fd1d4938fa069ac2a963ba0fccac3cc24217902a1e698016ae0993951"} Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.287537 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67e9-account-create-update-pdh5p" event={"ID":"c1e1c936-56dd-4723-8473-f8231f4e8197","Type":"ContainerStarted","Data":"761b1780932d03657348b69f8bcb2b5abd921016ea9df4110c7200d6dd204546"} Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.290498 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="2a2c9c34-5d1e-4f21-8c05-860986eba258" containerID="dc313ff17c9a4367cd3556d879562db7489e7c5c358c71204de32720086e5b25" exitCode=0 Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.290581 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bx4dr" event={"ID":"2a2c9c34-5d1e-4f21-8c05-860986eba258","Type":"ContainerDied","Data":"dc313ff17c9a4367cd3556d879562db7489e7c5c358c71204de32720086e5b25"} Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.290692 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bx4dr" event={"ID":"2a2c9c34-5d1e-4f21-8c05-860986eba258","Type":"ContainerStarted","Data":"d5fd56c4ac0494d5d39a09292c5fc49920d69d6ddb31670ed2df27770a654cbb"} Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.731850 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.886309 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8846ab1-f8f1-4ed6-a554-fa0741899c59-config-volume\") pod \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.886433 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxfqm\" (UniqueName: \"kubernetes.io/projected/a8846ab1-f8f1-4ed6-a554-fa0741899c59-kube-api-access-hxfqm\") pod \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\" (UID: \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.886470 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8846ab1-f8f1-4ed6-a554-fa0741899c59-secret-volume\") pod \"a8846ab1-f8f1-4ed6-a554-fa0741899c59\" (UID: 
\"a8846ab1-f8f1-4ed6-a554-fa0741899c59\") " Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.887644 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8846ab1-f8f1-4ed6-a554-fa0741899c59-config-volume" (OuterVolumeSpecName: "config-volume") pod "a8846ab1-f8f1-4ed6-a554-fa0741899c59" (UID: "a8846ab1-f8f1-4ed6-a554-fa0741899c59"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.892360 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8846ab1-f8f1-4ed6-a554-fa0741899c59-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a8846ab1-f8f1-4ed6-a554-fa0741899c59" (UID: "a8846ab1-f8f1-4ed6-a554-fa0741899c59"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.895390 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8846ab1-f8f1-4ed6-a554-fa0741899c59-kube-api-access-hxfqm" (OuterVolumeSpecName: "kube-api-access-hxfqm") pod "a8846ab1-f8f1-4ed6-a554-fa0741899c59" (UID: "a8846ab1-f8f1-4ed6-a554-fa0741899c59"). InnerVolumeSpecName "kube-api-access-hxfqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.988660 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxfqm\" (UniqueName: \"kubernetes.io/projected/a8846ab1-f8f1-4ed6-a554-fa0741899c59-kube-api-access-hxfqm\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.988695 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8846ab1-f8f1-4ed6-a554-fa0741899c59-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:03 crc kubenswrapper[4971]: I0320 08:45:03.988707 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8846ab1-f8f1-4ed6-a554-fa0741899c59-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.304036 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" event={"ID":"a8846ab1-f8f1-4ed6-a554-fa0741899c59","Type":"ContainerDied","Data":"19830853c00e3e34fa60756beb200f23b090e84a0928a8c095a7cdbdc7d08bfc"} Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.307137 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19830853c00e3e34fa60756beb200f23b090e84a0928a8c095a7cdbdc7d08bfc" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.304197 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.378357 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b"] Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.388065 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566560-qzp8b"] Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.741358 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7121fb5-1631-4257-8aa2-4e732578fc6f" path="/var/lib/kubelet/pods/a7121fb5-1631-4257-8aa2-4e732578fc6f/volumes" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.813805 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.818924 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.904160 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a2c9c34-5d1e-4f21-8c05-860986eba258-operator-scripts\") pod \"2a2c9c34-5d1e-4f21-8c05-860986eba258\" (UID: \"2a2c9c34-5d1e-4f21-8c05-860986eba258\") " Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.904234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ggmg\" (UniqueName: \"kubernetes.io/projected/2a2c9c34-5d1e-4f21-8c05-860986eba258-kube-api-access-7ggmg\") pod \"2a2c9c34-5d1e-4f21-8c05-860986eba258\" (UID: \"2a2c9c34-5d1e-4f21-8c05-860986eba258\") " Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.904280 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh7cz\" (UniqueName: \"kubernetes.io/projected/c1e1c936-56dd-4723-8473-f8231f4e8197-kube-api-access-nh7cz\") pod \"c1e1c936-56dd-4723-8473-f8231f4e8197\" (UID: \"c1e1c936-56dd-4723-8473-f8231f4e8197\") " Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.904355 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1e1c936-56dd-4723-8473-f8231f4e8197-operator-scripts\") pod \"c1e1c936-56dd-4723-8473-f8231f4e8197\" (UID: \"c1e1c936-56dd-4723-8473-f8231f4e8197\") " Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.905135 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e1c936-56dd-4723-8473-f8231f4e8197-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1e1c936-56dd-4723-8473-f8231f4e8197" (UID: "c1e1c936-56dd-4723-8473-f8231f4e8197"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.905258 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a2c9c34-5d1e-4f21-8c05-860986eba258-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a2c9c34-5d1e-4f21-8c05-860986eba258" (UID: "2a2c9c34-5d1e-4f21-8c05-860986eba258"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.910843 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e1c936-56dd-4723-8473-f8231f4e8197-kube-api-access-nh7cz" (OuterVolumeSpecName: "kube-api-access-nh7cz") pod "c1e1c936-56dd-4723-8473-f8231f4e8197" (UID: "c1e1c936-56dd-4723-8473-f8231f4e8197"). InnerVolumeSpecName "kube-api-access-nh7cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:04 crc kubenswrapper[4971]: I0320 08:45:04.913470 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a2c9c34-5d1e-4f21-8c05-860986eba258-kube-api-access-7ggmg" (OuterVolumeSpecName: "kube-api-access-7ggmg") pod "2a2c9c34-5d1e-4f21-8c05-860986eba258" (UID: "2a2c9c34-5d1e-4f21-8c05-860986eba258"). InnerVolumeSpecName "kube-api-access-7ggmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.005887 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a2c9c34-5d1e-4f21-8c05-860986eba258-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.006182 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ggmg\" (UniqueName: \"kubernetes.io/projected/2a2c9c34-5d1e-4f21-8c05-860986eba258-kube-api-access-7ggmg\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.006193 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh7cz\" (UniqueName: \"kubernetes.io/projected/c1e1c936-56dd-4723-8473-f8231f4e8197-kube-api-access-nh7cz\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.006204 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1e1c936-56dd-4723-8473-f8231f4e8197-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.315733 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67e9-account-create-update-pdh5p" event={"ID":"c1e1c936-56dd-4723-8473-f8231f4e8197","Type":"ContainerDied","Data":"761b1780932d03657348b69f8bcb2b5abd921016ea9df4110c7200d6dd204546"} Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.315783 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="761b1780932d03657348b69f8bcb2b5abd921016ea9df4110c7200d6dd204546" Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.315850 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-67e9-account-create-update-pdh5p" Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.319561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bx4dr" event={"ID":"2a2c9c34-5d1e-4f21-8c05-860986eba258","Type":"ContainerDied","Data":"d5fd56c4ac0494d5d39a09292c5fc49920d69d6ddb31670ed2df27770a654cbb"} Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.319666 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bx4dr" Mar 20 08:45:05 crc kubenswrapper[4971]: I0320 08:45:05.319674 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5fd56c4ac0494d5d39a09292c5fc49920d69d6ddb31670ed2df27770a654cbb" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.467147 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-v6d2f"] Mar 20 08:45:07 crc kubenswrapper[4971]: E0320 08:45:07.468009 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e1c936-56dd-4723-8473-f8231f4e8197" containerName="mariadb-account-create-update" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.468026 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e1c936-56dd-4723-8473-f8231f4e8197" containerName="mariadb-account-create-update" Mar 20 08:45:07 crc kubenswrapper[4971]: E0320 08:45:07.468045 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8846ab1-f8f1-4ed6-a554-fa0741899c59" containerName="collect-profiles" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.468053 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8846ab1-f8f1-4ed6-a554-fa0741899c59" containerName="collect-profiles" Mar 20 08:45:07 crc kubenswrapper[4971]: E0320 08:45:07.468073 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2c9c34-5d1e-4f21-8c05-860986eba258" containerName="mariadb-database-create" Mar 20 
08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.468081 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2c9c34-5d1e-4f21-8c05-860986eba258" containerName="mariadb-database-create" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.468259 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a2c9c34-5d1e-4f21-8c05-860986eba258" containerName="mariadb-database-create" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.468282 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8846ab1-f8f1-4ed6-a554-fa0741899c59" containerName="collect-profiles" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.468305 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e1c936-56dd-4723-8473-f8231f4e8197" containerName="mariadb-account-create-update" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.468919 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.471949 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdcjr" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.472013 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.472014 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.480546 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v6d2f"] Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.484261 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.547395 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-config-data\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.547560 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrnds\" (UniqueName: \"kubernetes.io/projected/7d9e8117-f465-491d-8e98-0f9d1509bb8e-kube-api-access-rrnds\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.547600 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-combined-ca-bundle\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.649109 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-combined-ca-bundle\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.649204 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-config-data\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.649315 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrnds\" (UniqueName: 
\"kubernetes.io/projected/7d9e8117-f465-491d-8e98-0f9d1509bb8e-kube-api-access-rrnds\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.659418 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-config-data\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.659502 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-combined-ca-bundle\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.665271 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrnds\" (UniqueName: \"kubernetes.io/projected/7d9e8117-f465-491d-8e98-0f9d1509bb8e-kube-api-access-rrnds\") pod \"keystone-db-sync-v6d2f\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:07 crc kubenswrapper[4971]: I0320 08:45:07.799780 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:08 crc kubenswrapper[4971]: I0320 08:45:08.299475 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v6d2f"] Mar 20 08:45:08 crc kubenswrapper[4971]: W0320 08:45:08.303883 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d9e8117_f465_491d_8e98_0f9d1509bb8e.slice/crio-2db9d8b7a55cb827c314be40b6f843192c285f0b1fc4fef4604f5cf1da0d88d2 WatchSource:0}: Error finding container 2db9d8b7a55cb827c314be40b6f843192c285f0b1fc4fef4604f5cf1da0d88d2: Status 404 returned error can't find the container with id 2db9d8b7a55cb827c314be40b6f843192c285f0b1fc4fef4604f5cf1da0d88d2 Mar 20 08:45:08 crc kubenswrapper[4971]: I0320 08:45:08.307736 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:45:08 crc kubenswrapper[4971]: I0320 08:45:08.348082 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6d2f" event={"ID":"7d9e8117-f465-491d-8e98-0f9d1509bb8e","Type":"ContainerStarted","Data":"2db9d8b7a55cb827c314be40b6f843192c285f0b1fc4fef4604f5cf1da0d88d2"} Mar 20 08:45:08 crc kubenswrapper[4971]: I0320 08:45:08.492496 4971 scope.go:117] "RemoveContainer" containerID="44cb92331bdc2482cf210d940813ee155283805d84c34d9758ce7de0b24e6a8c" Mar 20 08:45:14 crc kubenswrapper[4971]: I0320 08:45:14.394975 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6d2f" event={"ID":"7d9e8117-f465-491d-8e98-0f9d1509bb8e","Type":"ContainerStarted","Data":"1f1ca8928af817ffa9a9524502a479b6d6015028a928d4269b7ac876f621625f"} Mar 20 08:45:14 crc kubenswrapper[4971]: I0320 08:45:14.426019 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-v6d2f" podStartSLOduration=2.046359682 podStartE2EDuration="7.42599779s" podCreationTimestamp="2026-03-20 
08:45:07 +0000 UTC" firstStartedPulling="2026-03-20 08:45:08.307432621 +0000 UTC m=+6930.287306769" lastFinishedPulling="2026-03-20 08:45:13.687070739 +0000 UTC m=+6935.666944877" observedRunningTime="2026-03-20 08:45:14.419686685 +0000 UTC m=+6936.399560853" watchObservedRunningTime="2026-03-20 08:45:14.42599779 +0000 UTC m=+6936.405871928" Mar 20 08:45:14 crc kubenswrapper[4971]: I0320 08:45:14.447743 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 08:45:15 crc kubenswrapper[4971]: I0320 08:45:15.407047 4971 generic.go:334] "Generic (PLEG): container finished" podID="7d9e8117-f465-491d-8e98-0f9d1509bb8e" containerID="1f1ca8928af817ffa9a9524502a479b6d6015028a928d4269b7ac876f621625f" exitCode=0 Mar 20 08:45:15 crc kubenswrapper[4971]: I0320 08:45:15.407167 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6d2f" event={"ID":"7d9e8117-f465-491d-8e98-0f9d1509bb8e","Type":"ContainerDied","Data":"1f1ca8928af817ffa9a9524502a479b6d6015028a928d4269b7ac876f621625f"} Mar 20 08:45:16 crc kubenswrapper[4971]: I0320 08:45:16.798023 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:16 crc kubenswrapper[4971]: I0320 08:45:16.906427 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-combined-ca-bundle\") pod \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " Mar 20 08:45:16 crc kubenswrapper[4971]: I0320 08:45:16.906576 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-config-data\") pod \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " Mar 20 08:45:16 crc kubenswrapper[4971]: I0320 08:45:16.906620 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrnds\" (UniqueName: \"kubernetes.io/projected/7d9e8117-f465-491d-8e98-0f9d1509bb8e-kube-api-access-rrnds\") pod \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\" (UID: \"7d9e8117-f465-491d-8e98-0f9d1509bb8e\") " Mar 20 08:45:16 crc kubenswrapper[4971]: I0320 08:45:16.912535 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9e8117-f465-491d-8e98-0f9d1509bb8e-kube-api-access-rrnds" (OuterVolumeSpecName: "kube-api-access-rrnds") pod "7d9e8117-f465-491d-8e98-0f9d1509bb8e" (UID: "7d9e8117-f465-491d-8e98-0f9d1509bb8e"). InnerVolumeSpecName "kube-api-access-rrnds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:16 crc kubenswrapper[4971]: I0320 08:45:16.929311 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d9e8117-f465-491d-8e98-0f9d1509bb8e" (UID: "7d9e8117-f465-491d-8e98-0f9d1509bb8e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:16 crc kubenswrapper[4971]: I0320 08:45:16.944369 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-config-data" (OuterVolumeSpecName: "config-data") pod "7d9e8117-f465-491d-8e98-0f9d1509bb8e" (UID: "7d9e8117-f465-491d-8e98-0f9d1509bb8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.008320 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.008388 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d9e8117-f465-491d-8e98-0f9d1509bb8e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.008407 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrnds\" (UniqueName: \"kubernetes.io/projected/7d9e8117-f465-491d-8e98-0f9d1509bb8e-kube-api-access-rrnds\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.428414 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6d2f" event={"ID":"7d9e8117-f465-491d-8e98-0f9d1509bb8e","Type":"ContainerDied","Data":"2db9d8b7a55cb827c314be40b6f843192c285f0b1fc4fef4604f5cf1da0d88d2"} Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.428480 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2db9d8b7a55cb827c314be40b6f843192c285f0b1fc4fef4604f5cf1da0d88d2" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.428522 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6d2f" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.727321 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b54488967-zlmqp"] Mar 20 08:45:17 crc kubenswrapper[4971]: E0320 08:45:17.728040 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9e8117-f465-491d-8e98-0f9d1509bb8e" containerName="keystone-db-sync" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.728061 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9e8117-f465-491d-8e98-0f9d1509bb8e" containerName="keystone-db-sync" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.728277 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9e8117-f465-491d-8e98-0f9d1509bb8e" containerName="keystone-db-sync" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.729371 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.734118 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xl96r"] Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.735122 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.742232 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.742434 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdcjr" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.742584 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.742718 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.742785 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.745976 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xl96r"] Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.751938 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b54488967-zlmqp"] Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823187 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-config-data\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823234 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " 
pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823283 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-credential-keys\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823388 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823466 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-dns-svc\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823555 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-fernet-keys\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823575 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb9ps\" (UniqueName: \"kubernetes.io/projected/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-kube-api-access-kb9ps\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: 
\"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823721 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-config\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823793 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-scripts\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823903 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xwn7\" (UniqueName: \"kubernetes.io/projected/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-kube-api-access-7xwn7\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.823936 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-combined-ca-bundle\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925127 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xwn7\" (UniqueName: \"kubernetes.io/projected/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-kube-api-access-7xwn7\") pod \"keystone-bootstrap-xl96r\" (UID: 
\"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925196 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-combined-ca-bundle\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925249 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-config-data\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925274 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925334 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-credential-keys\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925361 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " 
pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925389 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-dns-svc\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925439 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb9ps\" (UniqueName: \"kubernetes.io/projected/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-kube-api-access-kb9ps\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925466 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-fernet-keys\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-config\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.925566 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-scripts\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.927047 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-dns-svc\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.927059 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.929272 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-config\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.929485 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.932577 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-credential-keys\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.932984 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-config-data\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.937629 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-fernet-keys\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.947300 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-combined-ca-bundle\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.947684 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-scripts\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.949157 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xwn7\" (UniqueName: \"kubernetes.io/projected/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-kube-api-access-7xwn7\") pod \"keystone-bootstrap-xl96r\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:17 crc kubenswrapper[4971]: I0320 08:45:17.954334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb9ps\" (UniqueName: \"kubernetes.io/projected/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-kube-api-access-kb9ps\") pod \"dnsmasq-dns-7b54488967-zlmqp\" (UID: 
\"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.102444 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.113333 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.552264 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zj4rj"] Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.558009 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.564461 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zj4rj"] Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.609414 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xl96r"] Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.637128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-catalog-content\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.637301 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-utilities\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 
08:45:18.637590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbp8\" (UniqueName: \"kubernetes.io/projected/73a93a9f-b8fe-44bd-9840-09339c326e93-kube-api-access-zwbp8\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.668047 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b54488967-zlmqp"] Mar 20 08:45:18 crc kubenswrapper[4971]: W0320 08:45:18.672498 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1b846cc_f217_4bf2_a2ad_aaa721b8480d.slice/crio-ae0b7b4b87fa6402443fcdd892b9fb0f02778a8c89b331783c3a1525ca971438 WatchSource:0}: Error finding container ae0b7b4b87fa6402443fcdd892b9fb0f02778a8c89b331783c3a1525ca971438: Status 404 returned error can't find the container with id ae0b7b4b87fa6402443fcdd892b9fb0f02778a8c89b331783c3a1525ca971438 Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.739486 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-catalog-content\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.739528 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-utilities\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.739571 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zwbp8\" (UniqueName: \"kubernetes.io/projected/73a93a9f-b8fe-44bd-9840-09339c326e93-kube-api-access-zwbp8\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.740168 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-catalog-content\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.740801 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-utilities\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.761817 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbp8\" (UniqueName: \"kubernetes.io/projected/73a93a9f-b8fe-44bd-9840-09339c326e93-kube-api-access-zwbp8\") pod \"redhat-operators-zj4rj\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:18 crc kubenswrapper[4971]: I0320 08:45:18.880822 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:19 crc kubenswrapper[4971]: I0320 08:45:19.359016 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zj4rj"] Mar 20 08:45:19 crc kubenswrapper[4971]: I0320 08:45:19.446130 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj4rj" event={"ID":"73a93a9f-b8fe-44bd-9840-09339c326e93","Type":"ContainerStarted","Data":"34f4a108bd754416a14d04f9a9d82278b0f72f03a07cef4e16a886ceea5d4ebe"} Mar 20 08:45:19 crc kubenswrapper[4971]: I0320 08:45:19.448787 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xl96r" event={"ID":"e94ca33f-0afb-4d43-954c-6ff16f84a9a1","Type":"ContainerStarted","Data":"b2d012c332ffad8e91b3e921fa80377d0620c04da62ce6058386366fd408deba"} Mar 20 08:45:19 crc kubenswrapper[4971]: I0320 08:45:19.448896 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xl96r" event={"ID":"e94ca33f-0afb-4d43-954c-6ff16f84a9a1","Type":"ContainerStarted","Data":"7f7647bb5832907ca788a87146cbdede08dd69fe2f51e42140d09517a09b9e37"} Mar 20 08:45:19 crc kubenswrapper[4971]: I0320 08:45:19.451142 4971 generic.go:334] "Generic (PLEG): container finished" podID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerID="73bfc543c6cba52262ea1124324c6c8eceb32bd6201dc4c8dcbbc4e410c2b49c" exitCode=0 Mar 20 08:45:19 crc kubenswrapper[4971]: I0320 08:45:19.451194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" event={"ID":"a1b846cc-f217-4bf2-a2ad-aaa721b8480d","Type":"ContainerDied","Data":"73bfc543c6cba52262ea1124324c6c8eceb32bd6201dc4c8dcbbc4e410c2b49c"} Mar 20 08:45:19 crc kubenswrapper[4971]: I0320 08:45:19.451227 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" 
event={"ID":"a1b846cc-f217-4bf2-a2ad-aaa721b8480d","Type":"ContainerStarted","Data":"ae0b7b4b87fa6402443fcdd892b9fb0f02778a8c89b331783c3a1525ca971438"} Mar 20 08:45:19 crc kubenswrapper[4971]: I0320 08:45:19.509134 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xl96r" podStartSLOduration=2.509106154 podStartE2EDuration="2.509106154s" podCreationTimestamp="2026-03-20 08:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:19.481589355 +0000 UTC m=+6941.461463503" watchObservedRunningTime="2026-03-20 08:45:19.509106154 +0000 UTC m=+6941.488980292" Mar 20 08:45:20 crc kubenswrapper[4971]: I0320 08:45:20.461991 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" event={"ID":"a1b846cc-f217-4bf2-a2ad-aaa721b8480d","Type":"ContainerStarted","Data":"a7d5495118bd04c96bc8f5e3374e63550c006d70eda39ab9e7431d15190146f8"} Mar 20 08:45:20 crc kubenswrapper[4971]: I0320 08:45:20.462349 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:20 crc kubenswrapper[4971]: I0320 08:45:20.464126 4971 generic.go:334] "Generic (PLEG): container finished" podID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerID="65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977" exitCode=0 Mar 20 08:45:20 crc kubenswrapper[4971]: I0320 08:45:20.464233 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj4rj" event={"ID":"73a93a9f-b8fe-44bd-9840-09339c326e93","Type":"ContainerDied","Data":"65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977"} Mar 20 08:45:20 crc kubenswrapper[4971]: I0320 08:45:20.493360 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" podStartSLOduration=3.49333721 
podStartE2EDuration="3.49333721s" podCreationTimestamp="2026-03-20 08:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:20.489083179 +0000 UTC m=+6942.468957307" watchObservedRunningTime="2026-03-20 08:45:20.49333721 +0000 UTC m=+6942.473211358" Mar 20 08:45:21 crc kubenswrapper[4971]: I0320 08:45:21.475094 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj4rj" event={"ID":"73a93a9f-b8fe-44bd-9840-09339c326e93","Type":"ContainerStarted","Data":"5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b"} Mar 20 08:45:22 crc kubenswrapper[4971]: I0320 08:45:22.483631 4971 generic.go:334] "Generic (PLEG): container finished" podID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerID="5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b" exitCode=0 Mar 20 08:45:22 crc kubenswrapper[4971]: I0320 08:45:22.483705 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj4rj" event={"ID":"73a93a9f-b8fe-44bd-9840-09339c326e93","Type":"ContainerDied","Data":"5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b"} Mar 20 08:45:22 crc kubenswrapper[4971]: I0320 08:45:22.485379 4971 generic.go:334] "Generic (PLEG): container finished" podID="e94ca33f-0afb-4d43-954c-6ff16f84a9a1" containerID="b2d012c332ffad8e91b3e921fa80377d0620c04da62ce6058386366fd408deba" exitCode=0 Mar 20 08:45:22 crc kubenswrapper[4971]: I0320 08:45:22.485401 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xl96r" event={"ID":"e94ca33f-0afb-4d43-954c-6ff16f84a9a1","Type":"ContainerDied","Data":"b2d012c332ffad8e91b3e921fa80377d0620c04da62ce6058386366fd408deba"} Mar 20 08:45:23 crc kubenswrapper[4971]: I0320 08:45:23.497176 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj4rj" 
event={"ID":"73a93a9f-b8fe-44bd-9840-09339c326e93","Type":"ContainerStarted","Data":"1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a"} Mar 20 08:45:23 crc kubenswrapper[4971]: I0320 08:45:23.522200 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zj4rj" podStartSLOduration=2.873968873 podStartE2EDuration="5.522183058s" podCreationTimestamp="2026-03-20 08:45:18 +0000 UTC" firstStartedPulling="2026-03-20 08:45:20.466317874 +0000 UTC m=+6942.446192002" lastFinishedPulling="2026-03-20 08:45:23.114532039 +0000 UTC m=+6945.094406187" observedRunningTime="2026-03-20 08:45:23.518468391 +0000 UTC m=+6945.498342569" watchObservedRunningTime="2026-03-20 08:45:23.522183058 +0000 UTC m=+6945.502057196" Mar 20 08:45:23 crc kubenswrapper[4971]: I0320 08:45:23.941197 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.021951 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-config-data\") pod \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.022102 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-combined-ca-bundle\") pod \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.022142 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-scripts\") pod \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\" (UID: 
\"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.022262 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-fernet-keys\") pod \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.022365 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xwn7\" (UniqueName: \"kubernetes.io/projected/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-kube-api-access-7xwn7\") pod \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.022404 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-credential-keys\") pod \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\" (UID: \"e94ca33f-0afb-4d43-954c-6ff16f84a9a1\") " Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.027910 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-kube-api-access-7xwn7" (OuterVolumeSpecName: "kube-api-access-7xwn7") pod "e94ca33f-0afb-4d43-954c-6ff16f84a9a1" (UID: "e94ca33f-0afb-4d43-954c-6ff16f84a9a1"). InnerVolumeSpecName "kube-api-access-7xwn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.028357 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-scripts" (OuterVolumeSpecName: "scripts") pod "e94ca33f-0afb-4d43-954c-6ff16f84a9a1" (UID: "e94ca33f-0afb-4d43-954c-6ff16f84a9a1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.028710 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e94ca33f-0afb-4d43-954c-6ff16f84a9a1" (UID: "e94ca33f-0afb-4d43-954c-6ff16f84a9a1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.035730 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e94ca33f-0afb-4d43-954c-6ff16f84a9a1" (UID: "e94ca33f-0afb-4d43-954c-6ff16f84a9a1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.049036 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-config-data" (OuterVolumeSpecName: "config-data") pod "e94ca33f-0afb-4d43-954c-6ff16f84a9a1" (UID: "e94ca33f-0afb-4d43-954c-6ff16f84a9a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.049243 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e94ca33f-0afb-4d43-954c-6ff16f84a9a1" (UID: "e94ca33f-0afb-4d43-954c-6ff16f84a9a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.124722 4971 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.124758 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.124771 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.124785 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.124799 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.124811 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xwn7\" (UniqueName: \"kubernetes.io/projected/e94ca33f-0afb-4d43-954c-6ff16f84a9a1-kube-api-access-7xwn7\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.512165 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xl96r" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.514093 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xl96r" event={"ID":"e94ca33f-0afb-4d43-954c-6ff16f84a9a1","Type":"ContainerDied","Data":"7f7647bb5832907ca788a87146cbdede08dd69fe2f51e42140d09517a09b9e37"} Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.514160 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f7647bb5832907ca788a87146cbdede08dd69fe2f51e42140d09517a09b9e37" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.591053 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xl96r"] Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.597177 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xl96r"] Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.691528 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7xt4j"] Mar 20 08:45:24 crc kubenswrapper[4971]: E0320 08:45:24.692062 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94ca33f-0afb-4d43-954c-6ff16f84a9a1" containerName="keystone-bootstrap" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.692138 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94ca33f-0afb-4d43-954c-6ff16f84a9a1" containerName="keystone-bootstrap" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.692455 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94ca33f-0afb-4d43-954c-6ff16f84a9a1" containerName="keystone-bootstrap" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.693053 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.694967 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.695021 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.694982 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.695379 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdcjr" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.700085 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7xt4j"] Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.727269 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.735019 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-fernet-keys\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.735229 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-credential-keys\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.735315 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-55p8h\" (UniqueName: \"kubernetes.io/projected/1322bd61-d156-4142-9a0c-023e76e87cd0-kube-api-access-55p8h\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.735433 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-combined-ca-bundle\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.735536 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-config-data\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.735658 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-scripts\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.741854 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94ca33f-0afb-4d43-954c-6ff16f84a9a1" path="/var/lib/kubelet/pods/e94ca33f-0afb-4d43-954c-6ff16f84a9a1/volumes" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.837091 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-credential-keys\") pod \"keystone-bootstrap-7xt4j\" (UID: 
\"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.837356 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55p8h\" (UniqueName: \"kubernetes.io/projected/1322bd61-d156-4142-9a0c-023e76e87cd0-kube-api-access-55p8h\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.837391 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-combined-ca-bundle\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.837420 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-config-data\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.837438 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-scripts\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.837498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-fernet-keys\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc 
kubenswrapper[4971]: I0320 08:45:24.841470 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-combined-ca-bundle\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.841968 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-credential-keys\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.843970 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-scripts\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.844586 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-config-data\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.853209 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-fernet-keys\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:24 crc kubenswrapper[4971]: I0320 08:45:24.865439 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55p8h\" (UniqueName: 
\"kubernetes.io/projected/1322bd61-d156-4142-9a0c-023e76e87cd0-kube-api-access-55p8h\") pod \"keystone-bootstrap-7xt4j\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:25 crc kubenswrapper[4971]: I0320 08:45:25.040669 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:25 crc kubenswrapper[4971]: I0320 08:45:25.477428 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7xt4j"] Mar 20 08:45:25 crc kubenswrapper[4971]: W0320 08:45:25.484011 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1322bd61_d156_4142_9a0c_023e76e87cd0.slice/crio-02886fe19d0fc8f7ed5ac6499858018ee26ab863491e10eb99e7455450ded274 WatchSource:0}: Error finding container 02886fe19d0fc8f7ed5ac6499858018ee26ab863491e10eb99e7455450ded274: Status 404 returned error can't find the container with id 02886fe19d0fc8f7ed5ac6499858018ee26ab863491e10eb99e7455450ded274 Mar 20 08:45:25 crc kubenswrapper[4971]: I0320 08:45:25.541773 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7xt4j" event={"ID":"1322bd61-d156-4142-9a0c-023e76e87cd0","Type":"ContainerStarted","Data":"02886fe19d0fc8f7ed5ac6499858018ee26ab863491e10eb99e7455450ded274"} Mar 20 08:45:26 crc kubenswrapper[4971]: I0320 08:45:26.553236 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7xt4j" event={"ID":"1322bd61-d156-4142-9a0c-023e76e87cd0","Type":"ContainerStarted","Data":"cf162ba3fd6d8b323310fc3121d8e69631133d3d4345971206f13063cd997bf8"} Mar 20 08:45:26 crc kubenswrapper[4971]: I0320 08:45:26.575483 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7xt4j" podStartSLOduration=2.575455296 podStartE2EDuration="2.575455296s" podCreationTimestamp="2026-03-20 
08:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:26.570129666 +0000 UTC m=+6948.550003844" watchObservedRunningTime="2026-03-20 08:45:26.575455296 +0000 UTC m=+6948.555329474" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.104771 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.196137 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db489d56c-jxfx5"] Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.196363 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" podUID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" containerName="dnsmasq-dns" containerID="cri-o://48e9c08e2cc299fd6646bdeae2420e3149dee56a25923ef2635e1d90643445c4" gracePeriod=10 Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.569737 4971 generic.go:334] "Generic (PLEG): container finished" podID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" containerID="48e9c08e2cc299fd6646bdeae2420e3149dee56a25923ef2635e1d90643445c4" exitCode=0 Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.569787 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" event={"ID":"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c","Type":"ContainerDied","Data":"48e9c08e2cc299fd6646bdeae2420e3149dee56a25923ef2635e1d90643445c4"} Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.677989 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.730775 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-config\") pod \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.730836 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-sb\") pod \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.730871 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5qtv\" (UniqueName: \"kubernetes.io/projected/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-kube-api-access-c5qtv\") pod \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.730893 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-dns-svc\") pod \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.730937 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-nb\") pod \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\" (UID: \"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c\") " Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.744484 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-kube-api-access-c5qtv" (OuterVolumeSpecName: "kube-api-access-c5qtv") pod "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" (UID: "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c"). InnerVolumeSpecName "kube-api-access-c5qtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.773125 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" (UID: "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.778211 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" (UID: "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.781647 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-config" (OuterVolumeSpecName: "config") pod "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" (UID: "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.793897 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" (UID: "efcb77d2-e456-4d89-8ef3-9a2445bd5e2c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.833032 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.833060 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.833069 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.833077 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5qtv\" (UniqueName: \"kubernetes.io/projected/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-kube-api-access-c5qtv\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.833088 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.881383 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:28 crc kubenswrapper[4971]: I0320 08:45:28.881420 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.579713 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" 
event={"ID":"efcb77d2-e456-4d89-8ef3-9a2445bd5e2c","Type":"ContainerDied","Data":"b1018ba4ff8caea1f09d4e8e2b9c8bde07989722fcd05800a6b46f11c3284932"} Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.579787 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db489d56c-jxfx5" Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.580034 4971 scope.go:117] "RemoveContainer" containerID="48e9c08e2cc299fd6646bdeae2420e3149dee56a25923ef2635e1d90643445c4" Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.583142 4971 generic.go:334] "Generic (PLEG): container finished" podID="1322bd61-d156-4142-9a0c-023e76e87cd0" containerID="cf162ba3fd6d8b323310fc3121d8e69631133d3d4345971206f13063cd997bf8" exitCode=0 Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.583199 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7xt4j" event={"ID":"1322bd61-d156-4142-9a0c-023e76e87cd0","Type":"ContainerDied","Data":"cf162ba3fd6d8b323310fc3121d8e69631133d3d4345971206f13063cd997bf8"} Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.612806 4971 scope.go:117] "RemoveContainer" containerID="7e6703c2cbb16d1ba5ede75f4f58d39149556f72188f5544d3c676d2e746b6ce" Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.641497 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db489d56c-jxfx5"] Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.653136 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5db489d56c-jxfx5"] Mar 20 08:45:29 crc kubenswrapper[4971]: I0320 08:45:29.939849 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zj4rj" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="registry-server" probeResult="failure" output=< Mar 20 08:45:29 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 08:45:29 crc 
kubenswrapper[4971]: > Mar 20 08:45:30 crc kubenswrapper[4971]: I0320 08:45:30.746833 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" path="/var/lib/kubelet/pods/efcb77d2-e456-4d89-8ef3-9a2445bd5e2c/volumes" Mar 20 08:45:30 crc kubenswrapper[4971]: I0320 08:45:30.983103 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.078466 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55p8h\" (UniqueName: \"kubernetes.io/projected/1322bd61-d156-4142-9a0c-023e76e87cd0-kube-api-access-55p8h\") pod \"1322bd61-d156-4142-9a0c-023e76e87cd0\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.078618 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-combined-ca-bundle\") pod \"1322bd61-d156-4142-9a0c-023e76e87cd0\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.078699 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-fernet-keys\") pod \"1322bd61-d156-4142-9a0c-023e76e87cd0\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.078738 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-scripts\") pod \"1322bd61-d156-4142-9a0c-023e76e87cd0\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.078757 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-credential-keys\") pod \"1322bd61-d156-4142-9a0c-023e76e87cd0\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.078772 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-config-data\") pod \"1322bd61-d156-4142-9a0c-023e76e87cd0\" (UID: \"1322bd61-d156-4142-9a0c-023e76e87cd0\") " Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.083591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1322bd61-d156-4142-9a0c-023e76e87cd0-kube-api-access-55p8h" (OuterVolumeSpecName: "kube-api-access-55p8h") pod "1322bd61-d156-4142-9a0c-023e76e87cd0" (UID: "1322bd61-d156-4142-9a0c-023e76e87cd0"). InnerVolumeSpecName "kube-api-access-55p8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.084146 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1322bd61-d156-4142-9a0c-023e76e87cd0" (UID: "1322bd61-d156-4142-9a0c-023e76e87cd0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.085235 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-scripts" (OuterVolumeSpecName: "scripts") pod "1322bd61-d156-4142-9a0c-023e76e87cd0" (UID: "1322bd61-d156-4142-9a0c-023e76e87cd0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.086016 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1322bd61-d156-4142-9a0c-023e76e87cd0" (UID: "1322bd61-d156-4142-9a0c-023e76e87cd0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.100623 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-config-data" (OuterVolumeSpecName: "config-data") pod "1322bd61-d156-4142-9a0c-023e76e87cd0" (UID: "1322bd61-d156-4142-9a0c-023e76e87cd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.105943 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1322bd61-d156-4142-9a0c-023e76e87cd0" (UID: "1322bd61-d156-4142-9a0c-023e76e87cd0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.181134 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55p8h\" (UniqueName: \"kubernetes.io/projected/1322bd61-d156-4142-9a0c-023e76e87cd0-kube-api-access-55p8h\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.181421 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.181467 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.181487 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.181504 4971 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.181521 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1322bd61-d156-4142-9a0c-023e76e87cd0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.602212 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7xt4j" event={"ID":"1322bd61-d156-4142-9a0c-023e76e87cd0","Type":"ContainerDied","Data":"02886fe19d0fc8f7ed5ac6499858018ee26ab863491e10eb99e7455450ded274"} Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 
08:45:31.602260 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02886fe19d0fc8f7ed5ac6499858018ee26ab863491e10eb99e7455450ded274" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.602282 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7xt4j" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.688299 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7948b77f69-j5269"] Mar 20 08:45:31 crc kubenswrapper[4971]: E0320 08:45:31.688756 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1322bd61-d156-4142-9a0c-023e76e87cd0" containerName="keystone-bootstrap" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.688781 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1322bd61-d156-4142-9a0c-023e76e87cd0" containerName="keystone-bootstrap" Mar 20 08:45:31 crc kubenswrapper[4971]: E0320 08:45:31.688798 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" containerName="dnsmasq-dns" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.688807 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" containerName="dnsmasq-dns" Mar 20 08:45:31 crc kubenswrapper[4971]: E0320 08:45:31.688830 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" containerName="init" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.688843 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" containerName="init" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.689063 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcb77d2-e456-4d89-8ef3-9a2445bd5e2c" containerName="dnsmasq-dns" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.689085 4971 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1322bd61-d156-4142-9a0c-023e76e87cd0" containerName="keystone-bootstrap" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.689780 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.697036 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.701890 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdcjr" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.701945 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.703301 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.703575 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7948b77f69-j5269"] Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.790852 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-credential-keys\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.791844 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-scripts\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.791910 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmkfv\" (UniqueName: \"kubernetes.io/projected/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-kube-api-access-zmkfv\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.791973 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-combined-ca-bundle\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.792026 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-config-data\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.792083 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-fernet-keys\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.893930 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-scripts\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.893984 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zmkfv\" (UniqueName: \"kubernetes.io/projected/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-kube-api-access-zmkfv\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.894022 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-combined-ca-bundle\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.894055 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-config-data\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.894089 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-fernet-keys\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.895022 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-credential-keys\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.899027 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-combined-ca-bundle\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.899273 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-scripts\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.900356 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-fernet-keys\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.904897 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-config-data\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.907160 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-credential-keys\") pod \"keystone-7948b77f69-j5269\" (UID: \"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:31 crc kubenswrapper[4971]: I0320 08:45:31.912009 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmkfv\" (UniqueName: \"kubernetes.io/projected/97c1ece7-9e4a-46a6-a86c-bdde47c1ab39-kube-api-access-zmkfv\") pod \"keystone-7948b77f69-j5269\" (UID: 
\"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39\") " pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:32 crc kubenswrapper[4971]: I0320 08:45:32.006208 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:32 crc kubenswrapper[4971]: I0320 08:45:32.443430 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7948b77f69-j5269"] Mar 20 08:45:32 crc kubenswrapper[4971]: W0320 08:45:32.445401 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97c1ece7_9e4a_46a6_a86c_bdde47c1ab39.slice/crio-b7d40723882b8b1efd6af909a52d6a1f945f88ee3f276821787b23b4ec6aae40 WatchSource:0}: Error finding container b7d40723882b8b1efd6af909a52d6a1f945f88ee3f276821787b23b4ec6aae40: Status 404 returned error can't find the container with id b7d40723882b8b1efd6af909a52d6a1f945f88ee3f276821787b23b4ec6aae40 Mar 20 08:45:32 crc kubenswrapper[4971]: I0320 08:45:32.610555 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7948b77f69-j5269" event={"ID":"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39","Type":"ContainerStarted","Data":"b7d40723882b8b1efd6af909a52d6a1f945f88ee3f276821787b23b4ec6aae40"} Mar 20 08:45:33 crc kubenswrapper[4971]: I0320 08:45:33.621918 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7948b77f69-j5269" event={"ID":"97c1ece7-9e4a-46a6-a86c-bdde47c1ab39","Type":"ContainerStarted","Data":"911e978371e42792cf0d894e624cb9d0c92c58d91582dada346464fee52d44b0"} Mar 20 08:45:33 crc kubenswrapper[4971]: I0320 08:45:33.622385 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:45:33 crc kubenswrapper[4971]: I0320 08:45:33.661073 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7948b77f69-j5269" podStartSLOduration=2.661047641 
podStartE2EDuration="2.661047641s" podCreationTimestamp="2026-03-20 08:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:33.650259379 +0000 UTC m=+6955.630133547" watchObservedRunningTime="2026-03-20 08:45:33.661047641 +0000 UTC m=+6955.640921819" Mar 20 08:45:38 crc kubenswrapper[4971]: I0320 08:45:38.964504 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:39 crc kubenswrapper[4971]: I0320 08:45:39.053526 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:39 crc kubenswrapper[4971]: I0320 08:45:39.212451 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zj4rj"] Mar 20 08:45:40 crc kubenswrapper[4971]: I0320 08:45:40.689037 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zj4rj" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="registry-server" containerID="cri-o://1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a" gracePeriod=2 Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.186154 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.295935 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwbp8\" (UniqueName: \"kubernetes.io/projected/73a93a9f-b8fe-44bd-9840-09339c326e93-kube-api-access-zwbp8\") pod \"73a93a9f-b8fe-44bd-9840-09339c326e93\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.296020 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-utilities\") pod \"73a93a9f-b8fe-44bd-9840-09339c326e93\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.296091 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-catalog-content\") pod \"73a93a9f-b8fe-44bd-9840-09339c326e93\" (UID: \"73a93a9f-b8fe-44bd-9840-09339c326e93\") " Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.297204 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-utilities" (OuterVolumeSpecName: "utilities") pod "73a93a9f-b8fe-44bd-9840-09339c326e93" (UID: "73a93a9f-b8fe-44bd-9840-09339c326e93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.310921 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a93a9f-b8fe-44bd-9840-09339c326e93-kube-api-access-zwbp8" (OuterVolumeSpecName: "kube-api-access-zwbp8") pod "73a93a9f-b8fe-44bd-9840-09339c326e93" (UID: "73a93a9f-b8fe-44bd-9840-09339c326e93"). InnerVolumeSpecName "kube-api-access-zwbp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.398973 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.399116 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwbp8\" (UniqueName: \"kubernetes.io/projected/73a93a9f-b8fe-44bd-9840-09339c326e93-kube-api-access-zwbp8\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.497856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73a93a9f-b8fe-44bd-9840-09339c326e93" (UID: "73a93a9f-b8fe-44bd-9840-09339c326e93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.500797 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a93a9f-b8fe-44bd-9840-09339c326e93-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.705751 4971 generic.go:334] "Generic (PLEG): container finished" podID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerID="1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a" exitCode=0 Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.705793 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zj4rj" event={"ID":"73a93a9f-b8fe-44bd-9840-09339c326e93","Type":"ContainerDied","Data":"1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a"} Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.705825 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zj4rj" event={"ID":"73a93a9f-b8fe-44bd-9840-09339c326e93","Type":"ContainerDied","Data":"34f4a108bd754416a14d04f9a9d82278b0f72f03a07cef4e16a886ceea5d4ebe"} Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.705842 4971 scope.go:117] "RemoveContainer" containerID="1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.705846 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zj4rj" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.728795 4971 scope.go:117] "RemoveContainer" containerID="5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.741271 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zj4rj"] Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.746929 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zj4rj"] Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.753977 4971 scope.go:117] "RemoveContainer" containerID="65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.798278 4971 scope.go:117] "RemoveContainer" containerID="1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a" Mar 20 08:45:41 crc kubenswrapper[4971]: E0320 08:45:41.798869 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a\": container with ID starting with 1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a not found: ID does not exist" containerID="1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.798968 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a"} err="failed to get container status \"1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a\": rpc error: code = NotFound desc = could not find container \"1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a\": container with ID starting with 1047aba86dd870ef7a99112f48432ab713103241ac2d4ad32f1fe8f483c82a2a not found: ID does not exist" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.799042 4971 scope.go:117] "RemoveContainer" containerID="5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b" Mar 20 08:45:41 crc kubenswrapper[4971]: E0320 08:45:41.799481 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b\": container with ID starting with 5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b not found: ID does not exist" containerID="5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.799587 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b"} err="failed to get container status \"5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b\": rpc error: code = NotFound desc = could not find container \"5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b\": container with ID starting with 5f3ac4545e5bc6cd7b62e17dcc2005a45fc6a6827f62fee9f1c56a18c049237b not found: ID does not exist" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.799673 4971 scope.go:117] "RemoveContainer" containerID="65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977" Mar 20 08:45:41 crc kubenswrapper[4971]: E0320 
08:45:41.800374 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977\": container with ID starting with 65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977 not found: ID does not exist" containerID="65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977" Mar 20 08:45:41 crc kubenswrapper[4971]: I0320 08:45:41.800520 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977"} err="failed to get container status \"65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977\": rpc error: code = NotFound desc = could not find container \"65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977\": container with ID starting with 65f1fd07e05f8bed8ca0da78686799388ab51848b4e77ae251a2d67a0b97a977 not found: ID does not exist" Mar 20 08:45:42 crc kubenswrapper[4971]: I0320 08:45:42.745139 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" path="/var/lib/kubelet/pods/73a93a9f-b8fe-44bd-9840-09339c326e93/volumes" Mar 20 08:45:50 crc kubenswrapper[4971]: I0320 08:45:50.162477 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:45:50 crc kubenswrapper[4971]: I0320 08:45:50.162972 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.151834 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566606-5k4mc"] Mar 20 08:46:00 crc kubenswrapper[4971]: E0320 08:46:00.153100 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="registry-server" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.153117 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="registry-server" Mar 20 08:46:00 crc kubenswrapper[4971]: E0320 08:46:00.153132 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="extract-content" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.153139 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="extract-content" Mar 20 08:46:00 crc kubenswrapper[4971]: E0320 08:46:00.153176 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="extract-utilities" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.153186 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="extract-utilities" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.153371 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a93a9f-b8fe-44bd-9840-09339c326e93" containerName="registry-server" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.153907 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566606-5k4mc" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.159478 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.159696 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.159827 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.171638 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566606-5k4mc"] Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.280715 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddwb7\" (UniqueName: \"kubernetes.io/projected/b13baff6-c4ad-4c29-bb53-d0931ca962b6-kube-api-access-ddwb7\") pod \"auto-csr-approver-29566606-5k4mc\" (UID: \"b13baff6-c4ad-4c29-bb53-d0931ca962b6\") " pod="openshift-infra/auto-csr-approver-29566606-5k4mc" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.383681 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddwb7\" (UniqueName: \"kubernetes.io/projected/b13baff6-c4ad-4c29-bb53-d0931ca962b6-kube-api-access-ddwb7\") pod \"auto-csr-approver-29566606-5k4mc\" (UID: \"b13baff6-c4ad-4c29-bb53-d0931ca962b6\") " pod="openshift-infra/auto-csr-approver-29566606-5k4mc" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.406502 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddwb7\" (UniqueName: \"kubernetes.io/projected/b13baff6-c4ad-4c29-bb53-d0931ca962b6-kube-api-access-ddwb7\") pod \"auto-csr-approver-29566606-5k4mc\" (UID: \"b13baff6-c4ad-4c29-bb53-d0931ca962b6\") " 
pod="openshift-infra/auto-csr-approver-29566606-5k4mc" Mar 20 08:46:00 crc kubenswrapper[4971]: I0320 08:46:00.480367 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566606-5k4mc" Mar 20 08:46:01 crc kubenswrapper[4971]: I0320 08:46:01.037939 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566606-5k4mc"] Mar 20 08:46:01 crc kubenswrapper[4971]: I0320 08:46:01.903692 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566606-5k4mc" event={"ID":"b13baff6-c4ad-4c29-bb53-d0931ca962b6","Type":"ContainerStarted","Data":"cf7820ed8e891d19719a01c31ebcf4b04fb66455bbed7449effa97cb362b0e5c"} Mar 20 08:46:02 crc kubenswrapper[4971]: I0320 08:46:02.913043 4971 generic.go:334] "Generic (PLEG): container finished" podID="b13baff6-c4ad-4c29-bb53-d0931ca962b6" containerID="96083d13836e22a902d07819daa2aba9e77e25137eb26268bbea69183fd009e0" exitCode=0 Mar 20 08:46:02 crc kubenswrapper[4971]: I0320 08:46:02.913125 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566606-5k4mc" event={"ID":"b13baff6-c4ad-4c29-bb53-d0931ca962b6","Type":"ContainerDied","Data":"96083d13836e22a902d07819daa2aba9e77e25137eb26268bbea69183fd009e0"} Mar 20 08:46:03 crc kubenswrapper[4971]: I0320 08:46:03.554346 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7948b77f69-j5269" Mar 20 08:46:04 crc kubenswrapper[4971]: I0320 08:46:04.310687 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566606-5k4mc" Mar 20 08:46:04 crc kubenswrapper[4971]: I0320 08:46:04.353838 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddwb7\" (UniqueName: \"kubernetes.io/projected/b13baff6-c4ad-4c29-bb53-d0931ca962b6-kube-api-access-ddwb7\") pod \"b13baff6-c4ad-4c29-bb53-d0931ca962b6\" (UID: \"b13baff6-c4ad-4c29-bb53-d0931ca962b6\") " Mar 20 08:46:04 crc kubenswrapper[4971]: I0320 08:46:04.361780 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13baff6-c4ad-4c29-bb53-d0931ca962b6-kube-api-access-ddwb7" (OuterVolumeSpecName: "kube-api-access-ddwb7") pod "b13baff6-c4ad-4c29-bb53-d0931ca962b6" (UID: "b13baff6-c4ad-4c29-bb53-d0931ca962b6"). InnerVolumeSpecName "kube-api-access-ddwb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:04 crc kubenswrapper[4971]: I0320 08:46:04.456863 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddwb7\" (UniqueName: \"kubernetes.io/projected/b13baff6-c4ad-4c29-bb53-d0931ca962b6-kube-api-access-ddwb7\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:04 crc kubenswrapper[4971]: I0320 08:46:04.944012 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566606-5k4mc" event={"ID":"b13baff6-c4ad-4c29-bb53-d0931ca962b6","Type":"ContainerDied","Data":"cf7820ed8e891d19719a01c31ebcf4b04fb66455bbed7449effa97cb362b0e5c"} Mar 20 08:46:04 crc kubenswrapper[4971]: I0320 08:46:04.944480 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf7820ed8e891d19719a01c31ebcf4b04fb66455bbed7449effa97cb362b0e5c" Mar 20 08:46:04 crc kubenswrapper[4971]: I0320 08:46:04.944115 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566606-5k4mc"
Mar 20 08:46:05 crc kubenswrapper[4971]: I0320 08:46:05.408859 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566600-qznhs"]
Mar 20 08:46:05 crc kubenswrapper[4971]: I0320 08:46:05.414516 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566600-qznhs"]
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.749119 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c063d1e0-0975-4b8c-8253-e781c924a051" path="/var/lib/kubelet/pods/c063d1e0-0975-4b8c-8253-e781c924a051/volumes"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.779399 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 20 08:46:06 crc kubenswrapper[4971]: E0320 08:46:06.780162 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13baff6-c4ad-4c29-bb53-d0931ca962b6" containerName="oc"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.780216 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13baff6-c4ad-4c29-bb53-d0931ca962b6" containerName="oc"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.780660 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13baff6-c4ad-4c29-bb53-d0931ca962b6" containerName="oc"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.781957 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.785298 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.785402 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-c765k"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.785716 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.796650 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.838752 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 20 08:46:06 crc kubenswrapper[4971]: E0320 08:46:06.840070 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xwwpj openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-xwwpj openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="1f04188d-510d-4b0c-9c0d-f9b5f98a010c"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.852330 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.865737 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.900111 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.900264 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.962094 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.965354 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1f04188d-510d-4b0c-9c0d-f9b5f98a010c" podUID="9f3a3b9f-f195-436e-9097-d085880e71f8"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.970893 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.996335 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config-secret\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.996383 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7tvm\" (UniqueName: \"kubernetes.io/projected/9f3a3b9f-f195-436e-9097-d085880e71f8-kube-api-access-v7tvm\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:06 crc kubenswrapper[4971]: I0320 08:46:06.996414 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.100248 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config-secret\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.100309 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7tvm\" (UniqueName: \"kubernetes.io/projected/9f3a3b9f-f195-436e-9097-d085880e71f8-kube-api-access-v7tvm\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.100343 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.101761 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.110271 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config-secret\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.138570 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tvm\" (UniqueName: \"kubernetes.io/projected/9f3a3b9f-f195-436e-9097-d085880e71f8-kube-api-access-v7tvm\") pod \"openstackclient\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.249855 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.713902 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.971847 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9f3a3b9f-f195-436e-9097-d085880e71f8","Type":"ContainerStarted","Data":"22b4b8db83103cebd123f02ecd7b92727d8f27dcca87d651cb71577ff399a53e"}
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.971871 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 08:46:07 crc kubenswrapper[4971]: I0320 08:46:07.975480 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1f04188d-510d-4b0c-9c0d-f9b5f98a010c" podUID="9f3a3b9f-f195-436e-9097-d085880e71f8"
Mar 20 08:46:08 crc kubenswrapper[4971]: I0320 08:46:08.585112 4971 scope.go:117] "RemoveContainer" containerID="f1010db1b06c8c861d7f18dbdb2b3b614e3337b4fb463f609c45de2fd9a5e7a7"
Mar 20 08:46:08 crc kubenswrapper[4971]: I0320 08:46:08.742374 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f04188d-510d-4b0c-9c0d-f9b5f98a010c" path="/var/lib/kubelet/pods/1f04188d-510d-4b0c-9c0d-f9b5f98a010c/volumes"
Mar 20 08:46:20 crc kubenswrapper[4971]: I0320 08:46:20.078975 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9f3a3b9f-f195-436e-9097-d085880e71f8","Type":"ContainerStarted","Data":"bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a"}
Mar 20 08:46:20 crc kubenswrapper[4971]: I0320 08:46:20.107411 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.8623755429999997 podStartE2EDuration="14.107393258s" podCreationTimestamp="2026-03-20 08:46:06 +0000 UTC" firstStartedPulling="2026-03-20 08:46:07.706638483 +0000 UTC m=+6989.686512611" lastFinishedPulling="2026-03-20 08:46:18.951656178 +0000 UTC m=+7000.931530326" observedRunningTime="2026-03-20 08:46:20.103759473 +0000 UTC m=+7002.083633651" watchObservedRunningTime="2026-03-20 08:46:20.107393258 +0000 UTC m=+7002.087267396"
Mar 20 08:46:20 crc kubenswrapper[4971]: I0320 08:46:20.162442 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 08:46:20 crc kubenswrapper[4971]: I0320 08:46:20.162573 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.175379 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.176004 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.176807 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl"
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.181749 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.181920 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" gracePeriod=600
Mar 20 08:46:50 crc kubenswrapper[4971]: E0320 08:46:50.319478 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.422939 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" exitCode=0
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.422989 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"}
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.423024 4971 scope.go:117] "RemoveContainer" containerID="7a173279568ba15985897ecc9daca378d7d89e3775153dfc334adc490c1852dc"
Mar 20 08:46:50 crc kubenswrapper[4971]: I0320 08:46:50.423696 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"
Mar 20 08:46:50 crc kubenswrapper[4971]: E0320 08:46:50.423953 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:47:05 crc kubenswrapper[4971]: I0320 08:47:05.732733 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"
Mar 20 08:47:05 crc kubenswrapper[4971]: E0320 08:47:05.733427 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:47:19 crc kubenswrapper[4971]: I0320 08:47:19.740369 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"
Mar 20 08:47:19 crc kubenswrapper[4971]: E0320 08:47:19.741729 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:47:32 crc kubenswrapper[4971]: I0320 08:47:32.732842 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"
Mar 20 08:47:32 crc kubenswrapper[4971]: E0320 08:47:32.733472 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:47:44 crc kubenswrapper[4971]: I0320 08:47:44.870210 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-c2w9d"]
Mar 20 08:47:44 crc kubenswrapper[4971]: I0320 08:47:44.872503 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:44 crc kubenswrapper[4971]: I0320 08:47:44.879467 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-c2w9d"]
Mar 20 08:47:44 crc kubenswrapper[4971]: I0320 08:47:44.959996 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b44d-account-create-update-pnwdx"]
Mar 20 08:47:44 crc kubenswrapper[4971]: I0320 08:47:44.961493 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:44 crc kubenswrapper[4971]: I0320 08:47:44.963521 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 20 08:47:44 crc kubenswrapper[4971]: I0320 08:47:44.967471 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b44d-account-create-update-pnwdx"]
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.029948 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21a92691-09f1-4841-86b2-c72f6153c44c-operator-scripts\") pod \"neutron-db-create-c2w9d\" (UID: \"21a92691-09f1-4841-86b2-c72f6153c44c\") " pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.030005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp4j5\" (UniqueName: \"kubernetes.io/projected/21a92691-09f1-4841-86b2-c72f6153c44c-kube-api-access-mp4j5\") pod \"neutron-db-create-c2w9d\" (UID: \"21a92691-09f1-4841-86b2-c72f6153c44c\") " pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.131314 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21a92691-09f1-4841-86b2-c72f6153c44c-operator-scripts\") pod \"neutron-db-create-c2w9d\" (UID: \"21a92691-09f1-4841-86b2-c72f6153c44c\") " pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.131364 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp4j5\" (UniqueName: \"kubernetes.io/projected/21a92691-09f1-4841-86b2-c72f6153c44c-kube-api-access-mp4j5\") pod \"neutron-db-create-c2w9d\" (UID: \"21a92691-09f1-4841-86b2-c72f6153c44c\") " pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.131388 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-operator-scripts\") pod \"neutron-b44d-account-create-update-pnwdx\" (UID: \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\") " pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.131427 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvnrb\" (UniqueName: \"kubernetes.io/projected/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-kube-api-access-xvnrb\") pod \"neutron-b44d-account-create-update-pnwdx\" (UID: \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\") " pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.132099 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21a92691-09f1-4841-86b2-c72f6153c44c-operator-scripts\") pod \"neutron-db-create-c2w9d\" (UID: \"21a92691-09f1-4841-86b2-c72f6153c44c\") " pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.153017 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp4j5\" (UniqueName: \"kubernetes.io/projected/21a92691-09f1-4841-86b2-c72f6153c44c-kube-api-access-mp4j5\") pod \"neutron-db-create-c2w9d\" (UID: \"21a92691-09f1-4841-86b2-c72f6153c44c\") " pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.214220 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.232900 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvnrb\" (UniqueName: \"kubernetes.io/projected/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-kube-api-access-xvnrb\") pod \"neutron-b44d-account-create-update-pnwdx\" (UID: \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\") " pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.233034 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-operator-scripts\") pod \"neutron-b44d-account-create-update-pnwdx\" (UID: \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\") " pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.233777 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-operator-scripts\") pod \"neutron-b44d-account-create-update-pnwdx\" (UID: \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\") " pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.257373 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvnrb\" (UniqueName: \"kubernetes.io/projected/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-kube-api-access-xvnrb\") pod \"neutron-b44d-account-create-update-pnwdx\" (UID: \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\") " pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.279104 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.653884 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-c2w9d"]
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.802573 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b44d-account-create-update-pnwdx"]
Mar 20 08:47:45 crc kubenswrapper[4971]: W0320 08:47:45.805670 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f9b6de_f4ff_4a74_a86b_dffa2e03bd58.slice/crio-afc606db2abcaff3c6f256cacdb11aa5a4f69f21d883ca38a40c6a7e26b477c6 WatchSource:0}: Error finding container afc606db2abcaff3c6f256cacdb11aa5a4f69f21d883ca38a40c6a7e26b477c6: Status 404 returned error can't find the container with id afc606db2abcaff3c6f256cacdb11aa5a4f69f21d883ca38a40c6a7e26b477c6
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.930209 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c2w9d" event={"ID":"21a92691-09f1-4841-86b2-c72f6153c44c","Type":"ContainerStarted","Data":"0f9922a306e1d4cf3642a3c2976ad988b37298f6b61b9546422bfa5df38512df"}
Mar 20 08:47:45 crc kubenswrapper[4971]: I0320 08:47:45.931510 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b44d-account-create-update-pnwdx" event={"ID":"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58","Type":"ContainerStarted","Data":"afc606db2abcaff3c6f256cacdb11aa5a4f69f21d883ca38a40c6a7e26b477c6"}
Mar 20 08:47:46 crc kubenswrapper[4971]: I0320 08:47:46.941138 4971 generic.go:334] "Generic (PLEG): container finished" podID="21a92691-09f1-4841-86b2-c72f6153c44c" containerID="faabcdc160d09d964a7e7e4a4a3b1aaa5558df39eb1d4c85f3e74e6da072c101" exitCode=0
Mar 20 08:47:46 crc kubenswrapper[4971]: I0320 08:47:46.941189 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c2w9d" event={"ID":"21a92691-09f1-4841-86b2-c72f6153c44c","Type":"ContainerDied","Data":"faabcdc160d09d964a7e7e4a4a3b1aaa5558df39eb1d4c85f3e74e6da072c101"}
Mar 20 08:47:46 crc kubenswrapper[4971]: I0320 08:47:46.944432 4971 generic.go:334] "Generic (PLEG): container finished" podID="94f9b6de-f4ff-4a74-a86b-dffa2e03bd58" containerID="53b83d7597aa24961704a25b8bef68c21f1c126d80db2427810041ea51630dea" exitCode=0
Mar 20 08:47:46 crc kubenswrapper[4971]: I0320 08:47:46.944525 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b44d-account-create-update-pnwdx" event={"ID":"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58","Type":"ContainerDied","Data":"53b83d7597aa24961704a25b8bef68c21f1c126d80db2427810041ea51630dea"}
Mar 20 08:47:47 crc kubenswrapper[4971]: I0320 08:47:47.732866 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"
Mar 20 08:47:47 crc kubenswrapper[4971]: E0320 08:47:47.733344 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.308551 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.316046 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.494581 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21a92691-09f1-4841-86b2-c72f6153c44c-operator-scripts\") pod \"21a92691-09f1-4841-86b2-c72f6153c44c\" (UID: \"21a92691-09f1-4841-86b2-c72f6153c44c\") "
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.494714 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp4j5\" (UniqueName: \"kubernetes.io/projected/21a92691-09f1-4841-86b2-c72f6153c44c-kube-api-access-mp4j5\") pod \"21a92691-09f1-4841-86b2-c72f6153c44c\" (UID: \"21a92691-09f1-4841-86b2-c72f6153c44c\") "
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.494764 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-operator-scripts\") pod \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\" (UID: \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\") "
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.494922 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvnrb\" (UniqueName: \"kubernetes.io/projected/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-kube-api-access-xvnrb\") pod \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\" (UID: \"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58\") "
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.495297 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21a92691-09f1-4841-86b2-c72f6153c44c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21a92691-09f1-4841-86b2-c72f6153c44c" (UID: "21a92691-09f1-4841-86b2-c72f6153c44c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.495660 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21a92691-09f1-4841-86b2-c72f6153c44c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.495708 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94f9b6de-f4ff-4a74-a86b-dffa2e03bd58" (UID: "94f9b6de-f4ff-4a74-a86b-dffa2e03bd58"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.501851 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-kube-api-access-xvnrb" (OuterVolumeSpecName: "kube-api-access-xvnrb") pod "94f9b6de-f4ff-4a74-a86b-dffa2e03bd58" (UID: "94f9b6de-f4ff-4a74-a86b-dffa2e03bd58"). InnerVolumeSpecName "kube-api-access-xvnrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.501926 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a92691-09f1-4841-86b2-c72f6153c44c-kube-api-access-mp4j5" (OuterVolumeSpecName: "kube-api-access-mp4j5") pod "21a92691-09f1-4841-86b2-c72f6153c44c" (UID: "21a92691-09f1-4841-86b2-c72f6153c44c"). InnerVolumeSpecName "kube-api-access-mp4j5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.597582 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvnrb\" (UniqueName: \"kubernetes.io/projected/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-kube-api-access-xvnrb\") on node \"crc\" DevicePath \"\""
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.597848 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp4j5\" (UniqueName: \"kubernetes.io/projected/21a92691-09f1-4841-86b2-c72f6153c44c-kube-api-access-mp4j5\") on node \"crc\" DevicePath \"\""
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.597860 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.968463 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b44d-account-create-update-pnwdx" event={"ID":"94f9b6de-f4ff-4a74-a86b-dffa2e03bd58","Type":"ContainerDied","Data":"afc606db2abcaff3c6f256cacdb11aa5a4f69f21d883ca38a40c6a7e26b477c6"}
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.968515 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc606db2abcaff3c6f256cacdb11aa5a4f69f21d883ca38a40c6a7e26b477c6"
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.968571 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b44d-account-create-update-pnwdx"
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.972493 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c2w9d" event={"ID":"21a92691-09f1-4841-86b2-c72f6153c44c","Type":"ContainerDied","Data":"0f9922a306e1d4cf3642a3c2976ad988b37298f6b61b9546422bfa5df38512df"}
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.972554 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9922a306e1d4cf3642a3c2976ad988b37298f6b61b9546422bfa5df38512df"
Mar 20 08:47:48 crc kubenswrapper[4971]: I0320 08:47:48.972655 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c2w9d"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.265760 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ks7nc"]
Mar 20 08:47:50 crc kubenswrapper[4971]: E0320 08:47:50.266049 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a92691-09f1-4841-86b2-c72f6153c44c" containerName="mariadb-database-create"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.266062 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a92691-09f1-4841-86b2-c72f6153c44c" containerName="mariadb-database-create"
Mar 20 08:47:50 crc kubenswrapper[4971]: E0320 08:47:50.266098 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f9b6de-f4ff-4a74-a86b-dffa2e03bd58" containerName="mariadb-account-create-update"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.266104 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f9b6de-f4ff-4a74-a86b-dffa2e03bd58" containerName="mariadb-account-create-update"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.266240 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a92691-09f1-4841-86b2-c72f6153c44c" containerName="mariadb-database-create"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.266261 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f9b6de-f4ff-4a74-a86b-dffa2e03bd58" containerName="mariadb-account-create-update"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.266716 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ks7nc"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.289132 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.289303 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.289753 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4xj2d"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.306831 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ks7nc"]
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.430704 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbldx\" (UniqueName: \"kubernetes.io/projected/debc2ab1-f7a9-4493-8258-4255eec956c5-kube-api-access-qbldx\") pod \"neutron-db-sync-ks7nc\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.430986 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-config\") pod \"neutron-db-sync-ks7nc\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.431036 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-combined-ca-bundle\") pod \"neutron-db-sync-ks7nc\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.532746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbldx\" (UniqueName: \"kubernetes.io/projected/debc2ab1-f7a9-4493-8258-4255eec956c5-kube-api-access-qbldx\") pod \"neutron-db-sync-ks7nc\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.532856 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-config\") pod \"neutron-db-sync-ks7nc\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.532895 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-combined-ca-bundle\") pod \"neutron-db-sync-ks7nc\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.539328 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-combined-ca-bundle\") pod \"neutron-db-sync-ks7nc\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc"
Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.544289 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-config\") pod \"neutron-db-sync-ks7nc\" (UID:
\"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc" Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.557641 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbldx\" (UniqueName: \"kubernetes.io/projected/debc2ab1-f7a9-4493-8258-4255eec956c5-kube-api-access-qbldx\") pod \"neutron-db-sync-ks7nc\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " pod="openstack/neutron-db-sync-ks7nc" Mar 20 08:47:50 crc kubenswrapper[4971]: I0320 08:47:50.590020 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ks7nc" Mar 20 08:47:51 crc kubenswrapper[4971]: I0320 08:47:51.039356 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ks7nc"] Mar 20 08:47:51 crc kubenswrapper[4971]: W0320 08:47:51.049851 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddebc2ab1_f7a9_4493_8258_4255eec956c5.slice/crio-734d61f5ecfc7e80c92dad3632726232884b887ca175e869fc7018017a9dcf02 WatchSource:0}: Error finding container 734d61f5ecfc7e80c92dad3632726232884b887ca175e869fc7018017a9dcf02: Status 404 returned error can't find the container with id 734d61f5ecfc7e80c92dad3632726232884b887ca175e869fc7018017a9dcf02 Mar 20 08:47:52 crc kubenswrapper[4971]: I0320 08:47:52.015501 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ks7nc" event={"ID":"debc2ab1-f7a9-4493-8258-4255eec956c5","Type":"ContainerStarted","Data":"393f437b8a985920f62d59397b36358f6019166f814347b0aaa53d62db017201"} Mar 20 08:47:52 crc kubenswrapper[4971]: I0320 08:47:52.015897 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ks7nc" event={"ID":"debc2ab1-f7a9-4493-8258-4255eec956c5","Type":"ContainerStarted","Data":"734d61f5ecfc7e80c92dad3632726232884b887ca175e869fc7018017a9dcf02"} Mar 20 08:47:52 crc kubenswrapper[4971]: 
I0320 08:47:52.041825 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-ks7nc" podStartSLOduration=2.041801311 podStartE2EDuration="2.041801311s" podCreationTimestamp="2026-03-20 08:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:47:52.03679734 +0000 UTC m=+7094.016671528" watchObservedRunningTime="2026-03-20 08:47:52.041801311 +0000 UTC m=+7094.021675459" Mar 20 08:47:55 crc kubenswrapper[4971]: I0320 08:47:55.050552 4971 generic.go:334] "Generic (PLEG): container finished" podID="debc2ab1-f7a9-4493-8258-4255eec956c5" containerID="393f437b8a985920f62d59397b36358f6019166f814347b0aaa53d62db017201" exitCode=0 Mar 20 08:47:55 crc kubenswrapper[4971]: I0320 08:47:55.050792 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ks7nc" event={"ID":"debc2ab1-f7a9-4493-8258-4255eec956c5","Type":"ContainerDied","Data":"393f437b8a985920f62d59397b36358f6019166f814347b0aaa53d62db017201"} Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.434549 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ks7nc" Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.545726 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbldx\" (UniqueName: \"kubernetes.io/projected/debc2ab1-f7a9-4493-8258-4255eec956c5-kube-api-access-qbldx\") pod \"debc2ab1-f7a9-4493-8258-4255eec956c5\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.545792 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-config\") pod \"debc2ab1-f7a9-4493-8258-4255eec956c5\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.545945 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-combined-ca-bundle\") pod \"debc2ab1-f7a9-4493-8258-4255eec956c5\" (UID: \"debc2ab1-f7a9-4493-8258-4255eec956c5\") " Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.550825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debc2ab1-f7a9-4493-8258-4255eec956c5-kube-api-access-qbldx" (OuterVolumeSpecName: "kube-api-access-qbldx") pod "debc2ab1-f7a9-4493-8258-4255eec956c5" (UID: "debc2ab1-f7a9-4493-8258-4255eec956c5"). InnerVolumeSpecName "kube-api-access-qbldx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.566312 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "debc2ab1-f7a9-4493-8258-4255eec956c5" (UID: "debc2ab1-f7a9-4493-8258-4255eec956c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.567803 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-config" (OuterVolumeSpecName: "config") pod "debc2ab1-f7a9-4493-8258-4255eec956c5" (UID: "debc2ab1-f7a9-4493-8258-4255eec956c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.647594 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.648013 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbldx\" (UniqueName: \"kubernetes.io/projected/debc2ab1-f7a9-4493-8258-4255eec956c5-kube-api-access-qbldx\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:56 crc kubenswrapper[4971]: I0320 08:47:56.648103 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/debc2ab1-f7a9-4493-8258-4255eec956c5-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.079414 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ks7nc" event={"ID":"debc2ab1-f7a9-4493-8258-4255eec956c5","Type":"ContainerDied","Data":"734d61f5ecfc7e80c92dad3632726232884b887ca175e869fc7018017a9dcf02"} Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.079453 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ks7nc" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.079468 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="734d61f5ecfc7e80c92dad3632726232884b887ca175e869fc7018017a9dcf02" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.329020 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76595c7fd7-gsnb6"] Mar 20 08:47:57 crc kubenswrapper[4971]: E0320 08:47:57.329417 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debc2ab1-f7a9-4493-8258-4255eec956c5" containerName="neutron-db-sync" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.329441 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="debc2ab1-f7a9-4493-8258-4255eec956c5" containerName="neutron-db-sync" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.329704 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="debc2ab1-f7a9-4493-8258-4255eec956c5" containerName="neutron-db-sync" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.330919 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.349825 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76595c7fd7-gsnb6"] Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.455826 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c7c478cfc-2twl9"] Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.457096 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.460018 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.460047 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4xj2d" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.465326 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.467094 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-nb\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.467260 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-config\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.467753 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9kl\" (UniqueName: \"kubernetes.io/projected/cc746079-6be6-41df-bcdc-3408d71f511d-kube-api-access-4b9kl\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.467858 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-sb\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.467931 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-dns-svc\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.480683 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c7c478cfc-2twl9"] Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.570775 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9kl\" (UniqueName: \"kubernetes.io/projected/cc746079-6be6-41df-bcdc-3408d71f511d-kube-api-access-4b9kl\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.570836 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-sb\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.570868 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-dns-svc\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.570915 
4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-config\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.570933 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7pbq\" (UniqueName: \"kubernetes.io/projected/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-kube-api-access-n7pbq\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.570951 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-combined-ca-bundle\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.570978 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-httpd-config\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.571002 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-nb\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.571025 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-config\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.571772 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-config\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.572564 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-sb\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.573346 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-nb\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.573407 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-dns-svc\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.593515 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9kl\" (UniqueName: 
\"kubernetes.io/projected/cc746079-6be6-41df-bcdc-3408d71f511d-kube-api-access-4b9kl\") pod \"dnsmasq-dns-76595c7fd7-gsnb6\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.655986 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.672793 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-config\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.672842 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7pbq\" (UniqueName: \"kubernetes.io/projected/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-kube-api-access-n7pbq\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.672862 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-combined-ca-bundle\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.672890 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-httpd-config\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.677813 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-config\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.678938 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-combined-ca-bundle\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.680353 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-httpd-config\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.695354 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7pbq\" (UniqueName: \"kubernetes.io/projected/1bf12dad-fd1a-4a84-9f23-b9b9ca622268-kube-api-access-n7pbq\") pod \"neutron-6c7c478cfc-2twl9\" (UID: \"1bf12dad-fd1a-4a84-9f23-b9b9ca622268\") " pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:57 crc kubenswrapper[4971]: I0320 08:47:57.777312 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:58 crc kubenswrapper[4971]: I0320 08:47:58.168814 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76595c7fd7-gsnb6"] Mar 20 08:47:58 crc kubenswrapper[4971]: I0320 08:47:58.348121 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c7c478cfc-2twl9"] Mar 20 08:47:58 crc kubenswrapper[4971]: W0320 08:47:58.354348 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bf12dad_fd1a_4a84_9f23_b9b9ca622268.slice/crio-f316f3822c15030583eea6995cb4d01171b5bbf35d142cb864d632f0c4d86daa WatchSource:0}: Error finding container f316f3822c15030583eea6995cb4d01171b5bbf35d142cb864d632f0c4d86daa: Status 404 returned error can't find the container with id f316f3822c15030583eea6995cb4d01171b5bbf35d142cb864d632f0c4d86daa Mar 20 08:47:59 crc kubenswrapper[4971]: I0320 08:47:59.145429 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7c478cfc-2twl9" event={"ID":"1bf12dad-fd1a-4a84-9f23-b9b9ca622268","Type":"ContainerStarted","Data":"dbfc6fcb278d144cd97ca7c451930461e3c1331c0ef61b2795bda7d615842660"} Mar 20 08:47:59 crc kubenswrapper[4971]: I0320 08:47:59.145842 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7c478cfc-2twl9" event={"ID":"1bf12dad-fd1a-4a84-9f23-b9b9ca622268","Type":"ContainerStarted","Data":"39fa8c83b6a4b086188930b76a68da59ec1e793600dcc330fa4cbc45c1e3c697"} Mar 20 08:47:59 crc kubenswrapper[4971]: I0320 08:47:59.145893 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:47:59 crc kubenswrapper[4971]: I0320 08:47:59.145909 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7c478cfc-2twl9" 
event={"ID":"1bf12dad-fd1a-4a84-9f23-b9b9ca622268","Type":"ContainerStarted","Data":"f316f3822c15030583eea6995cb4d01171b5bbf35d142cb864d632f0c4d86daa"} Mar 20 08:47:59 crc kubenswrapper[4971]: I0320 08:47:59.147325 4971 generic.go:334] "Generic (PLEG): container finished" podID="cc746079-6be6-41df-bcdc-3408d71f511d" containerID="e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70" exitCode=0 Mar 20 08:47:59 crc kubenswrapper[4971]: I0320 08:47:59.147363 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" event={"ID":"cc746079-6be6-41df-bcdc-3408d71f511d","Type":"ContainerDied","Data":"e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70"} Mar 20 08:47:59 crc kubenswrapper[4971]: I0320 08:47:59.147384 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" event={"ID":"cc746079-6be6-41df-bcdc-3408d71f511d","Type":"ContainerStarted","Data":"2f02fe998d4b80470b14d418d9175c31cf202bf94c043c0830c10abf692947e0"} Mar 20 08:47:59 crc kubenswrapper[4971]: I0320 08:47:59.168482 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c7c478cfc-2twl9" podStartSLOduration=2.168463509 podStartE2EDuration="2.168463509s" podCreationTimestamp="2026-03-20 08:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:47:59.16814376 +0000 UTC m=+7101.148017908" watchObservedRunningTime="2026-03-20 08:47:59.168463509 +0000 UTC m=+7101.148337647" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.133337 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566608-6tg57"] Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.134835 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566608-6tg57" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.138047 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.138540 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.139857 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.142079 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566608-6tg57"] Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.178248 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" event={"ID":"cc746079-6be6-41df-bcdc-3408d71f511d","Type":"ContainerStarted","Data":"b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959"} Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.178783 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.198451 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" podStartSLOduration=3.198332728 podStartE2EDuration="3.198332728s" podCreationTimestamp="2026-03-20 08:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:00.192227788 +0000 UTC m=+7102.172101936" watchObservedRunningTime="2026-03-20 08:48:00.198332728 +0000 UTC m=+7102.178206866" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.217732 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-smr8k\" (UniqueName: \"kubernetes.io/projected/4a8f89b1-e91b-4e29-bb03-25bd13d72ff8-kube-api-access-smr8k\") pod \"auto-csr-approver-29566608-6tg57\" (UID: \"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8\") " pod="openshift-infra/auto-csr-approver-29566608-6tg57" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.319428 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smr8k\" (UniqueName: \"kubernetes.io/projected/4a8f89b1-e91b-4e29-bb03-25bd13d72ff8-kube-api-access-smr8k\") pod \"auto-csr-approver-29566608-6tg57\" (UID: \"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8\") " pod="openshift-infra/auto-csr-approver-29566608-6tg57" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.340324 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smr8k\" (UniqueName: \"kubernetes.io/projected/4a8f89b1-e91b-4e29-bb03-25bd13d72ff8-kube-api-access-smr8k\") pod \"auto-csr-approver-29566608-6tg57\" (UID: \"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8\") " pod="openshift-infra/auto-csr-approver-29566608-6tg57" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.459433 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566608-6tg57" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.733443 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:48:00 crc kubenswrapper[4971]: E0320 08:48:00.733971 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:48:00 crc kubenswrapper[4971]: I0320 08:48:00.765518 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566608-6tg57"] Mar 20 08:48:00 crc kubenswrapper[4971]: W0320 08:48:00.773925 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a8f89b1_e91b_4e29_bb03_25bd13d72ff8.slice/crio-8a5e8b2e3b8343896c9d2bede046dd2c8e5869e1015b320261a3f010325c37d2 WatchSource:0}: Error finding container 8a5e8b2e3b8343896c9d2bede046dd2c8e5869e1015b320261a3f010325c37d2: Status 404 returned error can't find the container with id 8a5e8b2e3b8343896c9d2bede046dd2c8e5869e1015b320261a3f010325c37d2 Mar 20 08:48:01 crc kubenswrapper[4971]: I0320 08:48:01.190056 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566608-6tg57" event={"ID":"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8","Type":"ContainerStarted","Data":"8a5e8b2e3b8343896c9d2bede046dd2c8e5869e1015b320261a3f010325c37d2"} Mar 20 08:48:02 crc kubenswrapper[4971]: I0320 08:48:02.198556 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566608-6tg57" 
event={"ID":"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8","Type":"ContainerStarted","Data":"c9db52a07f98b0ceae1d45171bb74dc1c130ed1ba9bfe33947b5b965e47d6fd2"} Mar 20 08:48:02 crc kubenswrapper[4971]: I0320 08:48:02.224132 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566608-6tg57" podStartSLOduration=1.38585071 podStartE2EDuration="2.224099209s" podCreationTimestamp="2026-03-20 08:48:00 +0000 UTC" firstStartedPulling="2026-03-20 08:48:00.78350536 +0000 UTC m=+7102.763379498" lastFinishedPulling="2026-03-20 08:48:01.621753849 +0000 UTC m=+7103.601627997" observedRunningTime="2026-03-20 08:48:02.21265377 +0000 UTC m=+7104.192527928" watchObservedRunningTime="2026-03-20 08:48:02.224099209 +0000 UTC m=+7104.203973427" Mar 20 08:48:03 crc kubenswrapper[4971]: I0320 08:48:03.209594 4971 generic.go:334] "Generic (PLEG): container finished" podID="4a8f89b1-e91b-4e29-bb03-25bd13d72ff8" containerID="c9db52a07f98b0ceae1d45171bb74dc1c130ed1ba9bfe33947b5b965e47d6fd2" exitCode=0 Mar 20 08:48:03 crc kubenswrapper[4971]: I0320 08:48:03.209655 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566608-6tg57" event={"ID":"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8","Type":"ContainerDied","Data":"c9db52a07f98b0ceae1d45171bb74dc1c130ed1ba9bfe33947b5b965e47d6fd2"} Mar 20 08:48:04 crc kubenswrapper[4971]: I0320 08:48:04.595904 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566608-6tg57" Mar 20 08:48:04 crc kubenswrapper[4971]: I0320 08:48:04.724002 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smr8k\" (UniqueName: \"kubernetes.io/projected/4a8f89b1-e91b-4e29-bb03-25bd13d72ff8-kube-api-access-smr8k\") pod \"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8\" (UID: \"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8\") " Mar 20 08:48:04 crc kubenswrapper[4971]: I0320 08:48:04.742888 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a8f89b1-e91b-4e29-bb03-25bd13d72ff8-kube-api-access-smr8k" (OuterVolumeSpecName: "kube-api-access-smr8k") pod "4a8f89b1-e91b-4e29-bb03-25bd13d72ff8" (UID: "4a8f89b1-e91b-4e29-bb03-25bd13d72ff8"). InnerVolumeSpecName "kube-api-access-smr8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:04 crc kubenswrapper[4971]: I0320 08:48:04.826852 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smr8k\" (UniqueName: \"kubernetes.io/projected/4a8f89b1-e91b-4e29-bb03-25bd13d72ff8-kube-api-access-smr8k\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:05 crc kubenswrapper[4971]: I0320 08:48:05.233738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566608-6tg57" event={"ID":"4a8f89b1-e91b-4e29-bb03-25bd13d72ff8","Type":"ContainerDied","Data":"8a5e8b2e3b8343896c9d2bede046dd2c8e5869e1015b320261a3f010325c37d2"} Mar 20 08:48:05 crc kubenswrapper[4971]: I0320 08:48:05.233989 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a5e8b2e3b8343896c9d2bede046dd2c8e5869e1015b320261a3f010325c37d2" Mar 20 08:48:05 crc kubenswrapper[4971]: I0320 08:48:05.233852 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566608-6tg57" Mar 20 08:48:05 crc kubenswrapper[4971]: I0320 08:48:05.282874 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566602-27sdp"] Mar 20 08:48:05 crc kubenswrapper[4971]: I0320 08:48:05.288552 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566602-27sdp"] Mar 20 08:48:06 crc kubenswrapper[4971]: I0320 08:48:06.746733 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b023bc7b-9694-43ec-a1e0-9b14fa80c9f0" path="/var/lib/kubelet/pods/b023bc7b-9694-43ec-a1e0-9b14fa80c9f0/volumes" Mar 20 08:48:07 crc kubenswrapper[4971]: I0320 08:48:07.657795 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:48:07 crc kubenswrapper[4971]: I0320 08:48:07.766507 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b54488967-zlmqp"] Mar 20 08:48:07 crc kubenswrapper[4971]: I0320 08:48:07.766761 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" podUID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerName="dnsmasq-dns" containerID="cri-o://a7d5495118bd04c96bc8f5e3374e63550c006d70eda39ab9e7431d15190146f8" gracePeriod=10 Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.260516 4971 generic.go:334] "Generic (PLEG): container finished" podID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerID="a7d5495118bd04c96bc8f5e3374e63550c006d70eda39ab9e7431d15190146f8" exitCode=0 Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.260591 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" event={"ID":"a1b846cc-f217-4bf2-a2ad-aaa721b8480d","Type":"ContainerDied","Data":"a7d5495118bd04c96bc8f5e3374e63550c006d70eda39ab9e7431d15190146f8"} Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 
08:48:08.261105 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" event={"ID":"a1b846cc-f217-4bf2-a2ad-aaa721b8480d","Type":"ContainerDied","Data":"ae0b7b4b87fa6402443fcdd892b9fb0f02778a8c89b331783c3a1525ca971438"} Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.261128 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae0b7b4b87fa6402443fcdd892b9fb0f02778a8c89b331783c3a1525ca971438" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.313152 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.393202 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-config\") pod \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.393316 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-dns-svc\") pod \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.393392 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-sb\") pod \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.393435 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-nb\") pod 
\"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.393515 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb9ps\" (UniqueName: \"kubernetes.io/projected/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-kube-api-access-kb9ps\") pod \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\" (UID: \"a1b846cc-f217-4bf2-a2ad-aaa721b8480d\") " Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.411355 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-kube-api-access-kb9ps" (OuterVolumeSpecName: "kube-api-access-kb9ps") pod "a1b846cc-f217-4bf2-a2ad-aaa721b8480d" (UID: "a1b846cc-f217-4bf2-a2ad-aaa721b8480d"). InnerVolumeSpecName "kube-api-access-kb9ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.441496 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a1b846cc-f217-4bf2-a2ad-aaa721b8480d" (UID: "a1b846cc-f217-4bf2-a2ad-aaa721b8480d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.443917 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-config" (OuterVolumeSpecName: "config") pod "a1b846cc-f217-4bf2-a2ad-aaa721b8480d" (UID: "a1b846cc-f217-4bf2-a2ad-aaa721b8480d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.454211 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1b846cc-f217-4bf2-a2ad-aaa721b8480d" (UID: "a1b846cc-f217-4bf2-a2ad-aaa721b8480d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.476720 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a1b846cc-f217-4bf2-a2ad-aaa721b8480d" (UID: "a1b846cc-f217-4bf2-a2ad-aaa721b8480d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.495912 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.495952 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.495965 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.495977 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb9ps\" (UniqueName: \"kubernetes.io/projected/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-kube-api-access-kb9ps\") on node \"crc\" DevicePath \"\"" Mar 20 
08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.495987 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1b846cc-f217-4bf2-a2ad-aaa721b8480d-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:08 crc kubenswrapper[4971]: I0320 08:48:08.729351 4971 scope.go:117] "RemoveContainer" containerID="22b69961abb997f506af552eebfe41f579ce5b873527cb69be50cc4de260c0c4" Mar 20 08:48:09 crc kubenswrapper[4971]: I0320 08:48:09.267735 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" Mar 20 08:48:09 crc kubenswrapper[4971]: I0320 08:48:09.286964 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b54488967-zlmqp"] Mar 20 08:48:09 crc kubenswrapper[4971]: I0320 08:48:09.294885 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b54488967-zlmqp"] Mar 20 08:48:10 crc kubenswrapper[4971]: I0320 08:48:10.749052 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" path="/var/lib/kubelet/pods/a1b846cc-f217-4bf2-a2ad-aaa721b8480d/volumes" Mar 20 08:48:13 crc kubenswrapper[4971]: I0320 08:48:13.112350 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b54488967-zlmqp" podUID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.82:5353: i/o timeout" Mar 20 08:48:15 crc kubenswrapper[4971]: I0320 08:48:15.731726 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:48:15 crc kubenswrapper[4971]: E0320 08:48:15.732461 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:48:27 crc kubenswrapper[4971]: I0320 08:48:27.788293 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c7c478cfc-2twl9" Mar 20 08:48:28 crc kubenswrapper[4971]: I0320 08:48:28.736757 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:48:28 crc kubenswrapper[4971]: E0320 08:48:28.737289 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.018392 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rfzsl"] Mar 20 08:48:35 crc kubenswrapper[4971]: E0320 08:48:35.019488 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8f89b1-e91b-4e29-bb03-25bd13d72ff8" containerName="oc" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.019506 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8f89b1-e91b-4e29-bb03-25bd13d72ff8" containerName="oc" Mar 20 08:48:35 crc kubenswrapper[4971]: E0320 08:48:35.019527 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerName="init" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.019544 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerName="init" Mar 20 08:48:35 crc 
kubenswrapper[4971]: E0320 08:48:35.019561 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerName="dnsmasq-dns" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.019569 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerName="dnsmasq-dns" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.019817 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8f89b1-e91b-4e29-bb03-25bd13d72ff8" containerName="oc" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.019840 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b846cc-f217-4bf2-a2ad-aaa721b8480d" containerName="dnsmasq-dns" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.020699 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.024396 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0606-account-create-update-7nmf4"] Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.025868 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.028283 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.032117 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0606-account-create-update-7nmf4"] Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.042228 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rfzsl"] Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.182745 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfnd9\" (UniqueName: \"kubernetes.io/projected/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-kube-api-access-lfnd9\") pod \"glance-0606-account-create-update-7nmf4\" (UID: \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\") " pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.183138 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79df42cd-73aa-4c95-833b-7b7d1a2043cd-operator-scripts\") pod \"glance-db-create-rfzsl\" (UID: \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\") " pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.183212 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-operator-scripts\") pod \"glance-0606-account-create-update-7nmf4\" (UID: \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\") " pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.183244 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ckr8v\" (UniqueName: \"kubernetes.io/projected/79df42cd-73aa-4c95-833b-7b7d1a2043cd-kube-api-access-ckr8v\") pod \"glance-db-create-rfzsl\" (UID: \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\") " pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.284885 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfnd9\" (UniqueName: \"kubernetes.io/projected/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-kube-api-access-lfnd9\") pod \"glance-0606-account-create-update-7nmf4\" (UID: \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\") " pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.285277 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79df42cd-73aa-4c95-833b-7b7d1a2043cd-operator-scripts\") pod \"glance-db-create-rfzsl\" (UID: \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\") " pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.285348 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-operator-scripts\") pod \"glance-0606-account-create-update-7nmf4\" (UID: \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\") " pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.285383 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckr8v\" (UniqueName: \"kubernetes.io/projected/79df42cd-73aa-4c95-833b-7b7d1a2043cd-kube-api-access-ckr8v\") pod \"glance-db-create-rfzsl\" (UID: \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\") " pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.286143 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79df42cd-73aa-4c95-833b-7b7d1a2043cd-operator-scripts\") pod \"glance-db-create-rfzsl\" (UID: \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\") " pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.286285 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-operator-scripts\") pod \"glance-0606-account-create-update-7nmf4\" (UID: \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\") " pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.309029 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfnd9\" (UniqueName: \"kubernetes.io/projected/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-kube-api-access-lfnd9\") pod \"glance-0606-account-create-update-7nmf4\" (UID: \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\") " pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.316915 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckr8v\" (UniqueName: \"kubernetes.io/projected/79df42cd-73aa-4c95-833b-7b7d1a2043cd-kube-api-access-ckr8v\") pod \"glance-db-create-rfzsl\" (UID: \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\") " pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.351502 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.358181 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.643050 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rfzsl"] Mar 20 08:48:35 crc kubenswrapper[4971]: I0320 08:48:35.739149 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0606-account-create-update-7nmf4"] Mar 20 08:48:35 crc kubenswrapper[4971]: W0320 08:48:35.750130 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod114a8aed_e610_49e8_ba46_fcfcc4b7f0f7.slice/crio-242098a09bfff446a4805f5605546ab26b033590fd9b1b18e2dec00032b78d1f WatchSource:0}: Error finding container 242098a09bfff446a4805f5605546ab26b033590fd9b1b18e2dec00032b78d1f: Status 404 returned error can't find the container with id 242098a09bfff446a4805f5605546ab26b033590fd9b1b18e2dec00032b78d1f Mar 20 08:48:36 crc kubenswrapper[4971]: I0320 08:48:36.516664 4971 generic.go:334] "Generic (PLEG): container finished" podID="79df42cd-73aa-4c95-833b-7b7d1a2043cd" containerID="163d8ccb757a1a02fc718a5f47fb20513fc37cbca66d847cba546e368b38588d" exitCode=0 Mar 20 08:48:36 crc kubenswrapper[4971]: I0320 08:48:36.517099 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rfzsl" event={"ID":"79df42cd-73aa-4c95-833b-7b7d1a2043cd","Type":"ContainerDied","Data":"163d8ccb757a1a02fc718a5f47fb20513fc37cbca66d847cba546e368b38588d"} Mar 20 08:48:36 crc kubenswrapper[4971]: I0320 08:48:36.517148 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rfzsl" event={"ID":"79df42cd-73aa-4c95-833b-7b7d1a2043cd","Type":"ContainerStarted","Data":"2937929db12fb3c254a308f4ed88bbc9edc102e1d5b9abc17859a5dc35942cbc"} Mar 20 08:48:36 crc kubenswrapper[4971]: I0320 08:48:36.518978 4971 generic.go:334] "Generic (PLEG): container finished" podID="114a8aed-e610-49e8-ba46-fcfcc4b7f0f7" 
containerID="b214be5f31acbfdbe0402aee158630139f194b3e9b9063303293c20c51e092ce" exitCode=0 Mar 20 08:48:36 crc kubenswrapper[4971]: I0320 08:48:36.519041 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0606-account-create-update-7nmf4" event={"ID":"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7","Type":"ContainerDied","Data":"b214be5f31acbfdbe0402aee158630139f194b3e9b9063303293c20c51e092ce"} Mar 20 08:48:36 crc kubenswrapper[4971]: I0320 08:48:36.519079 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0606-account-create-update-7nmf4" event={"ID":"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7","Type":"ContainerStarted","Data":"242098a09bfff446a4805f5605546ab26b033590fd9b1b18e2dec00032b78d1f"} Mar 20 08:48:37 crc kubenswrapper[4971]: I0320 08:48:37.911879 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:37 crc kubenswrapper[4971]: I0320 08:48:37.923067 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.033036 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfnd9\" (UniqueName: \"kubernetes.io/projected/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-kube-api-access-lfnd9\") pod \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\" (UID: \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\") " Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.033120 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-operator-scripts\") pod \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\" (UID: \"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7\") " Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.033323 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79df42cd-73aa-4c95-833b-7b7d1a2043cd-operator-scripts\") pod \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\" (UID: \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\") " Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.033342 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckr8v\" (UniqueName: \"kubernetes.io/projected/79df42cd-73aa-4c95-833b-7b7d1a2043cd-kube-api-access-ckr8v\") pod \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\" (UID: \"79df42cd-73aa-4c95-833b-7b7d1a2043cd\") " Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.033694 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "114a8aed-e610-49e8-ba46-fcfcc4b7f0f7" (UID: "114a8aed-e610-49e8-ba46-fcfcc4b7f0f7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.034308 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79df42cd-73aa-4c95-833b-7b7d1a2043cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79df42cd-73aa-4c95-833b-7b7d1a2043cd" (UID: "79df42cd-73aa-4c95-833b-7b7d1a2043cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.034632 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.034648 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79df42cd-73aa-4c95-833b-7b7d1a2043cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.039255 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79df42cd-73aa-4c95-833b-7b7d1a2043cd-kube-api-access-ckr8v" (OuterVolumeSpecName: "kube-api-access-ckr8v") pod "79df42cd-73aa-4c95-833b-7b7d1a2043cd" (UID: "79df42cd-73aa-4c95-833b-7b7d1a2043cd"). InnerVolumeSpecName "kube-api-access-ckr8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.047971 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-kube-api-access-lfnd9" (OuterVolumeSpecName: "kube-api-access-lfnd9") pod "114a8aed-e610-49e8-ba46-fcfcc4b7f0f7" (UID: "114a8aed-e610-49e8-ba46-fcfcc4b7f0f7"). InnerVolumeSpecName "kube-api-access-lfnd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.137205 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckr8v\" (UniqueName: \"kubernetes.io/projected/79df42cd-73aa-4c95-833b-7b7d1a2043cd-kube-api-access-ckr8v\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.137255 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfnd9\" (UniqueName: \"kubernetes.io/projected/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7-kube-api-access-lfnd9\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.539771 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0606-account-create-update-7nmf4" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.540998 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0606-account-create-update-7nmf4" event={"ID":"114a8aed-e610-49e8-ba46-fcfcc4b7f0f7","Type":"ContainerDied","Data":"242098a09bfff446a4805f5605546ab26b033590fd9b1b18e2dec00032b78d1f"} Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.541054 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="242098a09bfff446a4805f5605546ab26b033590fd9b1b18e2dec00032b78d1f" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.543659 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rfzsl" event={"ID":"79df42cd-73aa-4c95-833b-7b7d1a2043cd","Type":"ContainerDied","Data":"2937929db12fb3c254a308f4ed88bbc9edc102e1d5b9abc17859a5dc35942cbc"} Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.543703 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2937929db12fb3c254a308f4ed88bbc9edc102e1d5b9abc17859a5dc35942cbc" Mar 20 08:48:38 crc kubenswrapper[4971]: I0320 08:48:38.543821 4971 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-db-create-rfzsl" Mar 20 08:48:39 crc kubenswrapper[4971]: I0320 08:48:39.732883 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:48:39 crc kubenswrapper[4971]: E0320 08:48:39.735228 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.183080 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tkdkl"] Mar 20 08:48:40 crc kubenswrapper[4971]: E0320 08:48:40.183387 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="114a8aed-e610-49e8-ba46-fcfcc4b7f0f7" containerName="mariadb-account-create-update" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.183398 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="114a8aed-e610-49e8-ba46-fcfcc4b7f0f7" containerName="mariadb-account-create-update" Mar 20 08:48:40 crc kubenswrapper[4971]: E0320 08:48:40.183430 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79df42cd-73aa-4c95-833b-7b7d1a2043cd" containerName="mariadb-database-create" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.183436 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="79df42cd-73aa-4c95-833b-7b7d1a2043cd" containerName="mariadb-database-create" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.183584 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="114a8aed-e610-49e8-ba46-fcfcc4b7f0f7" containerName="mariadb-account-create-update" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 
08:48:40.183598 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="79df42cd-73aa-4c95-833b-7b7d1a2043cd" containerName="mariadb-database-create" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.184127 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.187575 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wklvs" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.187892 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.211319 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tkdkl"] Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.281409 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-config-data\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.281914 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-db-sync-config-data\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.281966 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-combined-ca-bundle\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " 
pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.281987 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzzzp\" (UniqueName: \"kubernetes.io/projected/7c0eb761-10ea-4b19-8b9b-deaa635a6393-kube-api-access-fzzzp\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.383151 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-combined-ca-bundle\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.384080 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzzzp\" (UniqueName: \"kubernetes.io/projected/7c0eb761-10ea-4b19-8b9b-deaa635a6393-kube-api-access-fzzzp\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.384155 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-config-data\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.384217 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-db-sync-config-data\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 
08:48:40.391583 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-combined-ca-bundle\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.392289 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-config-data\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.393895 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-db-sync-config-data\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.405492 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzzzp\" (UniqueName: \"kubernetes.io/projected/7c0eb761-10ea-4b19-8b9b-deaa635a6393-kube-api-access-fzzzp\") pod \"glance-db-sync-tkdkl\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:40 crc kubenswrapper[4971]: I0320 08:48:40.522894 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tkdkl" Mar 20 08:48:41 crc kubenswrapper[4971]: I0320 08:48:41.299594 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tkdkl"] Mar 20 08:48:41 crc kubenswrapper[4971]: I0320 08:48:41.581362 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tkdkl" event={"ID":"7c0eb761-10ea-4b19-8b9b-deaa635a6393","Type":"ContainerStarted","Data":"3ee39d2d98215a4a9d43ad1bc4f3360a43a4860527e735bf25ee463ab57a961d"} Mar 20 08:48:51 crc kubenswrapper[4971]: I0320 08:48:51.733048 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:48:51 crc kubenswrapper[4971]: E0320 08:48:51.733985 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:48:58 crc kubenswrapper[4971]: I0320 08:48:58.727361 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tkdkl" event={"ID":"7c0eb761-10ea-4b19-8b9b-deaa635a6393","Type":"ContainerStarted","Data":"5d7d1a3cd6061f415d29124b2aff214ee5133e735f9d5cbfcaa020ed266f2e96"} Mar 20 08:48:58 crc kubenswrapper[4971]: I0320 08:48:58.756587 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tkdkl" podStartSLOduration=2.567643457 podStartE2EDuration="18.756560256s" podCreationTimestamp="2026-03-20 08:48:40 +0000 UTC" firstStartedPulling="2026-03-20 08:48:41.295946724 +0000 UTC m=+7143.275820862" lastFinishedPulling="2026-03-20 08:48:57.484863503 +0000 UTC m=+7159.464737661" observedRunningTime="2026-03-20 
08:48:58.755780056 +0000 UTC m=+7160.735654254" watchObservedRunningTime="2026-03-20 08:48:58.756560256 +0000 UTC m=+7160.736434424" Mar 20 08:49:01 crc kubenswrapper[4971]: I0320 08:49:01.769497 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tkdkl" event={"ID":"7c0eb761-10ea-4b19-8b9b-deaa635a6393","Type":"ContainerDied","Data":"5d7d1a3cd6061f415d29124b2aff214ee5133e735f9d5cbfcaa020ed266f2e96"} Mar 20 08:49:01 crc kubenswrapper[4971]: I0320 08:49:01.769441 4971 generic.go:334] "Generic (PLEG): container finished" podID="7c0eb761-10ea-4b19-8b9b-deaa635a6393" containerID="5d7d1a3cd6061f415d29124b2aff214ee5133e735f9d5cbfcaa020ed266f2e96" exitCode=0 Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.165756 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tkdkl" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.340327 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-combined-ca-bundle\") pod \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.340392 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzzzp\" (UniqueName: \"kubernetes.io/projected/7c0eb761-10ea-4b19-8b9b-deaa635a6393-kube-api-access-fzzzp\") pod \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.340501 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-config-data\") pod \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 
08:49:03.340554 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-db-sync-config-data\") pod \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\" (UID: \"7c0eb761-10ea-4b19-8b9b-deaa635a6393\") " Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.345471 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7c0eb761-10ea-4b19-8b9b-deaa635a6393" (UID: "7c0eb761-10ea-4b19-8b9b-deaa635a6393"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.346121 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0eb761-10ea-4b19-8b9b-deaa635a6393-kube-api-access-fzzzp" (OuterVolumeSpecName: "kube-api-access-fzzzp") pod "7c0eb761-10ea-4b19-8b9b-deaa635a6393" (UID: "7c0eb761-10ea-4b19-8b9b-deaa635a6393"). InnerVolumeSpecName "kube-api-access-fzzzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.374950 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c0eb761-10ea-4b19-8b9b-deaa635a6393" (UID: "7c0eb761-10ea-4b19-8b9b-deaa635a6393"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.380737 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-config-data" (OuterVolumeSpecName: "config-data") pod "7c0eb761-10ea-4b19-8b9b-deaa635a6393" (UID: "7c0eb761-10ea-4b19-8b9b-deaa635a6393"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.442632 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzzzp\" (UniqueName: \"kubernetes.io/projected/7c0eb761-10ea-4b19-8b9b-deaa635a6393-kube-api-access-fzzzp\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.442682 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.442696 4971 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.442709 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0eb761-10ea-4b19-8b9b-deaa635a6393-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.787889 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tkdkl" event={"ID":"7c0eb761-10ea-4b19-8b9b-deaa635a6393","Type":"ContainerDied","Data":"3ee39d2d98215a4a9d43ad1bc4f3360a43a4860527e735bf25ee463ab57a961d"} Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.788185 4971 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="3ee39d2d98215a4a9d43ad1bc4f3360a43a4860527e735bf25ee463ab57a961d" Mar 20 08:49:03 crc kubenswrapper[4971]: I0320 08:49:03.787993 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tkdkl" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.117676 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:04 crc kubenswrapper[4971]: E0320 08:49:04.118152 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0eb761-10ea-4b19-8b9b-deaa635a6393" containerName="glance-db-sync" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.118173 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0eb761-10ea-4b19-8b9b-deaa635a6393" containerName="glance-db-sync" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.118376 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0eb761-10ea-4b19-8b9b-deaa635a6393" containerName="glance-db-sync" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.119335 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.122564 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.122637 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.123203 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wklvs" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.126355 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.151452 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.258734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.258812 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzm4g\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-kube-api-access-xzm4g\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.258837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.258866 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-logs\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.258887 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.258914 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-ceph\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.258940 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.395926 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589f6f54ff-4rcvv"] Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.399637 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.399909 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.399960 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzm4g\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-kube-api-access-xzm4g\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.399982 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.400007 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-logs\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.400027 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.400053 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-ceph\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.406551 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-logs\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.406846 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.408544 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.411483 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.420223 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-ceph\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.424913 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.439335 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzm4g\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-kube-api-access-xzm4g\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.440796 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: 
I0320 08:49:04.455523 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589f6f54ff-4rcvv"] Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.468026 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.469396 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.483933 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.493846 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.500914 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-sb\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.500951 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-dns-svc\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.500987 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksfxf\" (UniqueName: \"kubernetes.io/projected/5e6bad53-1552-4cc5-aafd-2981191d8afc-kube-api-access-ksfxf\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " 
pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.501053 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-config\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.501075 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-nb\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602167 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksfxf\" (UniqueName: \"kubernetes.io/projected/5e6bad53-1552-4cc5-aafd-2981191d8afc-kube-api-access-ksfxf\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602273 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602300 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-config\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " 
pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602320 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602336 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602359 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-nb\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602409 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602428 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-sb\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " 
pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602445 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602464 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-dns-svc\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602485 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.602503 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc559\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-kube-api-access-vc559\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.603498 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-config\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " 
pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.603524 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-dns-svc\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.603633 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-nb\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.604087 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-sb\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.617896 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksfxf\" (UniqueName: \"kubernetes.io/projected/5e6bad53-1552-4cc5-aafd-2981191d8afc-kube-api-access-ksfxf\") pod \"dnsmasq-dns-589f6f54ff-4rcvv\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.703710 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 
08:49:04.703801 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.703848 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.703978 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.704027 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.704089 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.704124 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vc559\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-kube-api-access-vc559\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.705379 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.705674 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.708445 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.709938 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.712198 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.719191 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.725232 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc559\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-kube-api-access-vc559\") pod \"glance-default-internal-api-0\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.737325 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.815597 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:04 crc kubenswrapper[4971]: I0320 08:49:04.823534 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:05 crc kubenswrapper[4971]: I0320 08:49:05.296323 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:05 crc kubenswrapper[4971]: I0320 08:49:05.339990 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589f6f54ff-4rcvv"] Mar 20 08:49:05 crc kubenswrapper[4971]: W0320 08:49:05.341447 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e6bad53_1552_4cc5_aafd_2981191d8afc.slice/crio-a6b20af701976a1d7cd6d6098208647968f6f93ad411da2dd4b574ac4ef5c6c9 WatchSource:0}: Error finding container a6b20af701976a1d7cd6d6098208647968f6f93ad411da2dd4b574ac4ef5c6c9: Status 404 returned error can't find the container with id a6b20af701976a1d7cd6d6098208647968f6f93ad411da2dd4b574ac4ef5c6c9 Mar 20 08:49:05 crc kubenswrapper[4971]: I0320 08:49:05.415573 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:05 crc kubenswrapper[4971]: I0320 08:49:05.810888 4971 generic.go:334] "Generic (PLEG): container finished" podID="5e6bad53-1552-4cc5-aafd-2981191d8afc" containerID="1a54a78ad16ce0e30c468f80e056d848967a3bfac29d768fe94ee196fcc73b65" exitCode=0 Mar 20 08:49:05 crc kubenswrapper[4971]: I0320 08:49:05.810950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" event={"ID":"5e6bad53-1552-4cc5-aafd-2981191d8afc","Type":"ContainerDied","Data":"1a54a78ad16ce0e30c468f80e056d848967a3bfac29d768fe94ee196fcc73b65"} Mar 20 08:49:05 crc kubenswrapper[4971]: I0320 08:49:05.811302 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" event={"ID":"5e6bad53-1552-4cc5-aafd-2981191d8afc","Type":"ContainerStarted","Data":"a6b20af701976a1d7cd6d6098208647968f6f93ad411da2dd4b574ac4ef5c6c9"} Mar 20 
08:49:05 crc kubenswrapper[4971]: I0320 08:49:05.813779 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b29d3be9-3635-458d-af16-5c5cc23b0cc1","Type":"ContainerStarted","Data":"61648a38d0df52ca47d49b6dd719f982de0b541df604e1805958cc7ab31a3edc"} Mar 20 08:49:05 crc kubenswrapper[4971]: I0320 08:49:05.930716 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:05 crc kubenswrapper[4971]: W0320 08:49:05.941336 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e6cb98d_a89c_4b15_b745_4935f0f9c6e2.slice/crio-d56b92a8713befe0e3bbb60639b23161e7234287e3d8528c1c8ae539656fc578 WatchSource:0}: Error finding container d56b92a8713befe0e3bbb60639b23161e7234287e3d8528c1c8ae539656fc578: Status 404 returned error can't find the container with id d56b92a8713befe0e3bbb60639b23161e7234287e3d8528c1c8ae539656fc578 Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.733755 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:49:06 crc kubenswrapper[4971]: E0320 08:49:06.734215 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.825734 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b29d3be9-3635-458d-af16-5c5cc23b0cc1","Type":"ContainerStarted","Data":"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340"} Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.825778 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b29d3be9-3635-458d-af16-5c5cc23b0cc1","Type":"ContainerStarted","Data":"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71"} Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.827556 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" event={"ID":"5e6bad53-1552-4cc5-aafd-2981191d8afc","Type":"ContainerStarted","Data":"cbebd05d72b78a97fd116f1947143c21cf63be5b9db6d08d0a1e837efe48152c"} Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.827709 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.828984 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2","Type":"ContainerStarted","Data":"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464"} Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.829007 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2","Type":"ContainerStarted","Data":"d56b92a8713befe0e3bbb60639b23161e7234287e3d8528c1c8ae539656fc578"} Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.863318 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.8632979020000002 podStartE2EDuration="2.863297902s" podCreationTimestamp="2026-03-20 08:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-20 08:49:06.841072211 +0000 UTC m=+7168.820946379" watchObservedRunningTime="2026-03-20 08:49:06.863297902 +0000 UTC m=+7168.843172050" Mar 20 08:49:06 crc kubenswrapper[4971]: I0320 08:49:06.884976 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" podStartSLOduration=2.884956889 podStartE2EDuration="2.884956889s" podCreationTimestamp="2026-03-20 08:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:49:06.874135896 +0000 UTC m=+7168.854010034" watchObservedRunningTime="2026-03-20 08:49:06.884956889 +0000 UTC m=+7168.864831027" Mar 20 08:49:07 crc kubenswrapper[4971]: I0320 08:49:07.264545 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:07 crc kubenswrapper[4971]: I0320 08:49:07.836651 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2","Type":"ContainerStarted","Data":"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc"} Mar 20 08:49:07 crc kubenswrapper[4971]: I0320 08:49:07.836900 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerName="glance-log" containerID="cri-o://77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464" gracePeriod=30 Mar 20 08:49:07 crc kubenswrapper[4971]: I0320 08:49:07.837915 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerName="glance-httpd" containerID="cri-o://8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc" gracePeriod=30 Mar 20 08:49:07 crc kubenswrapper[4971]: I0320 
08:49:07.856190 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.856169683 podStartE2EDuration="3.856169683s" podCreationTimestamp="2026-03-20 08:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:49:07.854311565 +0000 UTC m=+7169.834185703" watchObservedRunningTime="2026-03-20 08:49:07.856169683 +0000 UTC m=+7169.836043821" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.470385 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.566938 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-logs\") pod \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.567072 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-ceph\") pod \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.567120 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-config-data\") pod \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.567143 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-combined-ca-bundle\") pod 
\"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.567198 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzm4g\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-kube-api-access-xzm4g\") pod \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.567239 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-scripts\") pod \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.567256 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-httpd-run\") pod \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\" (UID: \"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2\") " Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.567585 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-logs" (OuterVolumeSpecName: "logs") pod "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" (UID: "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.567876 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" (UID: "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.574106 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-scripts" (OuterVolumeSpecName: "scripts") pod "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" (UID: "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.574198 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-kube-api-access-xzm4g" (OuterVolumeSpecName: "kube-api-access-xzm4g") pod "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" (UID: "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2"). InnerVolumeSpecName "kube-api-access-xzm4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.574874 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-ceph" (OuterVolumeSpecName: "ceph") pod "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" (UID: "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.599511 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" (UID: "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.646206 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-config-data" (OuterVolumeSpecName: "config-data") pod "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" (UID: "6e6cb98d-a89c-4b15-b745-4935f0f9c6e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.670119 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.670191 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.670225 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.670250 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.670275 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.670300 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.670328 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzm4g\" (UniqueName: \"kubernetes.io/projected/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2-kube-api-access-xzm4g\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.850968 4971 generic.go:334] "Generic (PLEG): container finished" podID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerID="8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc" exitCode=0 Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.851286 4971 generic.go:334] "Generic (PLEG): container finished" podID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerID="77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464" exitCode=143 Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.851555 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerName="glance-log" containerID="cri-o://03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71" gracePeriod=30 Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.851215 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2","Type":"ContainerDied","Data":"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc"} Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.851736 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2","Type":"ContainerDied","Data":"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464"} Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.851762 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6e6cb98d-a89c-4b15-b745-4935f0f9c6e2","Type":"ContainerDied","Data":"d56b92a8713befe0e3bbb60639b23161e7234287e3d8528c1c8ae539656fc578"} Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.851963 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerName="glance-httpd" containerID="cri-o://6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340" gracePeriod=30 Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.852431 4971 scope.go:117] "RemoveContainer" containerID="8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.854653 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.900918 4971 scope.go:117] "RemoveContainer" containerID="77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.915764 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.922079 4971 scope.go:117] "RemoveContainer" containerID="8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc" Mar 20 08:49:08 crc kubenswrapper[4971]: E0320 08:49:08.922576 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc\": container with ID starting with 8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc not found: ID does not exist" containerID="8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.922645 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc"} err="failed to get container status \"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc\": rpc error: code = NotFound desc = could not find container \"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc\": container with ID starting with 8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc not found: ID does not exist" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.922682 4971 scope.go:117] "RemoveContainer" containerID="77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464" Mar 20 08:49:08 crc kubenswrapper[4971]: E0320 08:49:08.923024 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464\": container with ID starting with 77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464 not found: ID does not exist" containerID="77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.923126 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464"} err="failed to get container status \"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464\": rpc error: code = NotFound desc = could not find container \"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464\": container with ID starting with 77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464 not found: ID does not exist" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.923213 4971 scope.go:117] "RemoveContainer" containerID="8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.923592 4971 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc"} err="failed to get container status \"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc\": rpc error: code = NotFound desc = could not find container \"8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc\": container with ID starting with 8878d157b2c6ecb737010caa9e81ca0b725d4c543a302a5be06596abcf207dcc not found: ID does not exist" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.923633 4971 scope.go:117] "RemoveContainer" containerID="77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.923828 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464"} err="failed to get container status \"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464\": rpc error: code = NotFound desc = could not find container \"77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464\": container with ID starting with 77eea962be41fd966e55956acbe8fd9849e3111e35e99c11e7b1d252f7a26464 not found: ID does not exist" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.931707 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.949800 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:08 crc kubenswrapper[4971]: E0320 08:49:08.950361 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerName="glance-log" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.950438 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerName="glance-log" Mar 20 08:49:08 crc 
kubenswrapper[4971]: E0320 08:49:08.950505 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerName="glance-httpd" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.950570 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerName="glance-httpd" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.950794 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerName="glance-httpd" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.950873 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" containerName="glance-log" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.951750 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.954657 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 08:49:08 crc kubenswrapper[4971]: I0320 08:49:08.987555 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.080822 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.080946 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9m5v\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-kube-api-access-q9m5v\") pod 
\"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.080982 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-ceph\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.081016 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-scripts\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.081052 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-config-data\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.081779 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.081881 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-logs\") pod 
\"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.183738 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.183842 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-logs\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.183925 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.183999 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9m5v\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-kube-api-access-q9m5v\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.184035 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.184067 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-scripts\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.184108 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-config-data\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.184423 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-logs\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.184718 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.189491 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-ceph\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 
08:49:09.190090 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-scripts\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.191206 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.195510 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-config-data\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.200262 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9m5v\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-kube-api-access-q9m5v\") pod \"glance-default-external-api-0\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.385724 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.442321 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.491398 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-httpd-run\") pod \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.491462 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-scripts\") pod \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.491484 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-config-data\") pod \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.491534 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-logs\") pod \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.491630 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc559\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-kube-api-access-vc559\") pod \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.491711 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-combined-ca-bundle\") pod \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.491756 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-ceph\") pod \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\" (UID: \"b29d3be9-3635-458d-af16-5c5cc23b0cc1\") " Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.495169 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-logs" (OuterVolumeSpecName: "logs") pod "b29d3be9-3635-458d-af16-5c5cc23b0cc1" (UID: "b29d3be9-3635-458d-af16-5c5cc23b0cc1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.497925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-kube-api-access-vc559" (OuterVolumeSpecName: "kube-api-access-vc559") pod "b29d3be9-3635-458d-af16-5c5cc23b0cc1" (UID: "b29d3be9-3635-458d-af16-5c5cc23b0cc1"). InnerVolumeSpecName "kube-api-access-vc559". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.498207 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-scripts" (OuterVolumeSpecName: "scripts") pod "b29d3be9-3635-458d-af16-5c5cc23b0cc1" (UID: "b29d3be9-3635-458d-af16-5c5cc23b0cc1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.499566 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-ceph" (OuterVolumeSpecName: "ceph") pod "b29d3be9-3635-458d-af16-5c5cc23b0cc1" (UID: "b29d3be9-3635-458d-af16-5c5cc23b0cc1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.505358 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b29d3be9-3635-458d-af16-5c5cc23b0cc1" (UID: "b29d3be9-3635-458d-af16-5c5cc23b0cc1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.516406 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b29d3be9-3635-458d-af16-5c5cc23b0cc1" (UID: "b29d3be9-3635-458d-af16-5c5cc23b0cc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.551166 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-config-data" (OuterVolumeSpecName: "config-data") pod "b29d3be9-3635-458d-af16-5c5cc23b0cc1" (UID: "b29d3be9-3635-458d-af16-5c5cc23b0cc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.593910 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.593932 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.593942 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc559\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-kube-api-access-vc559\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.593951 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.593964 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b29d3be9-3635-458d-af16-5c5cc23b0cc1-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.593972 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b29d3be9-3635-458d-af16-5c5cc23b0cc1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.593980 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b29d3be9-3635-458d-af16-5c5cc23b0cc1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.868438 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerID="6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340" exitCode=0 Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.868473 4971 generic.go:334] "Generic (PLEG): container finished" podID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerID="03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71" exitCode=143 Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.868489 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b29d3be9-3635-458d-af16-5c5cc23b0cc1","Type":"ContainerDied","Data":"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340"} Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.868510 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.868545 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b29d3be9-3635-458d-af16-5c5cc23b0cc1","Type":"ContainerDied","Data":"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71"} Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.868564 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b29d3be9-3635-458d-af16-5c5cc23b0cc1","Type":"ContainerDied","Data":"61648a38d0df52ca47d49b6dd719f982de0b541df604e1805958cc7ab31a3edc"} Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.868585 4971 scope.go:117] "RemoveContainer" containerID="6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.919951 4971 scope.go:117] "RemoveContainer" containerID="03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.921385 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.932935 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.957105 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:09 crc kubenswrapper[4971]: E0320 08:49:09.957568 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerName="glance-httpd" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.957589 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerName="glance-httpd" Mar 20 08:49:09 crc kubenswrapper[4971]: E0320 08:49:09.957627 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerName="glance-log" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.957638 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerName="glance-log" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.957859 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerName="glance-log" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.957890 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" containerName="glance-httpd" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.959097 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.962991 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.970050 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.974726 4971 scope.go:117] "RemoveContainer" containerID="6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340" Mar 20 08:49:09 crc kubenswrapper[4971]: E0320 08:49:09.978200 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340\": container with ID starting with 6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340 not found: ID does not exist" containerID="6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.978242 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340"} err="failed to get container status \"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340\": rpc error: code = NotFound desc = could not find container \"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340\": container with ID starting with 6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340 not found: ID does not exist" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.978288 4971 scope.go:117] "RemoveContainer" containerID="03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71" Mar 20 08:49:09 crc kubenswrapper[4971]: E0320 08:49:09.980904 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71\": container with ID starting with 03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71 not found: ID does not exist" containerID="03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.980963 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71"} err="failed to get container status \"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71\": rpc error: code = NotFound desc = could not find container \"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71\": container with ID starting with 03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71 not found: ID does not exist" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.980994 4971 scope.go:117] "RemoveContainer" containerID="6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.981488 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340"} err="failed to get container status \"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340\": rpc error: code = NotFound desc = could not find container \"6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340\": container with ID starting with 6f6eb0ce29bd957e46a75085aa58cab35b4423c4ffc78d26f0b683f227969340 not found: ID does not exist" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.981508 4971 scope.go:117] "RemoveContainer" containerID="03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71" Mar 20 08:49:09 crc kubenswrapper[4971]: I0320 08:49:09.981953 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71"} err="failed to get container status \"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71\": rpc error: code = NotFound desc = could not find container \"03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71\": container with ID starting with 03d8ce807017637a7a27004860493cd876fbc0d26b6e7700e738e08f89f24a71 not found: ID does not exist" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.029178 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:10 crc kubenswrapper[4971]: W0320 08:49:10.034028 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24488a4f_a46e_4ca8_86d4_5b23dd0baea2.slice/crio-01a9b85a328d239b715c61567cb8970555e667d48e1c35a1b0c270ceabfd099d WatchSource:0}: Error finding container 01a9b85a328d239b715c61567cb8970555e667d48e1c35a1b0c270ceabfd099d: Status 404 returned error can't find the container with id 01a9b85a328d239b715c61567cb8970555e667d48e1c35a1b0c270ceabfd099d Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.102771 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.102858 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-ceph\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc 
kubenswrapper[4971]: I0320 08:49:10.102892 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.102919 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhhf\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-kube-api-access-5xhhf\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.102960 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-logs\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.102982 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.103031 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.205206 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.205472 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.205530 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-ceph\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.205559 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.205640 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xhhf\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-kube-api-access-5xhhf\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 
crc kubenswrapper[4971]: I0320 08:49:10.205691 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.205682 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-logs\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.205791 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.209230 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-logs\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.210164 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-ceph\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.210237 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.211539 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.212543 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.231323 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xhhf\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-kube-api-access-5xhhf\") pod \"glance-default-internal-api-0\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.323925 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.747328 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6cb98d-a89c-4b15-b745-4935f0f9c6e2" path="/var/lib/kubelet/pods/6e6cb98d-a89c-4b15-b745-4935f0f9c6e2/volumes" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.748681 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29d3be9-3635-458d-af16-5c5cc23b0cc1" path="/var/lib/kubelet/pods/b29d3be9-3635-458d-af16-5c5cc23b0cc1/volumes" Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.871143 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:10 crc kubenswrapper[4971]: W0320 08:49:10.887286 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf4aae77_3962_438d_a455_e6f017a477c5.slice/crio-ed8f8922904979e8de2fc583d412f2259155802961a128c952a0b46917d063dc WatchSource:0}: Error finding container ed8f8922904979e8de2fc583d412f2259155802961a128c952a0b46917d063dc: Status 404 returned error can't find the container with id ed8f8922904979e8de2fc583d412f2259155802961a128c952a0b46917d063dc Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.905029 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24488a4f-a46e-4ca8-86d4-5b23dd0baea2","Type":"ContainerStarted","Data":"13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b"} Mar 20 08:49:10 crc kubenswrapper[4971]: I0320 08:49:10.905096 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24488a4f-a46e-4ca8-86d4-5b23dd0baea2","Type":"ContainerStarted","Data":"01a9b85a328d239b715c61567cb8970555e667d48e1c35a1b0c270ceabfd099d"} Mar 20 08:49:11 crc kubenswrapper[4971]: I0320 08:49:11.918742 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df4aae77-3962-438d-a455-e6f017a477c5","Type":"ContainerStarted","Data":"4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1"} Mar 20 08:49:11 crc kubenswrapper[4971]: I0320 08:49:11.919619 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df4aae77-3962-438d-a455-e6f017a477c5","Type":"ContainerStarted","Data":"a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185"} Mar 20 08:49:11 crc kubenswrapper[4971]: I0320 08:49:11.919633 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df4aae77-3962-438d-a455-e6f017a477c5","Type":"ContainerStarted","Data":"ed8f8922904979e8de2fc583d412f2259155802961a128c952a0b46917d063dc"} Mar 20 08:49:11 crc kubenswrapper[4971]: I0320 08:49:11.920935 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24488a4f-a46e-4ca8-86d4-5b23dd0baea2","Type":"ContainerStarted","Data":"4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709"} Mar 20 08:49:11 crc kubenswrapper[4971]: I0320 08:49:11.949787 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.949763383 podStartE2EDuration="2.949763383s" podCreationTimestamp="2026-03-20 08:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:49:11.947759511 +0000 UTC m=+7173.927633659" watchObservedRunningTime="2026-03-20 08:49:11.949763383 +0000 UTC m=+7173.929637521" Mar 20 08:49:11 crc kubenswrapper[4971]: I0320 08:49:11.977644 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.9776250920000003 
podStartE2EDuration="3.977625092s" podCreationTimestamp="2026-03-20 08:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:49:11.969703294 +0000 UTC m=+7173.949577452" watchObservedRunningTime="2026-03-20 08:49:11.977625092 +0000 UTC m=+7173.957499240" Mar 20 08:49:14 crc kubenswrapper[4971]: I0320 08:49:14.817999 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:14 crc kubenswrapper[4971]: I0320 08:49:14.903185 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76595c7fd7-gsnb6"] Mar 20 08:49:14 crc kubenswrapper[4971]: I0320 08:49:14.903494 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" podUID="cc746079-6be6-41df-bcdc-3408d71f511d" containerName="dnsmasq-dns" containerID="cri-o://b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959" gracePeriod=10 Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.376984 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.501505 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-dns-svc\") pod \"cc746079-6be6-41df-bcdc-3408d71f511d\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.501576 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-nb\") pod \"cc746079-6be6-41df-bcdc-3408d71f511d\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.501647 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-sb\") pod \"cc746079-6be6-41df-bcdc-3408d71f511d\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.501686 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-config\") pod \"cc746079-6be6-41df-bcdc-3408d71f511d\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.501807 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b9kl\" (UniqueName: \"kubernetes.io/projected/cc746079-6be6-41df-bcdc-3408d71f511d-kube-api-access-4b9kl\") pod \"cc746079-6be6-41df-bcdc-3408d71f511d\" (UID: \"cc746079-6be6-41df-bcdc-3408d71f511d\") " Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.507873 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cc746079-6be6-41df-bcdc-3408d71f511d-kube-api-access-4b9kl" (OuterVolumeSpecName: "kube-api-access-4b9kl") pod "cc746079-6be6-41df-bcdc-3408d71f511d" (UID: "cc746079-6be6-41df-bcdc-3408d71f511d"). InnerVolumeSpecName "kube-api-access-4b9kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.541029 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc746079-6be6-41df-bcdc-3408d71f511d" (UID: "cc746079-6be6-41df-bcdc-3408d71f511d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.544081 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc746079-6be6-41df-bcdc-3408d71f511d" (UID: "cc746079-6be6-41df-bcdc-3408d71f511d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.546756 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc746079-6be6-41df-bcdc-3408d71f511d" (UID: "cc746079-6be6-41df-bcdc-3408d71f511d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.552174 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-config" (OuterVolumeSpecName: "config") pod "cc746079-6be6-41df-bcdc-3408d71f511d" (UID: "cc746079-6be6-41df-bcdc-3408d71f511d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.603967 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.604298 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.604310 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.604323 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc746079-6be6-41df-bcdc-3408d71f511d-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.604332 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b9kl\" (UniqueName: \"kubernetes.io/projected/cc746079-6be6-41df-bcdc-3408d71f511d-kube-api-access-4b9kl\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.981502 4971 generic.go:334] "Generic (PLEG): container finished" podID="cc746079-6be6-41df-bcdc-3408d71f511d" containerID="b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959" exitCode=0 Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.981543 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" event={"ID":"cc746079-6be6-41df-bcdc-3408d71f511d","Type":"ContainerDied","Data":"b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959"} Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 
08:49:15.981568 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" event={"ID":"cc746079-6be6-41df-bcdc-3408d71f511d","Type":"ContainerDied","Data":"2f02fe998d4b80470b14d418d9175c31cf202bf94c043c0830c10abf692947e0"} Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.981590 4971 scope.go:117] "RemoveContainer" containerID="b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959" Mar 20 08:49:15 crc kubenswrapper[4971]: I0320 08:49:15.981775 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76595c7fd7-gsnb6" Mar 20 08:49:16 crc kubenswrapper[4971]: I0320 08:49:16.020399 4971 scope.go:117] "RemoveContainer" containerID="e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70" Mar 20 08:49:16 crc kubenswrapper[4971]: I0320 08:49:16.025792 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76595c7fd7-gsnb6"] Mar 20 08:49:16 crc kubenswrapper[4971]: I0320 08:49:16.038313 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76595c7fd7-gsnb6"] Mar 20 08:49:16 crc kubenswrapper[4971]: I0320 08:49:16.074359 4971 scope.go:117] "RemoveContainer" containerID="b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959" Mar 20 08:49:16 crc kubenswrapper[4971]: E0320 08:49:16.074906 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959\": container with ID starting with b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959 not found: ID does not exist" containerID="b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959" Mar 20 08:49:16 crc kubenswrapper[4971]: I0320 08:49:16.074974 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959"} err="failed to get container status \"b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959\": rpc error: code = NotFound desc = could not find container \"b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959\": container with ID starting with b22100b61c3682887038ce910cdd89012803b22481e95f6c56fc1ae8b831d959 not found: ID does not exist" Mar 20 08:49:16 crc kubenswrapper[4971]: I0320 08:49:16.075018 4971 scope.go:117] "RemoveContainer" containerID="e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70" Mar 20 08:49:16 crc kubenswrapper[4971]: E0320 08:49:16.076052 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70\": container with ID starting with e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70 not found: ID does not exist" containerID="e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70" Mar 20 08:49:16 crc kubenswrapper[4971]: I0320 08:49:16.076104 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70"} err="failed to get container status \"e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70\": rpc error: code = NotFound desc = could not find container \"e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70\": container with ID starting with e491bf5ed45929343bcaf6eee297a8a2b0d2b18310ba7170ecf317c38a711c70 not found: ID does not exist" Mar 20 08:49:16 crc kubenswrapper[4971]: I0320 08:49:16.750555 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc746079-6be6-41df-bcdc-3408d71f511d" path="/var/lib/kubelet/pods/cc746079-6be6-41df-bcdc-3408d71f511d/volumes" Mar 20 08:49:18 crc kubenswrapper[4971]: I0320 
08:49:18.746497 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:49:18 crc kubenswrapper[4971]: E0320 08:49:18.747326 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:49:19 crc kubenswrapper[4971]: I0320 08:49:19.443328 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 08:49:19 crc kubenswrapper[4971]: I0320 08:49:19.443382 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 08:49:19 crc kubenswrapper[4971]: I0320 08:49:19.490579 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 08:49:19 crc kubenswrapper[4971]: I0320 08:49:19.493054 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 08:49:20 crc kubenswrapper[4971]: I0320 08:49:20.028907 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 08:49:20 crc kubenswrapper[4971]: I0320 08:49:20.028966 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 08:49:20 crc kubenswrapper[4971]: I0320 08:49:20.325449 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:20 crc kubenswrapper[4971]: I0320 08:49:20.325540 4971 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:20 crc kubenswrapper[4971]: I0320 08:49:20.363897 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:20 crc kubenswrapper[4971]: I0320 08:49:20.400738 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:21 crc kubenswrapper[4971]: I0320 08:49:21.039256 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:21 crc kubenswrapper[4971]: I0320 08:49:21.039660 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:21 crc kubenswrapper[4971]: I0320 08:49:21.903646 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 08:49:21 crc kubenswrapper[4971]: I0320 08:49:21.924178 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 08:49:23 crc kubenswrapper[4971]: I0320 08:49:23.114050 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:23 crc kubenswrapper[4971]: I0320 08:49:23.114442 4971 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:49:23 crc kubenswrapper[4971]: I0320 08:49:23.128506 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.284996 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2hbqb"] Mar 20 08:49:30 crc kubenswrapper[4971]: E0320 08:49:30.285822 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cc746079-6be6-41df-bcdc-3408d71f511d" containerName="init" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.285834 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc746079-6be6-41df-bcdc-3408d71f511d" containerName="init" Mar 20 08:49:30 crc kubenswrapper[4971]: E0320 08:49:30.285856 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc746079-6be6-41df-bcdc-3408d71f511d" containerName="dnsmasq-dns" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.285862 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc746079-6be6-41df-bcdc-3408d71f511d" containerName="dnsmasq-dns" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.286035 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc746079-6be6-41df-bcdc-3408d71f511d" containerName="dnsmasq-dns" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.286653 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.296176 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2hbqb"] Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.353490 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b860-account-create-update-khxtz"] Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.355556 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.358969 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.366215 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b860-account-create-update-khxtz"] Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.367548 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnxjn\" (UniqueName: \"kubernetes.io/projected/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-kube-api-access-vnxjn\") pod \"placement-db-create-2hbqb\" (UID: \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\") " pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.367837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-operator-scripts\") pod \"placement-db-create-2hbqb\" (UID: \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\") " pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.469446 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnxjn\" (UniqueName: \"kubernetes.io/projected/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-kube-api-access-vnxjn\") pod \"placement-db-create-2hbqb\" (UID: \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\") " pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.469567 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qf2\" (UniqueName: \"kubernetes.io/projected/6599a56b-bc8f-42e6-8f6e-359727614c6e-kube-api-access-w5qf2\") pod \"placement-b860-account-create-update-khxtz\" (UID: 
\"6599a56b-bc8f-42e6-8f6e-359727614c6e\") " pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.469663 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6599a56b-bc8f-42e6-8f6e-359727614c6e-operator-scripts\") pod \"placement-b860-account-create-update-khxtz\" (UID: \"6599a56b-bc8f-42e6-8f6e-359727614c6e\") " pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.469704 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-operator-scripts\") pod \"placement-db-create-2hbqb\" (UID: \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\") " pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.470675 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-operator-scripts\") pod \"placement-db-create-2hbqb\" (UID: \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\") " pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.487482 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnxjn\" (UniqueName: \"kubernetes.io/projected/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-kube-api-access-vnxjn\") pod \"placement-db-create-2hbqb\" (UID: \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\") " pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.571770 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5qf2\" (UniqueName: \"kubernetes.io/projected/6599a56b-bc8f-42e6-8f6e-359727614c6e-kube-api-access-w5qf2\") pod 
\"placement-b860-account-create-update-khxtz\" (UID: \"6599a56b-bc8f-42e6-8f6e-359727614c6e\") " pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.571820 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6599a56b-bc8f-42e6-8f6e-359727614c6e-operator-scripts\") pod \"placement-b860-account-create-update-khxtz\" (UID: \"6599a56b-bc8f-42e6-8f6e-359727614c6e\") " pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.572544 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6599a56b-bc8f-42e6-8f6e-359727614c6e-operator-scripts\") pod \"placement-b860-account-create-update-khxtz\" (UID: \"6599a56b-bc8f-42e6-8f6e-359727614c6e\") " pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.588297 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5qf2\" (UniqueName: \"kubernetes.io/projected/6599a56b-bc8f-42e6-8f6e-359727614c6e-kube-api-access-w5qf2\") pod \"placement-b860-account-create-update-khxtz\" (UID: \"6599a56b-bc8f-42e6-8f6e-359727614c6e\") " pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.612374 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:30 crc kubenswrapper[4971]: I0320 08:49:30.678468 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:31 crc kubenswrapper[4971]: I0320 08:49:31.043239 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2hbqb"] Mar 20 08:49:31 crc kubenswrapper[4971]: W0320 08:49:31.048699 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda650993b_e4f4_4d7c_9d90_e9ce7e0ddd12.slice/crio-076d3a5884866ab847a0d2c85cdb7424186dd562978eeddd8b1c6f96a62769d3 WatchSource:0}: Error finding container 076d3a5884866ab847a0d2c85cdb7424186dd562978eeddd8b1c6f96a62769d3: Status 404 returned error can't find the container with id 076d3a5884866ab847a0d2c85cdb7424186dd562978eeddd8b1c6f96a62769d3 Mar 20 08:49:31 crc kubenswrapper[4971]: I0320 08:49:31.137043 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2hbqb" event={"ID":"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12","Type":"ContainerStarted","Data":"076d3a5884866ab847a0d2c85cdb7424186dd562978eeddd8b1c6f96a62769d3"} Mar 20 08:49:31 crc kubenswrapper[4971]: I0320 08:49:31.184944 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b860-account-create-update-khxtz"] Mar 20 08:49:31 crc kubenswrapper[4971]: W0320 08:49:31.188670 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6599a56b_bc8f_42e6_8f6e_359727614c6e.slice/crio-3c3ad2d56fcb53341642a404c36fcd3bd112fa9436bddbd014b6904ba5cb0777 WatchSource:0}: Error finding container 3c3ad2d56fcb53341642a404c36fcd3bd112fa9436bddbd014b6904ba5cb0777: Status 404 returned error can't find the container with id 3c3ad2d56fcb53341642a404c36fcd3bd112fa9436bddbd014b6904ba5cb0777 Mar 20 08:49:32 crc kubenswrapper[4971]: I0320 08:49:32.162545 4971 generic.go:334] "Generic (PLEG): container finished" podID="6599a56b-bc8f-42e6-8f6e-359727614c6e" 
containerID="db0f8efa8537bff11b3dc5fc1c79852b330e9ac024ed998065ee5da9ff50df4e" exitCode=0 Mar 20 08:49:32 crc kubenswrapper[4971]: I0320 08:49:32.163282 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b860-account-create-update-khxtz" event={"ID":"6599a56b-bc8f-42e6-8f6e-359727614c6e","Type":"ContainerDied","Data":"db0f8efa8537bff11b3dc5fc1c79852b330e9ac024ed998065ee5da9ff50df4e"} Mar 20 08:49:32 crc kubenswrapper[4971]: I0320 08:49:32.163355 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b860-account-create-update-khxtz" event={"ID":"6599a56b-bc8f-42e6-8f6e-359727614c6e","Type":"ContainerStarted","Data":"3c3ad2d56fcb53341642a404c36fcd3bd112fa9436bddbd014b6904ba5cb0777"} Mar 20 08:49:32 crc kubenswrapper[4971]: I0320 08:49:32.166760 4971 generic.go:334] "Generic (PLEG): container finished" podID="a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12" containerID="eb902b3800d51a0b8ec0ea5e03cfc27c7506f87e4f321c55d1eeb903bda03455" exitCode=0 Mar 20 08:49:32 crc kubenswrapper[4971]: I0320 08:49:32.167008 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2hbqb" event={"ID":"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12","Type":"ContainerDied","Data":"eb902b3800d51a0b8ec0ea5e03cfc27c7506f87e4f321c55d1eeb903bda03455"} Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.575045 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.583957 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.626471 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5qf2\" (UniqueName: \"kubernetes.io/projected/6599a56b-bc8f-42e6-8f6e-359727614c6e-kube-api-access-w5qf2\") pod \"6599a56b-bc8f-42e6-8f6e-359727614c6e\" (UID: \"6599a56b-bc8f-42e6-8f6e-359727614c6e\") " Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.626540 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-operator-scripts\") pod \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\" (UID: \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\") " Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.626665 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6599a56b-bc8f-42e6-8f6e-359727614c6e-operator-scripts\") pod \"6599a56b-bc8f-42e6-8f6e-359727614c6e\" (UID: \"6599a56b-bc8f-42e6-8f6e-359727614c6e\") " Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.626712 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnxjn\" (UniqueName: \"kubernetes.io/projected/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-kube-api-access-vnxjn\") pod \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\" (UID: \"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12\") " Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.627462 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12" (UID: "a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.628493 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6599a56b-bc8f-42e6-8f6e-359727614c6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6599a56b-bc8f-42e6-8f6e-359727614c6e" (UID: "6599a56b-bc8f-42e6-8f6e-359727614c6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.634727 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-kube-api-access-vnxjn" (OuterVolumeSpecName: "kube-api-access-vnxjn") pod "a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12" (UID: "a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12"). InnerVolumeSpecName "kube-api-access-vnxjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.635370 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6599a56b-bc8f-42e6-8f6e-359727614c6e-kube-api-access-w5qf2" (OuterVolumeSpecName: "kube-api-access-w5qf2") pod "6599a56b-bc8f-42e6-8f6e-359727614c6e" (UID: "6599a56b-bc8f-42e6-8f6e-359727614c6e"). InnerVolumeSpecName "kube-api-access-w5qf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.728441 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5qf2\" (UniqueName: \"kubernetes.io/projected/6599a56b-bc8f-42e6-8f6e-359727614c6e-kube-api-access-w5qf2\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.728474 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.728483 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6599a56b-bc8f-42e6-8f6e-359727614c6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.728491 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnxjn\" (UniqueName: \"kubernetes.io/projected/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12-kube-api-access-vnxjn\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:33 crc kubenswrapper[4971]: I0320 08:49:33.732562 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:49:33 crc kubenswrapper[4971]: E0320 08:49:33.732820 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:49:34 crc kubenswrapper[4971]: I0320 08:49:34.188619 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-b860-account-create-update-khxtz" event={"ID":"6599a56b-bc8f-42e6-8f6e-359727614c6e","Type":"ContainerDied","Data":"3c3ad2d56fcb53341642a404c36fcd3bd112fa9436bddbd014b6904ba5cb0777"} Mar 20 08:49:34 crc kubenswrapper[4971]: I0320 08:49:34.188919 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c3ad2d56fcb53341642a404c36fcd3bd112fa9436bddbd014b6904ba5cb0777" Mar 20 08:49:34 crc kubenswrapper[4971]: I0320 08:49:34.188736 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b860-account-create-update-khxtz" Mar 20 08:49:34 crc kubenswrapper[4971]: I0320 08:49:34.191498 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2hbqb" event={"ID":"a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12","Type":"ContainerDied","Data":"076d3a5884866ab847a0d2c85cdb7424186dd562978eeddd8b1c6f96a62769d3"} Mar 20 08:49:34 crc kubenswrapper[4971]: I0320 08:49:34.191563 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2hbqb" Mar 20 08:49:34 crc kubenswrapper[4971]: I0320 08:49:34.191567 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="076d3a5884866ab847a0d2c85cdb7424186dd562978eeddd8b1c6f96a62769d3" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.719213 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6649d5b6d7-wcsd7"] Mar 20 08:49:35 crc kubenswrapper[4971]: E0320 08:49:35.719986 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6599a56b-bc8f-42e6-8f6e-359727614c6e" containerName="mariadb-account-create-update" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.720003 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6599a56b-bc8f-42e6-8f6e-359727614c6e" containerName="mariadb-account-create-update" Mar 20 08:49:35 crc kubenswrapper[4971]: E0320 08:49:35.720025 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12" containerName="mariadb-database-create" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.720032 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12" containerName="mariadb-database-create" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.720221 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6599a56b-bc8f-42e6-8f6e-359727614c6e" containerName="mariadb-account-create-update" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.720255 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12" containerName="mariadb-database-create" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.721274 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.758804 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6649d5b6d7-wcsd7"] Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.796723 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-df4kp"] Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.797878 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.801099 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wf68c" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.801690 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.801860 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.804228 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-df4kp"] Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.872838 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-nb\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.872984 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-config\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " 
pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.873025 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-dns-svc\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.873077 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-sb\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.873107 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn9bn\" (UniqueName: \"kubernetes.io/projected/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-kube-api-access-bn9bn\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.974775 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-combined-ca-bundle\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.974842 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-config-data\") pod \"placement-db-sync-df4kp\" (UID: 
\"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.974885 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-config\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.974904 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-scripts\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.974932 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-dns-svc\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.975098 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-sb\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.975167 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f73e65-8c8e-4358-a2fc-7ad8e1583148-logs\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:35 crc 
kubenswrapper[4971]: I0320 08:49:35.975221 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn9bn\" (UniqueName: \"kubernetes.io/projected/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-kube-api-access-bn9bn\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.975400 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-nb\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.975507 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgnbg\" (UniqueName: \"kubernetes.io/projected/73f73e65-8c8e-4358-a2fc-7ad8e1583148-kube-api-access-pgnbg\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.975828 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-config\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.975856 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-dns-svc\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.976084 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-sb\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.976278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-nb\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:35 crc kubenswrapper[4971]: I0320 08:49:35.996658 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn9bn\" (UniqueName: \"kubernetes.io/projected/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-kube-api-access-bn9bn\") pod \"dnsmasq-dns-6649d5b6d7-wcsd7\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.042791 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.077696 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-scripts\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.077802 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f73e65-8c8e-4358-a2fc-7ad8e1583148-logs\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.077891 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgnbg\" (UniqueName: \"kubernetes.io/projected/73f73e65-8c8e-4358-a2fc-7ad8e1583148-kube-api-access-pgnbg\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.077924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-combined-ca-bundle\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.077957 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-config-data\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 
08:49:36.078727 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f73e65-8c8e-4358-a2fc-7ad8e1583148-logs\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.088050 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-scripts\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.088442 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-combined-ca-bundle\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.088620 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-config-data\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.094954 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgnbg\" (UniqueName: \"kubernetes.io/projected/73f73e65-8c8e-4358-a2fc-7ad8e1583148-kube-api-access-pgnbg\") pod \"placement-db-sync-df4kp\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.115590 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.494017 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6649d5b6d7-wcsd7"] Mar 20 08:49:36 crc kubenswrapper[4971]: W0320 08:49:36.580124 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73f73e65_8c8e_4358_a2fc_7ad8e1583148.slice/crio-29266fab1a15eabadc7b8b61ebc601081af3fd38320b88410aabdafa7ae3cab3 WatchSource:0}: Error finding container 29266fab1a15eabadc7b8b61ebc601081af3fd38320b88410aabdafa7ae3cab3: Status 404 returned error can't find the container with id 29266fab1a15eabadc7b8b61ebc601081af3fd38320b88410aabdafa7ae3cab3 Mar 20 08:49:36 crc kubenswrapper[4971]: I0320 08:49:36.580418 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-df4kp"] Mar 20 08:49:37 crc kubenswrapper[4971]: I0320 08:49:37.229403 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-df4kp" event={"ID":"73f73e65-8c8e-4358-a2fc-7ad8e1583148","Type":"ContainerStarted","Data":"29266fab1a15eabadc7b8b61ebc601081af3fd38320b88410aabdafa7ae3cab3"} Mar 20 08:49:37 crc kubenswrapper[4971]: I0320 08:49:37.231640 4971 generic.go:334] "Generic (PLEG): container finished" podID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" containerID="276f47e60f666794f2d37d4f83e2e8566c68481aa03c94c594ab69dbe7bfa9d2" exitCode=0 Mar 20 08:49:37 crc kubenswrapper[4971]: I0320 08:49:37.231673 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" event={"ID":"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9","Type":"ContainerDied","Data":"276f47e60f666794f2d37d4f83e2e8566c68481aa03c94c594ab69dbe7bfa9d2"} Mar 20 08:49:37 crc kubenswrapper[4971]: I0320 08:49:37.231693 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" 
event={"ID":"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9","Type":"ContainerStarted","Data":"e24cd1ea7a7baeb09554130fcd6dfe044aad870a2f4186bdbcea5ccf7addda21"} Mar 20 08:49:38 crc kubenswrapper[4971]: I0320 08:49:38.246267 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" event={"ID":"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9","Type":"ContainerStarted","Data":"c09b73a25970cbbb8d7b4a4f2b3fb52cd2863104d696e951d58c0e5331ec2519"} Mar 20 08:49:38 crc kubenswrapper[4971]: I0320 08:49:38.246588 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:38 crc kubenswrapper[4971]: I0320 08:49:38.275988 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" podStartSLOduration=3.275959866 podStartE2EDuration="3.275959866s" podCreationTimestamp="2026-03-20 08:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:49:38.266311663 +0000 UTC m=+7200.246185901" watchObservedRunningTime="2026-03-20 08:49:38.275959866 +0000 UTC m=+7200.255834044" Mar 20 08:49:40 crc kubenswrapper[4971]: I0320 08:49:40.271008 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-df4kp" event={"ID":"73f73e65-8c8e-4358-a2fc-7ad8e1583148","Type":"ContainerStarted","Data":"6cc7d057c9f72b6b7f92f41e98c24f0875b9cfee36807c506d1fe3aa1ccac4f0"} Mar 20 08:49:40 crc kubenswrapper[4971]: I0320 08:49:40.289978 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-df4kp" podStartSLOduration=2.019103161 podStartE2EDuration="5.289963938s" podCreationTimestamp="2026-03-20 08:49:35 +0000 UTC" firstStartedPulling="2026-03-20 08:49:36.58232621 +0000 UTC m=+7198.562200338" lastFinishedPulling="2026-03-20 08:49:39.853186977 +0000 UTC m=+7201.833061115" 
observedRunningTime="2026-03-20 08:49:40.285230034 +0000 UTC m=+7202.265104172" watchObservedRunningTime="2026-03-20 08:49:40.289963938 +0000 UTC m=+7202.269838066" Mar 20 08:49:42 crc kubenswrapper[4971]: I0320 08:49:42.298235 4971 generic.go:334] "Generic (PLEG): container finished" podID="73f73e65-8c8e-4358-a2fc-7ad8e1583148" containerID="6cc7d057c9f72b6b7f92f41e98c24f0875b9cfee36807c506d1fe3aa1ccac4f0" exitCode=0 Mar 20 08:49:42 crc kubenswrapper[4971]: I0320 08:49:42.298354 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-df4kp" event={"ID":"73f73e65-8c8e-4358-a2fc-7ad8e1583148","Type":"ContainerDied","Data":"6cc7d057c9f72b6b7f92f41e98c24f0875b9cfee36807c506d1fe3aa1ccac4f0"} Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.728317 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.913290 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgnbg\" (UniqueName: \"kubernetes.io/projected/73f73e65-8c8e-4358-a2fc-7ad8e1583148-kube-api-access-pgnbg\") pod \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.913374 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-scripts\") pod \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.913493 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-combined-ca-bundle\") pod \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " Mar 20 08:49:43 crc 
kubenswrapper[4971]: I0320 08:49:43.913545 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f73e65-8c8e-4358-a2fc-7ad8e1583148-logs\") pod \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.913744 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-config-data\") pod \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\" (UID: \"73f73e65-8c8e-4358-a2fc-7ad8e1583148\") " Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.914204 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73f73e65-8c8e-4358-a2fc-7ad8e1583148-logs" (OuterVolumeSpecName: "logs") pod "73f73e65-8c8e-4358-a2fc-7ad8e1583148" (UID: "73f73e65-8c8e-4358-a2fc-7ad8e1583148"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.915719 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73f73e65-8c8e-4358-a2fc-7ad8e1583148-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.919282 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f73e65-8c8e-4358-a2fc-7ad8e1583148-kube-api-access-pgnbg" (OuterVolumeSpecName: "kube-api-access-pgnbg") pod "73f73e65-8c8e-4358-a2fc-7ad8e1583148" (UID: "73f73e65-8c8e-4358-a2fc-7ad8e1583148"). InnerVolumeSpecName "kube-api-access-pgnbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.920362 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-scripts" (OuterVolumeSpecName: "scripts") pod "73f73e65-8c8e-4358-a2fc-7ad8e1583148" (UID: "73f73e65-8c8e-4358-a2fc-7ad8e1583148"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.952345 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-config-data" (OuterVolumeSpecName: "config-data") pod "73f73e65-8c8e-4358-a2fc-7ad8e1583148" (UID: "73f73e65-8c8e-4358-a2fc-7ad8e1583148"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:43 crc kubenswrapper[4971]: I0320 08:49:43.954027 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73f73e65-8c8e-4358-a2fc-7ad8e1583148" (UID: "73f73e65-8c8e-4358-a2fc-7ad8e1583148"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.017454 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgnbg\" (UniqueName: \"kubernetes.io/projected/73f73e65-8c8e-4358-a2fc-7ad8e1583148-kube-api-access-pgnbg\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.017488 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.017500 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.017508 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f73e65-8c8e-4358-a2fc-7ad8e1583148-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.327491 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-df4kp" event={"ID":"73f73e65-8c8e-4358-a2fc-7ad8e1583148","Type":"ContainerDied","Data":"29266fab1a15eabadc7b8b61ebc601081af3fd38320b88410aabdafa7ae3cab3"} Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.327915 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29266fab1a15eabadc7b8b61ebc601081af3fd38320b88410aabdafa7ae3cab3" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.327760 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-df4kp" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.436379 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5c9b55c9fb-k6gqg"] Mar 20 08:49:44 crc kubenswrapper[4971]: E0320 08:49:44.436910 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f73e65-8c8e-4358-a2fc-7ad8e1583148" containerName="placement-db-sync" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.436942 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f73e65-8c8e-4358-a2fc-7ad8e1583148" containerName="placement-db-sync" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.437248 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f73e65-8c8e-4358-a2fc-7ad8e1583148" containerName="placement-db-sync" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.438415 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.444257 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wf68c" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.444968 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.446342 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.456468 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c9b55c9fb-k6gqg"] Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.528452 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-config-data\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: 
\"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.528499 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wpd9\" (UniqueName: \"kubernetes.io/projected/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-kube-api-access-9wpd9\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.528753 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-combined-ca-bundle\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.528916 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-scripts\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.529041 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-logs\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.630539 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-config-data\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: 
\"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.630597 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wpd9\" (UniqueName: \"kubernetes.io/projected/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-kube-api-access-9wpd9\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.630757 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-combined-ca-bundle\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.630790 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-scripts\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.630832 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-logs\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.631271 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-logs\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc 
kubenswrapper[4971]: I0320 08:49:44.635181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-scripts\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.636521 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-combined-ca-bundle\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.637862 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-config-data\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.648122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wpd9\" (UniqueName: \"kubernetes.io/projected/21e946e7-e1e2-469c-b61a-b6bf7c3e9543-kube-api-access-9wpd9\") pod \"placement-5c9b55c9fb-k6gqg\" (UID: \"21e946e7-e1e2-469c-b61a-b6bf7c3e9543\") " pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:44 crc kubenswrapper[4971]: I0320 08:49:44.808367 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:45 crc kubenswrapper[4971]: I0320 08:49:45.255330 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c9b55c9fb-k6gqg"] Mar 20 08:49:45 crc kubenswrapper[4971]: W0320 08:49:45.266150 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e946e7_e1e2_469c_b61a_b6bf7c3e9543.slice/crio-aa0e0d7266094d9c6b9df9b93ccb11b3a45eaecaf89dc7e2532c81b3573dff2b WatchSource:0}: Error finding container aa0e0d7266094d9c6b9df9b93ccb11b3a45eaecaf89dc7e2532c81b3573dff2b: Status 404 returned error can't find the container with id aa0e0d7266094d9c6b9df9b93ccb11b3a45eaecaf89dc7e2532c81b3573dff2b Mar 20 08:49:45 crc kubenswrapper[4971]: I0320 08:49:45.337077 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c9b55c9fb-k6gqg" event={"ID":"21e946e7-e1e2-469c-b61a-b6bf7c3e9543","Type":"ContainerStarted","Data":"aa0e0d7266094d9c6b9df9b93ccb11b3a45eaecaf89dc7e2532c81b3573dff2b"} Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.044761 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.134517 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589f6f54ff-4rcvv"] Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.134943 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" podUID="5e6bad53-1552-4cc5-aafd-2981191d8afc" containerName="dnsmasq-dns" containerID="cri-o://cbebd05d72b78a97fd116f1947143c21cf63be5b9db6d08d0a1e837efe48152c" gracePeriod=10 Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.348031 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c9b55c9fb-k6gqg" 
event={"ID":"21e946e7-e1e2-469c-b61a-b6bf7c3e9543","Type":"ContainerStarted","Data":"ed9534fc3ab272b2662296e54f9f77b236e75a9f705fbf9bcb2c6c1f7e8bcbf0"} Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.348097 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.348119 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.348131 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c9b55c9fb-k6gqg" event={"ID":"21e946e7-e1e2-469c-b61a-b6bf7c3e9543","Type":"ContainerStarted","Data":"f4133d89c48fe8832f001885c0dcb2baaa5ad38435c0b7c2e7fbbed22eb307a3"} Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.350644 4971 generic.go:334] "Generic (PLEG): container finished" podID="5e6bad53-1552-4cc5-aafd-2981191d8afc" containerID="cbebd05d72b78a97fd116f1947143c21cf63be5b9db6d08d0a1e837efe48152c" exitCode=0 Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.350688 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" event={"ID":"5e6bad53-1552-4cc5-aafd-2981191d8afc","Type":"ContainerDied","Data":"cbebd05d72b78a97fd116f1947143c21cf63be5b9db6d08d0a1e837efe48152c"} Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.368878 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5c9b55c9fb-k6gqg" podStartSLOduration=2.368861769 podStartE2EDuration="2.368861769s" podCreationTimestamp="2026-03-20 08:49:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:49:46.365329997 +0000 UTC m=+7208.345204135" watchObservedRunningTime="2026-03-20 08:49:46.368861769 +0000 UTC m=+7208.348735897" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 
08:49:46.665961 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.766299 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-config\") pod \"5e6bad53-1552-4cc5-aafd-2981191d8afc\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.766475 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-nb\") pod \"5e6bad53-1552-4cc5-aafd-2981191d8afc\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.766559 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-sb\") pod \"5e6bad53-1552-4cc5-aafd-2981191d8afc\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.766605 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-dns-svc\") pod \"5e6bad53-1552-4cc5-aafd-2981191d8afc\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.766649 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksfxf\" (UniqueName: \"kubernetes.io/projected/5e6bad53-1552-4cc5-aafd-2981191d8afc-kube-api-access-ksfxf\") pod \"5e6bad53-1552-4cc5-aafd-2981191d8afc\" (UID: \"5e6bad53-1552-4cc5-aafd-2981191d8afc\") " Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.771089 4971 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6bad53-1552-4cc5-aafd-2981191d8afc-kube-api-access-ksfxf" (OuterVolumeSpecName: "kube-api-access-ksfxf") pod "5e6bad53-1552-4cc5-aafd-2981191d8afc" (UID: "5e6bad53-1552-4cc5-aafd-2981191d8afc"). InnerVolumeSpecName "kube-api-access-ksfxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.808534 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e6bad53-1552-4cc5-aafd-2981191d8afc" (UID: "5e6bad53-1552-4cc5-aafd-2981191d8afc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.809028 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-config" (OuterVolumeSpecName: "config") pod "5e6bad53-1552-4cc5-aafd-2981191d8afc" (UID: "5e6bad53-1552-4cc5-aafd-2981191d8afc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.812161 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e6bad53-1552-4cc5-aafd-2981191d8afc" (UID: "5e6bad53-1552-4cc5-aafd-2981191d8afc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.818979 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e6bad53-1552-4cc5-aafd-2981191d8afc" (UID: "5e6bad53-1552-4cc5-aafd-2981191d8afc"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.868880 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.868908 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.868919 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.868927 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e6bad53-1552-4cc5-aafd-2981191d8afc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4971]: I0320 08:49:46.868937 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksfxf\" (UniqueName: \"kubernetes.io/projected/5e6bad53-1552-4cc5-aafd-2981191d8afc-kube-api-access-ksfxf\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4971]: I0320 08:49:47.365668 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" event={"ID":"5e6bad53-1552-4cc5-aafd-2981191d8afc","Type":"ContainerDied","Data":"a6b20af701976a1d7cd6d6098208647968f6f93ad411da2dd4b574ac4ef5c6c9"} Mar 20 08:49:47 crc kubenswrapper[4971]: I0320 08:49:47.365762 4971 scope.go:117] "RemoveContainer" containerID="cbebd05d72b78a97fd116f1947143c21cf63be5b9db6d08d0a1e837efe48152c" Mar 20 08:49:47 crc kubenswrapper[4971]: I0320 08:49:47.366956 4971 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589f6f54ff-4rcvv" Mar 20 08:49:47 crc kubenswrapper[4971]: I0320 08:49:47.410505 4971 scope.go:117] "RemoveContainer" containerID="1a54a78ad16ce0e30c468f80e056d848967a3bfac29d768fe94ee196fcc73b65" Mar 20 08:49:47 crc kubenswrapper[4971]: I0320 08:49:47.440638 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589f6f54ff-4rcvv"] Mar 20 08:49:47 crc kubenswrapper[4971]: I0320 08:49:47.453943 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589f6f54ff-4rcvv"] Mar 20 08:49:48 crc kubenswrapper[4971]: I0320 08:49:48.744292 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:49:48 crc kubenswrapper[4971]: E0320 08:49:48.744829 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:49:48 crc kubenswrapper[4971]: I0320 08:49:48.746025 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6bad53-1552-4cc5-aafd-2981191d8afc" path="/var/lib/kubelet/pods/5e6bad53-1552-4cc5-aafd-2981191d8afc/volumes" Mar 20 08:49:59 crc kubenswrapper[4971]: I0320 08:49:59.733110 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:49:59 crc kubenswrapper[4971]: E0320 08:49:59.734271 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.148223 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566610-7nbjt"] Mar 20 08:50:00 crc kubenswrapper[4971]: E0320 08:50:00.149530 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6bad53-1552-4cc5-aafd-2981191d8afc" containerName="dnsmasq-dns" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.149893 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6bad53-1552-4cc5-aafd-2981191d8afc" containerName="dnsmasq-dns" Mar 20 08:50:00 crc kubenswrapper[4971]: E0320 08:50:00.150205 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6bad53-1552-4cc5-aafd-2981191d8afc" containerName="init" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.150384 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6bad53-1552-4cc5-aafd-2981191d8afc" containerName="init" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.150950 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6bad53-1552-4cc5-aafd-2981191d8afc" containerName="dnsmasq-dns" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.152276 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566610-7nbjt" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.156214 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.156498 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.157753 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.169476 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566610-7nbjt"] Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.228991 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55pcr\" (UniqueName: \"kubernetes.io/projected/e6aacce2-46ea-4aec-a7d9-d18fe3a32dda-kube-api-access-55pcr\") pod \"auto-csr-approver-29566610-7nbjt\" (UID: \"e6aacce2-46ea-4aec-a7d9-d18fe3a32dda\") " pod="openshift-infra/auto-csr-approver-29566610-7nbjt" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.331051 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55pcr\" (UniqueName: \"kubernetes.io/projected/e6aacce2-46ea-4aec-a7d9-d18fe3a32dda-kube-api-access-55pcr\") pod \"auto-csr-approver-29566610-7nbjt\" (UID: \"e6aacce2-46ea-4aec-a7d9-d18fe3a32dda\") " pod="openshift-infra/auto-csr-approver-29566610-7nbjt" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.367962 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55pcr\" (UniqueName: \"kubernetes.io/projected/e6aacce2-46ea-4aec-a7d9-d18fe3a32dda-kube-api-access-55pcr\") pod \"auto-csr-approver-29566610-7nbjt\" (UID: \"e6aacce2-46ea-4aec-a7d9-d18fe3a32dda\") " 
pod="openshift-infra/auto-csr-approver-29566610-7nbjt" Mar 20 08:50:00 crc kubenswrapper[4971]: I0320 08:50:00.488493 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566610-7nbjt" Mar 20 08:50:01 crc kubenswrapper[4971]: I0320 08:50:01.015970 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566610-7nbjt"] Mar 20 08:50:01 crc kubenswrapper[4971]: W0320 08:50:01.022558 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6aacce2_46ea_4aec_a7d9_d18fe3a32dda.slice/crio-1e8fc7e3db374e7b11f56774523083efb28b7cdd973e7f653b980f40330e1a20 WatchSource:0}: Error finding container 1e8fc7e3db374e7b11f56774523083efb28b7cdd973e7f653b980f40330e1a20: Status 404 returned error can't find the container with id 1e8fc7e3db374e7b11f56774523083efb28b7cdd973e7f653b980f40330e1a20 Mar 20 08:50:01 crc kubenswrapper[4971]: I0320 08:50:01.529584 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566610-7nbjt" event={"ID":"e6aacce2-46ea-4aec-a7d9-d18fe3a32dda","Type":"ContainerStarted","Data":"1e8fc7e3db374e7b11f56774523083efb28b7cdd973e7f653b980f40330e1a20"} Mar 20 08:50:03 crc kubenswrapper[4971]: I0320 08:50:03.552377 4971 generic.go:334] "Generic (PLEG): container finished" podID="e6aacce2-46ea-4aec-a7d9-d18fe3a32dda" containerID="d129dbf7cc444e0e8577a646a0c23af07f88b72f7ae6d465014464ea54c53024" exitCode=0 Mar 20 08:50:03 crc kubenswrapper[4971]: I0320 08:50:03.552474 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566610-7nbjt" event={"ID":"e6aacce2-46ea-4aec-a7d9-d18fe3a32dda","Type":"ContainerDied","Data":"d129dbf7cc444e0e8577a646a0c23af07f88b72f7ae6d465014464ea54c53024"} Mar 20 08:50:04 crc kubenswrapper[4971]: I0320 08:50:04.983835 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566610-7nbjt" Mar 20 08:50:05 crc kubenswrapper[4971]: I0320 08:50:05.057775 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55pcr\" (UniqueName: \"kubernetes.io/projected/e6aacce2-46ea-4aec-a7d9-d18fe3a32dda-kube-api-access-55pcr\") pod \"e6aacce2-46ea-4aec-a7d9-d18fe3a32dda\" (UID: \"e6aacce2-46ea-4aec-a7d9-d18fe3a32dda\") " Mar 20 08:50:05 crc kubenswrapper[4971]: I0320 08:50:05.070386 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6aacce2-46ea-4aec-a7d9-d18fe3a32dda-kube-api-access-55pcr" (OuterVolumeSpecName: "kube-api-access-55pcr") pod "e6aacce2-46ea-4aec-a7d9-d18fe3a32dda" (UID: "e6aacce2-46ea-4aec-a7d9-d18fe3a32dda"). InnerVolumeSpecName "kube-api-access-55pcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:05 crc kubenswrapper[4971]: I0320 08:50:05.160265 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55pcr\" (UniqueName: \"kubernetes.io/projected/e6aacce2-46ea-4aec-a7d9-d18fe3a32dda-kube-api-access-55pcr\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:05 crc kubenswrapper[4971]: I0320 08:50:05.576217 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566610-7nbjt" Mar 20 08:50:05 crc kubenswrapper[4971]: I0320 08:50:05.591718 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566610-7nbjt" event={"ID":"e6aacce2-46ea-4aec-a7d9-d18fe3a32dda","Type":"ContainerDied","Data":"1e8fc7e3db374e7b11f56774523083efb28b7cdd973e7f653b980f40330e1a20"} Mar 20 08:50:05 crc kubenswrapper[4971]: I0320 08:50:05.591793 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8fc7e3db374e7b11f56774523083efb28b7cdd973e7f653b980f40330e1a20" Mar 20 08:50:06 crc kubenswrapper[4971]: I0320 08:50:06.062849 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566604-kq4wg"] Mar 20 08:50:06 crc kubenswrapper[4971]: I0320 08:50:06.071145 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566604-kq4wg"] Mar 20 08:50:06 crc kubenswrapper[4971]: I0320 08:50:06.745594 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63a4c1d-aa44-4180-a5cc-4261d1694fc9" path="/var/lib/kubelet/pods/c63a4c1d-aa44-4180-a5cc-4261d1694fc9/volumes" Mar 20 08:50:08 crc kubenswrapper[4971]: I0320 08:50:08.856026 4971 scope.go:117] "RemoveContainer" containerID="cecc7433831be4400ecbdbc9420ea265d86f412f043c3c3876cf89f9de3d316a" Mar 20 08:50:08 crc kubenswrapper[4971]: I0320 08:50:08.907754 4971 scope.go:117] "RemoveContainer" containerID="15b57bb6774001b7f0c5f76c93be2bfa8facfe9e68486f422b42901b1ea3a0a6" Mar 20 08:50:11 crc kubenswrapper[4971]: I0320 08:50:11.732955 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:50:11 crc kubenswrapper[4971]: E0320 08:50:11.734348 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:50:15 crc kubenswrapper[4971]: I0320 08:50:15.799444 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:50:15 crc kubenswrapper[4971]: I0320 08:50:15.799990 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5c9b55c9fb-k6gqg" Mar 20 08:50:26 crc kubenswrapper[4971]: I0320 08:50:26.733534 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:50:26 crc kubenswrapper[4971]: E0320 08:50:26.734470 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:50:36 crc kubenswrapper[4971]: E0320 08:50:36.545176 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.119:44816->38.102.83.119:38499: write tcp 38.102.83.119:44816->38.102.83.119:38499: write: broken pipe Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.436143 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jbwbl"] Mar 20 08:50:39 crc kubenswrapper[4971]: E0320 08:50:39.436977 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6aacce2-46ea-4aec-a7d9-d18fe3a32dda" containerName="oc" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.436996 4971 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e6aacce2-46ea-4aec-a7d9-d18fe3a32dda" containerName="oc" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.437191 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6aacce2-46ea-4aec-a7d9-d18fe3a32dda" containerName="oc" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.437938 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.450466 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jbwbl"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.523676 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6lts7"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.524667 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.536714 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6lts7"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.544411 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90424d47-8eee-4fce-b17a-1fb1250ab67d-operator-scripts\") pod \"nova-api-db-create-jbwbl\" (UID: \"90424d47-8eee-4fce-b17a-1fb1250ab67d\") " pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.544465 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnq4n\" (UniqueName: \"kubernetes.io/projected/90424d47-8eee-4fce-b17a-1fb1250ab67d-kube-api-access-pnq4n\") pod \"nova-api-db-create-jbwbl\" (UID: \"90424d47-8eee-4fce-b17a-1fb1250ab67d\") " pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.544503 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcpc9\" (UniqueName: \"kubernetes.io/projected/75476996-b18e-4254-a5ad-5afa1e533ecf-kube-api-access-zcpc9\") pod \"nova-cell0-db-create-6lts7\" (UID: \"75476996-b18e-4254-a5ad-5afa1e533ecf\") " pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.544534 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75476996-b18e-4254-a5ad-5afa1e533ecf-operator-scripts\") pod \"nova-cell0-db-create-6lts7\" (UID: \"75476996-b18e-4254-a5ad-5afa1e533ecf\") " pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.637700 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-q448g"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.638780 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.648722 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90424d47-8eee-4fce-b17a-1fb1250ab67d-operator-scripts\") pod \"nova-api-db-create-jbwbl\" (UID: \"90424d47-8eee-4fce-b17a-1fb1250ab67d\") " pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.648781 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnq4n\" (UniqueName: \"kubernetes.io/projected/90424d47-8eee-4fce-b17a-1fb1250ab67d-kube-api-access-pnq4n\") pod \"nova-api-db-create-jbwbl\" (UID: \"90424d47-8eee-4fce-b17a-1fb1250ab67d\") " pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.648824 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjljx\" (UniqueName: \"kubernetes.io/projected/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-kube-api-access-zjljx\") pod \"nova-cell1-db-create-q448g\" (UID: \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\") " pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.648847 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcpc9\" (UniqueName: \"kubernetes.io/projected/75476996-b18e-4254-a5ad-5afa1e533ecf-kube-api-access-zcpc9\") pod \"nova-cell0-db-create-6lts7\" (UID: \"75476996-b18e-4254-a5ad-5afa1e533ecf\") " pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.648873 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75476996-b18e-4254-a5ad-5afa1e533ecf-operator-scripts\") pod \"nova-cell0-db-create-6lts7\" (UID: 
\"75476996-b18e-4254-a5ad-5afa1e533ecf\") " pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.648891 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-operator-scripts\") pod \"nova-cell1-db-create-q448g\" (UID: \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\") " pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.649798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90424d47-8eee-4fce-b17a-1fb1250ab67d-operator-scripts\") pod \"nova-api-db-create-jbwbl\" (UID: \"90424d47-8eee-4fce-b17a-1fb1250ab67d\") " pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.650562 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75476996-b18e-4254-a5ad-5afa1e533ecf-operator-scripts\") pod \"nova-cell0-db-create-6lts7\" (UID: \"75476996-b18e-4254-a5ad-5afa1e533ecf\") " pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.650617 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9831-account-create-update-wfn6w"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.651593 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.653544 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.661461 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-q448g"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.675457 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnq4n\" (UniqueName: \"kubernetes.io/projected/90424d47-8eee-4fce-b17a-1fb1250ab67d-kube-api-access-pnq4n\") pod \"nova-api-db-create-jbwbl\" (UID: \"90424d47-8eee-4fce-b17a-1fb1250ab67d\") " pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.696514 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcpc9\" (UniqueName: \"kubernetes.io/projected/75476996-b18e-4254-a5ad-5afa1e533ecf-kube-api-access-zcpc9\") pod \"nova-cell0-db-create-6lts7\" (UID: \"75476996-b18e-4254-a5ad-5afa1e533ecf\") " pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.697063 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9831-account-create-update-wfn6w"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.756851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-operator-scripts\") pod \"nova-cell1-db-create-q448g\" (UID: \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\") " pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.757023 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ddf84f04-f715-49cd-b077-66df82ec4ea2-operator-scripts\") pod \"nova-api-9831-account-create-update-wfn6w\" (UID: \"ddf84f04-f715-49cd-b077-66df82ec4ea2\") " pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.757070 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q86nz\" (UniqueName: \"kubernetes.io/projected/ddf84f04-f715-49cd-b077-66df82ec4ea2-kube-api-access-q86nz\") pod \"nova-api-9831-account-create-update-wfn6w\" (UID: \"ddf84f04-f715-49cd-b077-66df82ec4ea2\") " pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.757113 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjljx\" (UniqueName: \"kubernetes.io/projected/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-kube-api-access-zjljx\") pod \"nova-cell1-db-create-q448g\" (UID: \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\") " pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.757393 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.757762 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-operator-scripts\") pod \"nova-cell1-db-create-q448g\" (UID: \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\") " pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.815112 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjljx\" (UniqueName: \"kubernetes.io/projected/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-kube-api-access-zjljx\") pod \"nova-cell1-db-create-q448g\" (UID: \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\") " pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.848083 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.853771 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8a03-account-create-update-k5dtm"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.859884 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.860122 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddf84f04-f715-49cd-b077-66df82ec4ea2-operator-scripts\") pod \"nova-api-9831-account-create-update-wfn6w\" (UID: \"ddf84f04-f715-49cd-b077-66df82ec4ea2\") " pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.860173 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q86nz\" (UniqueName: \"kubernetes.io/projected/ddf84f04-f715-49cd-b077-66df82ec4ea2-kube-api-access-q86nz\") pod \"nova-api-9831-account-create-update-wfn6w\" (UID: \"ddf84f04-f715-49cd-b077-66df82ec4ea2\") " pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.862453 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddf84f04-f715-49cd-b077-66df82ec4ea2-operator-scripts\") pod \"nova-api-9831-account-create-update-wfn6w\" (UID: \"ddf84f04-f715-49cd-b077-66df82ec4ea2\") " pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.865029 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.865141 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8a03-account-create-update-k5dtm"] Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.892520 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q86nz\" (UniqueName: \"kubernetes.io/projected/ddf84f04-f715-49cd-b077-66df82ec4ea2-kube-api-access-q86nz\") pod 
\"nova-api-9831-account-create-update-wfn6w\" (UID: \"ddf84f04-f715-49cd-b077-66df82ec4ea2\") " pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.961741 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmv88\" (UniqueName: \"kubernetes.io/projected/330d1914-0337-4d4f-a8ea-2a7c9f18942a-kube-api-access-pmv88\") pod \"nova-cell0-8a03-account-create-update-k5dtm\" (UID: \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\") " pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.962151 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330d1914-0337-4d4f-a8ea-2a7c9f18942a-operator-scripts\") pod \"nova-cell0-8a03-account-create-update-k5dtm\" (UID: \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\") " pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" Mar 20 08:50:39 crc kubenswrapper[4971]: I0320 08:50:39.978068 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.038840 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-33f2-account-create-update-z8tbj"] Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.039873 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.042139 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.055287 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33f2-account-create-update-z8tbj"] Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.063120 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6dcl\" (UniqueName: \"kubernetes.io/projected/9ef92901-a49d-4e95-9491-53378694d160-kube-api-access-h6dcl\") pod \"nova-cell1-33f2-account-create-update-z8tbj\" (UID: \"9ef92901-a49d-4e95-9491-53378694d160\") " pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.063402 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmv88\" (UniqueName: \"kubernetes.io/projected/330d1914-0337-4d4f-a8ea-2a7c9f18942a-kube-api-access-pmv88\") pod \"nova-cell0-8a03-account-create-update-k5dtm\" (UID: \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\") " pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.063545 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330d1914-0337-4d4f-a8ea-2a7c9f18942a-operator-scripts\") pod \"nova-cell0-8a03-account-create-update-k5dtm\" (UID: \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\") " pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.063664 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9ef92901-a49d-4e95-9491-53378694d160-operator-scripts\") pod \"nova-cell1-33f2-account-create-update-z8tbj\" (UID: \"9ef92901-a49d-4e95-9491-53378694d160\") " pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.064847 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330d1914-0337-4d4f-a8ea-2a7c9f18942a-operator-scripts\") pod \"nova-cell0-8a03-account-create-update-k5dtm\" (UID: \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\") " pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.076148 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.085140 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmv88\" (UniqueName: \"kubernetes.io/projected/330d1914-0337-4d4f-a8ea-2a7c9f18942a-kube-api-access-pmv88\") pod \"nova-cell0-8a03-account-create-update-k5dtm\" (UID: \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\") " pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.166544 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef92901-a49d-4e95-9491-53378694d160-operator-scripts\") pod \"nova-cell1-33f2-account-create-update-z8tbj\" (UID: \"9ef92901-a49d-4e95-9491-53378694d160\") " pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.167536 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef92901-a49d-4e95-9491-53378694d160-operator-scripts\") pod 
\"nova-cell1-33f2-account-create-update-z8tbj\" (UID: \"9ef92901-a49d-4e95-9491-53378694d160\") " pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.166647 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6dcl\" (UniqueName: \"kubernetes.io/projected/9ef92901-a49d-4e95-9491-53378694d160-kube-api-access-h6dcl\") pod \"nova-cell1-33f2-account-create-update-z8tbj\" (UID: \"9ef92901-a49d-4e95-9491-53378694d160\") " pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.193334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6dcl\" (UniqueName: \"kubernetes.io/projected/9ef92901-a49d-4e95-9491-53378694d160-kube-api-access-h6dcl\") pod \"nova-cell1-33f2-account-create-update-z8tbj\" (UID: \"9ef92901-a49d-4e95-9491-53378694d160\") " pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.219923 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.323965 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jbwbl"] Mar 20 08:50:40 crc kubenswrapper[4971]: W0320 08:50:40.334183 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90424d47_8eee_4fce_b17a_1fb1250ab67d.slice/crio-fed95837de2ee2aa5aad4997d2de21e81ba2a6b7ec66926b573a5d093d428860 WatchSource:0}: Error finding container fed95837de2ee2aa5aad4997d2de21e81ba2a6b7ec66926b573a5d093d428860: Status 404 returned error can't find the container with id fed95837de2ee2aa5aad4997d2de21e81ba2a6b7ec66926b573a5d093d428860 Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.361012 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.440199 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6lts7"] Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.462451 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9831-account-create-update-wfn6w"] Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.554264 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-q448g"] Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.735664 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:50:40 crc kubenswrapper[4971]: E0320 08:50:40.736123 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.744099 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8a03-account-create-update-k5dtm"] Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.849633 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33f2-account-create-update-z8tbj"] Mar 20 08:50:40 crc kubenswrapper[4971]: W0320 08:50:40.888874 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef92901_a49d_4e95_9491_53378694d160.slice/crio-0dd7b19a57e5e10a0b38b34d716905a98b1b89c51e6856f85b733bc4a45890d4 WatchSource:0}: Error finding container 0dd7b19a57e5e10a0b38b34d716905a98b1b89c51e6856f85b733bc4a45890d4: Status 404 returned error can't find the container with id 0dd7b19a57e5e10a0b38b34d716905a98b1b89c51e6856f85b733bc4a45890d4 Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.988845 4971 generic.go:334] "Generic (PLEG): container finished" podID="095b7bd0-1af9-4d2a-ac8f-82c4446c5746" containerID="9ed536a707d208b5391f0833ed540467f50206dffb41395c012fe6ffba0586f3" exitCode=0 Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.988930 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q448g" event={"ID":"095b7bd0-1af9-4d2a-ac8f-82c4446c5746","Type":"ContainerDied","Data":"9ed536a707d208b5391f0833ed540467f50206dffb41395c012fe6ffba0586f3"} Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.988962 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q448g" event={"ID":"095b7bd0-1af9-4d2a-ac8f-82c4446c5746","Type":"ContainerStarted","Data":"23fa1990c20d541250da246df73ac4123b880f8aff86e139be48d1c73c5d8bc5"} 
Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.991008 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" event={"ID":"330d1914-0337-4d4f-a8ea-2a7c9f18942a","Type":"ContainerStarted","Data":"9de46ca40f62d9acb31286017291d4a5df68c779ba4fa6504b13196cec7ccf57"} Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.992334 4971 generic.go:334] "Generic (PLEG): container finished" podID="ddf84f04-f715-49cd-b077-66df82ec4ea2" containerID="400998e0e738f9a7e7a96c4eb797c377b61ff9ab1c5b31b66113136a3ea4a217" exitCode=0 Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.992419 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9831-account-create-update-wfn6w" event={"ID":"ddf84f04-f715-49cd-b077-66df82ec4ea2","Type":"ContainerDied","Data":"400998e0e738f9a7e7a96c4eb797c377b61ff9ab1c5b31b66113136a3ea4a217"} Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.992471 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9831-account-create-update-wfn6w" event={"ID":"ddf84f04-f715-49cd-b077-66df82ec4ea2","Type":"ContainerStarted","Data":"1106ed40c5c060004ea64f20235aa3dde65217115d0fa8a7cece5948111f4426"} Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.994825 4971 generic.go:334] "Generic (PLEG): container finished" podID="90424d47-8eee-4fce-b17a-1fb1250ab67d" containerID="0ccfe317b073edb2cee1f335ccef0124d18becdf156b787592a97cbc86b74aca" exitCode=0 Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.994947 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jbwbl" event={"ID":"90424d47-8eee-4fce-b17a-1fb1250ab67d","Type":"ContainerDied","Data":"0ccfe317b073edb2cee1f335ccef0124d18becdf156b787592a97cbc86b74aca"} Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.994987 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jbwbl" 
event={"ID":"90424d47-8eee-4fce-b17a-1fb1250ab67d","Type":"ContainerStarted","Data":"fed95837de2ee2aa5aad4997d2de21e81ba2a6b7ec66926b573a5d093d428860"} Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.996437 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" event={"ID":"9ef92901-a49d-4e95-9491-53378694d160","Type":"ContainerStarted","Data":"0dd7b19a57e5e10a0b38b34d716905a98b1b89c51e6856f85b733bc4a45890d4"} Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.998143 4971 generic.go:334] "Generic (PLEG): container finished" podID="75476996-b18e-4254-a5ad-5afa1e533ecf" containerID="d482b887b43776d9e1d09fe44e3d545aa63f832fbebbe436bd11899f4997f2d4" exitCode=0 Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.998173 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6lts7" event={"ID":"75476996-b18e-4254-a5ad-5afa1e533ecf","Type":"ContainerDied","Data":"d482b887b43776d9e1d09fe44e3d545aa63f832fbebbe436bd11899f4997f2d4"} Mar 20 08:50:40 crc kubenswrapper[4971]: I0320 08:50:40.998187 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6lts7" event={"ID":"75476996-b18e-4254-a5ad-5afa1e533ecf","Type":"ContainerStarted","Data":"8197f6238bc6c0556f58a0619abaa41f729cc2ed8524bdd64e9f2c6303b828c0"} Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.010598 4971 generic.go:334] "Generic (PLEG): container finished" podID="9ef92901-a49d-4e95-9491-53378694d160" containerID="e59db8128262a6ce828d9dbe0932907eeeecee248c59467a85b08d4a125111ca" exitCode=0 Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.010864 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" event={"ID":"9ef92901-a49d-4e95-9491-53378694d160","Type":"ContainerDied","Data":"e59db8128262a6ce828d9dbe0932907eeeecee248c59467a85b08d4a125111ca"} Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 
08:50:42.014427 4971 generic.go:334] "Generic (PLEG): container finished" podID="330d1914-0337-4d4f-a8ea-2a7c9f18942a" containerID="dd903e8c825771c10e2b6e69674ba5b9e108f1d49a7a90b4724f2f23c659d74b" exitCode=0 Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.014479 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" event={"ID":"330d1914-0337-4d4f-a8ea-2a7c9f18942a","Type":"ContainerDied","Data":"dd903e8c825771c10e2b6e69674ba5b9e108f1d49a7a90b4724f2f23c659d74b"} Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.427420 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9831-account-create-update-wfn6w" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.522877 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q86nz\" (UniqueName: \"kubernetes.io/projected/ddf84f04-f715-49cd-b077-66df82ec4ea2-kube-api-access-q86nz\") pod \"ddf84f04-f715-49cd-b077-66df82ec4ea2\" (UID: \"ddf84f04-f715-49cd-b077-66df82ec4ea2\") " Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.522954 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddf84f04-f715-49cd-b077-66df82ec4ea2-operator-scripts\") pod \"ddf84f04-f715-49cd-b077-66df82ec4ea2\" (UID: \"ddf84f04-f715-49cd-b077-66df82ec4ea2\") " Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.523952 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf84f04-f715-49cd-b077-66df82ec4ea2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddf84f04-f715-49cd-b077-66df82ec4ea2" (UID: "ddf84f04-f715-49cd-b077-66df82ec4ea2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.527668 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf84f04-f715-49cd-b077-66df82ec4ea2-kube-api-access-q86nz" (OuterVolumeSpecName: "kube-api-access-q86nz") pod "ddf84f04-f715-49cd-b077-66df82ec4ea2" (UID: "ddf84f04-f715-49cd-b077-66df82ec4ea2"). InnerVolumeSpecName "kube-api-access-q86nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.582372 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-q448g" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.588291 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.593779 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6lts7" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.632067 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90424d47-8eee-4fce-b17a-1fb1250ab67d-operator-scripts\") pod \"90424d47-8eee-4fce-b17a-1fb1250ab67d\" (UID: \"90424d47-8eee-4fce-b17a-1fb1250ab67d\") " Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.632158 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnq4n\" (UniqueName: \"kubernetes.io/projected/90424d47-8eee-4fce-b17a-1fb1250ab67d-kube-api-access-pnq4n\") pod \"90424d47-8eee-4fce-b17a-1fb1250ab67d\" (UID: \"90424d47-8eee-4fce-b17a-1fb1250ab67d\") " Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.632191 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-operator-scripts\") pod \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\" (UID: \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\") " Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.632260 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75476996-b18e-4254-a5ad-5afa1e533ecf-operator-scripts\") pod \"75476996-b18e-4254-a5ad-5afa1e533ecf\" (UID: \"75476996-b18e-4254-a5ad-5afa1e533ecf\") " Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.632328 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjljx\" (UniqueName: \"kubernetes.io/projected/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-kube-api-access-zjljx\") pod \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\" (UID: \"095b7bd0-1af9-4d2a-ac8f-82c4446c5746\") " Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.632384 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zcpc9\" (UniqueName: \"kubernetes.io/projected/75476996-b18e-4254-a5ad-5afa1e533ecf-kube-api-access-zcpc9\") pod \"75476996-b18e-4254-a5ad-5afa1e533ecf\" (UID: \"75476996-b18e-4254-a5ad-5afa1e533ecf\") " Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.633658 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75476996-b18e-4254-a5ad-5afa1e533ecf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75476996-b18e-4254-a5ad-5afa1e533ecf" (UID: "75476996-b18e-4254-a5ad-5afa1e533ecf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.634925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90424d47-8eee-4fce-b17a-1fb1250ab67d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90424d47-8eee-4fce-b17a-1fb1250ab67d" (UID: "90424d47-8eee-4fce-b17a-1fb1250ab67d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.637023 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q86nz\" (UniqueName: \"kubernetes.io/projected/ddf84f04-f715-49cd-b077-66df82ec4ea2-kube-api-access-q86nz\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.637045 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90424d47-8eee-4fce-b17a-1fb1250ab67d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.637054 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddf84f04-f715-49cd-b077-66df82ec4ea2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.637062 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75476996-b18e-4254-a5ad-5afa1e533ecf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.637224 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "095b7bd0-1af9-4d2a-ac8f-82c4446c5746" (UID: "095b7bd0-1af9-4d2a-ac8f-82c4446c5746"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.637950 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90424d47-8eee-4fce-b17a-1fb1250ab67d-kube-api-access-pnq4n" (OuterVolumeSpecName: "kube-api-access-pnq4n") pod "90424d47-8eee-4fce-b17a-1fb1250ab67d" (UID: "90424d47-8eee-4fce-b17a-1fb1250ab67d"). 
InnerVolumeSpecName "kube-api-access-pnq4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.641320 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75476996-b18e-4254-a5ad-5afa1e533ecf-kube-api-access-zcpc9" (OuterVolumeSpecName: "kube-api-access-zcpc9") pod "75476996-b18e-4254-a5ad-5afa1e533ecf" (UID: "75476996-b18e-4254-a5ad-5afa1e533ecf"). InnerVolumeSpecName "kube-api-access-zcpc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.647289 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-kube-api-access-zjljx" (OuterVolumeSpecName: "kube-api-access-zjljx") pod "095b7bd0-1af9-4d2a-ac8f-82c4446c5746" (UID: "095b7bd0-1af9-4d2a-ac8f-82c4446c5746"). InnerVolumeSpecName "kube-api-access-zjljx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.739129 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnq4n\" (UniqueName: \"kubernetes.io/projected/90424d47-8eee-4fce-b17a-1fb1250ab67d-kube-api-access-pnq4n\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.739231 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.739251 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjljx\" (UniqueName: \"kubernetes.io/projected/095b7bd0-1af9-4d2a-ac8f-82c4446c5746-kube-api-access-zjljx\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:42 crc kubenswrapper[4971]: I0320 08:50:42.739267 4971 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zcpc9\" (UniqueName: \"kubernetes.io/projected/75476996-b18e-4254-a5ad-5afa1e533ecf-kube-api-access-zcpc9\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.023154 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jbwbl" event={"ID":"90424d47-8eee-4fce-b17a-1fb1250ab67d","Type":"ContainerDied","Data":"fed95837de2ee2aa5aad4997d2de21e81ba2a6b7ec66926b573a5d093d428860"} Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.023214 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed95837de2ee2aa5aad4997d2de21e81ba2a6b7ec66926b573a5d093d428860" Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.023515 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jbwbl" Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.024595 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6lts7" event={"ID":"75476996-b18e-4254-a5ad-5afa1e533ecf","Type":"ContainerDied","Data":"8197f6238bc6c0556f58a0619abaa41f729cc2ed8524bdd64e9f2c6303b828c0"} Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.024647 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8197f6238bc6c0556f58a0619abaa41f729cc2ed8524bdd64e9f2c6303b828c0" Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.024701 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6lts7"
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.026086 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q448g" event={"ID":"095b7bd0-1af9-4d2a-ac8f-82c4446c5746","Type":"ContainerDied","Data":"23fa1990c20d541250da246df73ac4123b880f8aff86e139be48d1c73c5d8bc5"}
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.026113 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23fa1990c20d541250da246df73ac4123b880f8aff86e139be48d1c73c5d8bc5"
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.026134 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-q448g"
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.028505 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9831-account-create-update-wfn6w"
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.029208 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9831-account-create-update-wfn6w" event={"ID":"ddf84f04-f715-49cd-b077-66df82ec4ea2","Type":"ContainerDied","Data":"1106ed40c5c060004ea64f20235aa3dde65217115d0fa8a7cece5948111f4426"}
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.029240 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1106ed40c5c060004ea64f20235aa3dde65217115d0fa8a7cece5948111f4426"
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.276845 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33f2-account-create-update-z8tbj"
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.349337 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef92901-a49d-4e95-9491-53378694d160-operator-scripts\") pod \"9ef92901-a49d-4e95-9491-53378694d160\" (UID: \"9ef92901-a49d-4e95-9491-53378694d160\") "
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.349433 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6dcl\" (UniqueName: \"kubernetes.io/projected/9ef92901-a49d-4e95-9491-53378694d160-kube-api-access-h6dcl\") pod \"9ef92901-a49d-4e95-9491-53378694d160\" (UID: \"9ef92901-a49d-4e95-9491-53378694d160\") "
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.350012 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ef92901-a49d-4e95-9491-53378694d160-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ef92901-a49d-4e95-9491-53378694d160" (UID: "9ef92901-a49d-4e95-9491-53378694d160"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.353992 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef92901-a49d-4e95-9491-53378694d160-kube-api-access-h6dcl" (OuterVolumeSpecName: "kube-api-access-h6dcl") pod "9ef92901-a49d-4e95-9491-53378694d160" (UID: "9ef92901-a49d-4e95-9491-53378694d160"). InnerVolumeSpecName "kube-api-access-h6dcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.401126 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a03-account-create-update-k5dtm"
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.452145 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330d1914-0337-4d4f-a8ea-2a7c9f18942a-operator-scripts\") pod \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\" (UID: \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\") "
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.452191 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmv88\" (UniqueName: \"kubernetes.io/projected/330d1914-0337-4d4f-a8ea-2a7c9f18942a-kube-api-access-pmv88\") pod \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\" (UID: \"330d1914-0337-4d4f-a8ea-2a7c9f18942a\") "
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.452705 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef92901-a49d-4e95-9491-53378694d160-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.452720 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6dcl\" (UniqueName: \"kubernetes.io/projected/9ef92901-a49d-4e95-9491-53378694d160-kube-api-access-h6dcl\") on node \"crc\" DevicePath \"\""
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.452854 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330d1914-0337-4d4f-a8ea-2a7c9f18942a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "330d1914-0337-4d4f-a8ea-2a7c9f18942a" (UID: "330d1914-0337-4d4f-a8ea-2a7c9f18942a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.463399 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330d1914-0337-4d4f-a8ea-2a7c9f18942a-kube-api-access-pmv88" (OuterVolumeSpecName: "kube-api-access-pmv88") pod "330d1914-0337-4d4f-a8ea-2a7c9f18942a" (UID: "330d1914-0337-4d4f-a8ea-2a7c9f18942a"). InnerVolumeSpecName "kube-api-access-pmv88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.554844 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330d1914-0337-4d4f-a8ea-2a7c9f18942a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 08:50:43 crc kubenswrapper[4971]: I0320 08:50:43.554883 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmv88\" (UniqueName: \"kubernetes.io/projected/330d1914-0337-4d4f-a8ea-2a7c9f18942a-kube-api-access-pmv88\") on node \"crc\" DevicePath \"\""
Mar 20 08:50:44 crc kubenswrapper[4971]: I0320 08:50:44.037317 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8a03-account-create-update-k5dtm" event={"ID":"330d1914-0337-4d4f-a8ea-2a7c9f18942a","Type":"ContainerDied","Data":"9de46ca40f62d9acb31286017291d4a5df68c779ba4fa6504b13196cec7ccf57"}
Mar 20 08:50:44 crc kubenswrapper[4971]: I0320 08:50:44.037370 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9de46ca40f62d9acb31286017291d4a5df68c779ba4fa6504b13196cec7ccf57"
Mar 20 08:50:44 crc kubenswrapper[4971]: I0320 08:50:44.037497 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8a03-account-create-update-k5dtm"
Mar 20 08:50:44 crc kubenswrapper[4971]: I0320 08:50:44.040412 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33f2-account-create-update-z8tbj" event={"ID":"9ef92901-a49d-4e95-9491-53378694d160","Type":"ContainerDied","Data":"0dd7b19a57e5e10a0b38b34d716905a98b1b89c51e6856f85b733bc4a45890d4"}
Mar 20 08:50:44 crc kubenswrapper[4971]: I0320 08:50:44.040433 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dd7b19a57e5e10a0b38b34d716905a98b1b89c51e6856f85b733bc4a45890d4"
Mar 20 08:50:44 crc kubenswrapper[4971]: I0320 08:50:44.040461 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33f2-account-create-update-z8tbj"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.053264 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z8jrm"]
Mar 20 08:50:45 crc kubenswrapper[4971]: E0320 08:50:45.053891 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90424d47-8eee-4fce-b17a-1fb1250ab67d" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.053908 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="90424d47-8eee-4fce-b17a-1fb1250ab67d" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: E0320 08:50:45.053924 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75476996-b18e-4254-a5ad-5afa1e533ecf" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.053932 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="75476996-b18e-4254-a5ad-5afa1e533ecf" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: E0320 08:50:45.053954 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef92901-a49d-4e95-9491-53378694d160" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.053962 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef92901-a49d-4e95-9491-53378694d160" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: E0320 08:50:45.053988 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330d1914-0337-4d4f-a8ea-2a7c9f18942a" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.053997 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="330d1914-0337-4d4f-a8ea-2a7c9f18942a" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: E0320 08:50:45.054008 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095b7bd0-1af9-4d2a-ac8f-82c4446c5746" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054017 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="095b7bd0-1af9-4d2a-ac8f-82c4446c5746" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: E0320 08:50:45.054037 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf84f04-f715-49cd-b077-66df82ec4ea2" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054045 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf84f04-f715-49cd-b077-66df82ec4ea2" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054253 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="330d1914-0337-4d4f-a8ea-2a7c9f18942a" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054270 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="90424d47-8eee-4fce-b17a-1fb1250ab67d" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054284 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef92901-a49d-4e95-9491-53378694d160" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054297 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf84f04-f715-49cd-b077-66df82ec4ea2" containerName="mariadb-account-create-update"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054315 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="095b7bd0-1af9-4d2a-ac8f-82c4446c5746" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054326 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="75476996-b18e-4254-a5ad-5afa1e533ecf" containerName="mariadb-database-create"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.054977 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.057190 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.059923 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.060715 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-txwr7"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.066138 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z8jrm"]
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.180223 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.180265 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-scripts\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.180302 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xs62\" (UniqueName: \"kubernetes.io/projected/fb841e1b-1731-434d-8ffb-6c135a174b4a-kube-api-access-5xs62\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.180347 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-config-data\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.281380 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.281423 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-scripts\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.281463 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xs62\" (UniqueName: \"kubernetes.io/projected/fb841e1b-1731-434d-8ffb-6c135a174b4a-kube-api-access-5xs62\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.281506 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-config-data\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.286699 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-scripts\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.287000 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.287328 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-config-data\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.296074 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xs62\" (UniqueName: \"kubernetes.io/projected/fb841e1b-1731-434d-8ffb-6c135a174b4a-kube-api-access-5xs62\") pod \"nova-cell0-conductor-db-sync-z8jrm\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") " pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.372814 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.826656 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z8jrm"]
Mar 20 08:50:45 crc kubenswrapper[4971]: W0320 08:50:45.827573 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb841e1b_1731_434d_8ffb_6c135a174b4a.slice/crio-b3f7163e69af38ebb51080f49f67bbdcf2dc55c06e6984c89b98a3d9c8eaf2ca WatchSource:0}: Error finding container b3f7163e69af38ebb51080f49f67bbdcf2dc55c06e6984c89b98a3d9c8eaf2ca: Status 404 returned error can't find the container with id b3f7163e69af38ebb51080f49f67bbdcf2dc55c06e6984c89b98a3d9c8eaf2ca
Mar 20 08:50:45 crc kubenswrapper[4971]: I0320 08:50:45.831114 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 08:50:46 crc kubenswrapper[4971]: I0320 08:50:46.096999 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z8jrm" event={"ID":"fb841e1b-1731-434d-8ffb-6c135a174b4a","Type":"ContainerStarted","Data":"b3f7163e69af38ebb51080f49f67bbdcf2dc55c06e6984c89b98a3d9c8eaf2ca"}
Mar 20 08:50:54 crc kubenswrapper[4971]: I0320 08:50:54.733166 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"
Mar 20 08:50:54 crc kubenswrapper[4971]: E0320 08:50:54.734420 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:50:56 crc kubenswrapper[4971]: I0320 08:50:56.185935 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z8jrm" event={"ID":"fb841e1b-1731-434d-8ffb-6c135a174b4a","Type":"ContainerStarted","Data":"b91d4df69c14a4f5548b65c843781e5b7c499782ff29add7573cdc17b7adb5e8"}
Mar 20 08:50:56 crc kubenswrapper[4971]: I0320 08:50:56.206544 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-z8jrm" podStartSLOduration=1.9466124649999998 podStartE2EDuration="11.206515364s" podCreationTimestamp="2026-03-20 08:50:45 +0000 UTC" firstStartedPulling="2026-03-20 08:50:45.830913482 +0000 UTC m=+7267.810787620" lastFinishedPulling="2026-03-20 08:50:55.090816381 +0000 UTC m=+7277.070690519" observedRunningTime="2026-03-20 08:50:56.200377394 +0000 UTC m=+7278.180251532" watchObservedRunningTime="2026-03-20 08:50:56.206515364 +0000 UTC m=+7278.186389532"
Mar 20 08:51:01 crc kubenswrapper[4971]: I0320 08:51:01.251111 4971 generic.go:334] "Generic (PLEG): container finished" podID="fb841e1b-1731-434d-8ffb-6c135a174b4a" containerID="b91d4df69c14a4f5548b65c843781e5b7c499782ff29add7573cdc17b7adb5e8" exitCode=0
Mar 20 08:51:01 crc kubenswrapper[4971]: I0320 08:51:01.251183 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z8jrm" event={"ID":"fb841e1b-1731-434d-8ffb-6c135a174b4a","Type":"ContainerDied","Data":"b91d4df69c14a4f5548b65c843781e5b7c499782ff29add7573cdc17b7adb5e8"}
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.691644 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.750333 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-combined-ca-bundle\") pod \"fb841e1b-1731-434d-8ffb-6c135a174b4a\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") "
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.750495 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-config-data\") pod \"fb841e1b-1731-434d-8ffb-6c135a174b4a\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") "
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.750541 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xs62\" (UniqueName: \"kubernetes.io/projected/fb841e1b-1731-434d-8ffb-6c135a174b4a-kube-api-access-5xs62\") pod \"fb841e1b-1731-434d-8ffb-6c135a174b4a\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") "
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.750559 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-scripts\") pod \"fb841e1b-1731-434d-8ffb-6c135a174b4a\" (UID: \"fb841e1b-1731-434d-8ffb-6c135a174b4a\") "
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.755890 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb841e1b-1731-434d-8ffb-6c135a174b4a-kube-api-access-5xs62" (OuterVolumeSpecName: "kube-api-access-5xs62") pod "fb841e1b-1731-434d-8ffb-6c135a174b4a" (UID: "fb841e1b-1731-434d-8ffb-6c135a174b4a"). InnerVolumeSpecName "kube-api-access-5xs62". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.756660 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-scripts" (OuterVolumeSpecName: "scripts") pod "fb841e1b-1731-434d-8ffb-6c135a174b4a" (UID: "fb841e1b-1731-434d-8ffb-6c135a174b4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.771369 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-config-data" (OuterVolumeSpecName: "config-data") pod "fb841e1b-1731-434d-8ffb-6c135a174b4a" (UID: "fb841e1b-1731-434d-8ffb-6c135a174b4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.775850 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb841e1b-1731-434d-8ffb-6c135a174b4a" (UID: "fb841e1b-1731-434d-8ffb-6c135a174b4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.852867 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.852900 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xs62\" (UniqueName: \"kubernetes.io/projected/fb841e1b-1731-434d-8ffb-6c135a174b4a-kube-api-access-5xs62\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.852910 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:02 crc kubenswrapper[4971]: I0320 08:51:02.852918 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb841e1b-1731-434d-8ffb-6c135a174b4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.310997 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z8jrm" event={"ID":"fb841e1b-1731-434d-8ffb-6c135a174b4a","Type":"ContainerDied","Data":"b3f7163e69af38ebb51080f49f67bbdcf2dc55c06e6984c89b98a3d9c8eaf2ca"}
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.311387 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3f7163e69af38ebb51080f49f67bbdcf2dc55c06e6984c89b98a3d9c8eaf2ca"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.311206 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z8jrm"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.396107 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 08:51:03 crc kubenswrapper[4971]: E0320 08:51:03.396572 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb841e1b-1731-434d-8ffb-6c135a174b4a" containerName="nova-cell0-conductor-db-sync"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.396595 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb841e1b-1731-434d-8ffb-6c135a174b4a" containerName="nova-cell0-conductor-db-sync"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.396837 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb841e1b-1731-434d-8ffb-6c135a174b4a" containerName="nova-cell0-conductor-db-sync"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.397549 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.400069 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-txwr7"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.401067 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.409184 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.464228 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.464355 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w2nl\" (UniqueName: \"kubernetes.io/projected/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-kube-api-access-2w2nl\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.464438 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.566639 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.566775 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w2nl\" (UniqueName: \"kubernetes.io/projected/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-kube-api-access-2w2nl\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.566825 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.573585 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.574458 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.601243 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w2nl\" (UniqueName: \"kubernetes.io/projected/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-kube-api-access-2w2nl\") pod \"nova-cell0-conductor-0\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:03 crc kubenswrapper[4971]: I0320 08:51:03.718384 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:04 crc kubenswrapper[4971]: I0320 08:51:04.226273 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 08:51:04 crc kubenswrapper[4971]: W0320 08:51:04.238256 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1e23d5e_175f_4f85_a438_b6fc8ffd65e4.slice/crio-8c727871ef74d61b8656ec699bd0c74869f50a4439464e3eec44ef4b20affa91 WatchSource:0}: Error finding container 8c727871ef74d61b8656ec699bd0c74869f50a4439464e3eec44ef4b20affa91: Status 404 returned error can't find the container with id 8c727871ef74d61b8656ec699bd0c74869f50a4439464e3eec44ef4b20affa91
Mar 20 08:51:04 crc kubenswrapper[4971]: I0320 08:51:04.324886 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4","Type":"ContainerStarted","Data":"8c727871ef74d61b8656ec699bd0c74869f50a4439464e3eec44ef4b20affa91"}
Mar 20 08:51:05 crc kubenswrapper[4971]: I0320 08:51:05.340664 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4","Type":"ContainerStarted","Data":"826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62"}
Mar 20 08:51:05 crc kubenswrapper[4971]: I0320 08:51:05.340967 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:05 crc kubenswrapper[4971]: I0320 08:51:05.369860 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.369834597 podStartE2EDuration="2.369834597s" podCreationTimestamp="2026-03-20 08:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:05.360972875 +0000 UTC m=+7287.340847063" watchObservedRunningTime="2026-03-20 08:51:05.369834597 +0000 UTC m=+7287.349708775"
Mar 20 08:51:08 crc kubenswrapper[4971]: I0320 08:51:08.751782 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"
Mar 20 08:51:08 crc kubenswrapper[4971]: E0320 08:51:08.753046 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:51:13 crc kubenswrapper[4971]: I0320 08:51:13.747544 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.274357 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-v48pn"]
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.276213 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v48pn"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.279911 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.280155 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.291437 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v48pn"]
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.429712 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.431151 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.434725 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.458633 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.461874 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-config-data\") pod \"nova-scheduler-0\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.461950 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.461981 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-scripts\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.462011 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-config-data\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.462046 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.462055 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.462114 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5blz7\" (UniqueName: \"kubernetes.io/projected/bedda968-4c4c-4bea-a384-4695a8169a96-kube-api-access-5blz7\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.462144 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn28h\" (UniqueName: \"kubernetes.io/projected/b7be898e-b88b-4778-82e2-eaa341e838eb-kube-api-access-sn28h\") pod \"nova-scheduler-0\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.464413 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.476613 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.502405 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.563709 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5blz7\" (UniqueName: \"kubernetes.io/projected/bedda968-4c4c-4bea-a384-4695a8169a96-kube-api-access-5blz7\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.563767 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9cx\" (UniqueName: \"kubernetes.io/projected/f9661ff0-007c-427d-a506-b8d44c915440-kube-api-access-ng9cx\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.563805 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn28h\" (UniqueName: \"kubernetes.io/projected/b7be898e-b88b-4778-82e2-eaa341e838eb-kube-api-access-sn28h\") pod \"nova-scheduler-0\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0"
Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.563867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\"
(UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-config-data\") pod \"nova-scheduler-0\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.563906 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.563950 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.563982 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-scripts\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.564013 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-config-data\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.564082 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v48pn\" 
(UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.564114 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9661ff0-007c-427d-a506-b8d44c915440-logs\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.564144 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-config-data\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.568091 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.572068 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-scripts\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.579003 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.580660 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-config-data\") pod \"nova-scheduler-0\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.591735 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.592019 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.594142 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-config-data\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.595440 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.596993 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn28h\" (UniqueName: \"kubernetes.io/projected/b7be898e-b88b-4778-82e2-eaa341e838eb-kube-api-access-sn28h\") pod \"nova-scheduler-0\" (UID: 
\"b7be898e-b88b-4778-82e2-eaa341e838eb\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.597308 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5blz7\" (UniqueName: \"kubernetes.io/projected/bedda968-4c4c-4bea-a384-4695a8169a96-kube-api-access-5blz7\") pod \"nova-cell0-cell-mapping-v48pn\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.635365 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.658238 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.666476 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9661ff0-007c-427d-a506-b8d44c915440-logs\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.666524 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-config-data\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.666555 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-config-data\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.666574 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e54821f-4696-4669-92cf-d9af53f447fe-logs\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.666613 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9cx\" (UniqueName: \"kubernetes.io/projected/f9661ff0-007c-427d-a506-b8d44c915440-kube-api-access-ng9cx\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.666667 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.666698 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwv6f\" (UniqueName: \"kubernetes.io/projected/5e54821f-4696-4669-92cf-d9af53f447fe-kube-api-access-mwv6f\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.666734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.667126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f9661ff0-007c-427d-a506-b8d44c915440-logs\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.674321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-config-data\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.676742 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.697261 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9cx\" (UniqueName: \"kubernetes.io/projected/f9661ff0-007c-427d-a506-b8d44c915440-kube-api-access-ng9cx\") pod \"nova-api-0\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.780642 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.797285 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwv6f\" (UniqueName: \"kubernetes.io/projected/5e54821f-4696-4669-92cf-d9af53f447fe-kube-api-access-mwv6f\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.797529 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.797799 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-config-data\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.797912 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e54821f-4696-4669-92cf-d9af53f447fe-logs\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.798592 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e54821f-4696-4669-92cf-d9af53f447fe-logs\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.799002 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.815228 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.816847 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-config-data\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.818546 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.818577 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.820781 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.825156 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwv6f\" (UniqueName: \"kubernetes.io/projected/5e54821f-4696-4669-92cf-d9af53f447fe-kube-api-access-mwv6f\") pod \"nova-metadata-0\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " pod="openstack/nova-metadata-0" Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.830573 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.843463 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-844dd44645-96kb5"] Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.853076 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844dd44645-96kb5"] Mar 20 08:51:14 crc kubenswrapper[4971]: I0320 08:51:14.853183 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.003035 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpb59\" (UniqueName: \"kubernetes.io/projected/4581cde5-cdb3-4215-b51a-17504bb3fc30-kube-api-access-bpb59\") pod \"nova-cell1-novncproxy-0\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.003440 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.003472 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-sb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.003498 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.003556 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-dns-svc\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: 
\"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.003597 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d4sb\" (UniqueName: \"kubernetes.io/projected/513a0a95-fc59-49c1-ba38-c859f0c74158-kube-api-access-4d4sb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.003634 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-config\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.003832 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-nb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.105547 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d4sb\" (UniqueName: \"kubernetes.io/projected/513a0a95-fc59-49c1-ba38-c859f0c74158-kube-api-access-4d4sb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.105635 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-config\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: 
\"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.105660 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-nb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.105685 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpb59\" (UniqueName: \"kubernetes.io/projected/4581cde5-cdb3-4215-b51a-17504bb3fc30-kube-api-access-bpb59\") pod \"nova-cell1-novncproxy-0\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.105737 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.105768 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-sb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.105835 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.105917 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-dns-svc\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.109421 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.126520 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-config\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.129829 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-dns-svc\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.132209 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-sb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.132303 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.132336 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpb59\" (UniqueName: \"kubernetes.io/projected/4581cde5-cdb3-4215-b51a-17504bb3fc30-kube-api-access-bpb59\") pod \"nova-cell1-novncproxy-0\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.132426 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d4sb\" (UniqueName: \"kubernetes.io/projected/513a0a95-fc59-49c1-ba38-c859f0c74158-kube-api-access-4d4sb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.133299 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.135293 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-nb\") pod \"dnsmasq-dns-844dd44645-96kb5\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.137936 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.190588 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.249212 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cvhsb"] Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.252520 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.255543 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.257087 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.265579 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cvhsb"] Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.281016 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v48pn"] Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.414126 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-scripts\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.414498 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-config-data\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.414551 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlx52\" (UniqueName: \"kubernetes.io/projected/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-kube-api-access-dlx52\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.414590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:15 crc kubenswrapper[4971]: I0320 08:51:15.421238 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.512955 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v48pn" event={"ID":"bedda968-4c4c-4bea-a384-4695a8169a96","Type":"ContainerStarted","Data":"06d38ec29fd6ba7b896188c16b18091b32c3a892967cc4831fb7d355a89ea790"} Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.513007 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v48pn" event={"ID":"bedda968-4c4c-4bea-a384-4695a8169a96","Type":"ContainerStarted","Data":"a1b92677ab4720f8804547cdd8a3bc3613b9648086fb363957e29472da01857f"} Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.516096 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-scripts\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 
08:51:15.516179 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-config-data\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.516220 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlx52\" (UniqueName: \"kubernetes.io/projected/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-kube-api-access-dlx52\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.516271 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.524850 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-scripts\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.525225 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 
08:51:15.530152 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-config-data\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.530198 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9661ff0-007c-427d-a506-b8d44c915440","Type":"ContainerStarted","Data":"88eb7b4998f4291d3d70c6c2ae8efe8aaad55fd2dd11d86107cc184ae2bf561d"} Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.533014 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-v48pn" podStartSLOduration=1.533001385 podStartE2EDuration="1.533001385s" podCreationTimestamp="2026-03-20 08:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:15.529396891 +0000 UTC m=+7297.509271029" watchObservedRunningTime="2026-03-20 08:51:15.533001385 +0000 UTC m=+7297.512875523" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.534284 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.537011 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlx52\" (UniqueName: \"kubernetes.io/projected/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-kube-api-access-dlx52\") pod \"nova-cell1-conductor-db-sync-cvhsb\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.583178 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:15.738353 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:16.263780 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844dd44645-96kb5"] Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:16.542209 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4581cde5-cdb3-4215-b51a-17504bb3fc30","Type":"ContainerStarted","Data":"0e9e08fa35b1607cc954bb721c86c11951a8e92c8642ed3ec23f97f0bc4c4657"} Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:16.543778 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:16.544093 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7be898e-b88b-4778-82e2-eaa341e838eb","Type":"ContainerStarted","Data":"45697abc7f2742de866eec8501dab5b2de1826cc06dcfcfa6a77e99c0bc62edf"} Mar 20 08:51:16 crc kubenswrapper[4971]: I0320 08:51:16.557358 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cvhsb"] Mar 20 08:51:16 crc kubenswrapper[4971]: W0320 08:51:16.774169 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod513a0a95_fc59_49c1_ba38_c859f0c74158.slice/crio-a6a6a947e495dcfdebff7ad7b05ad32cb30ecbc380150d98526554cf3af68222 WatchSource:0}: Error finding container a6a6a947e495dcfdebff7ad7b05ad32cb30ecbc380150d98526554cf3af68222: Status 404 returned error can't find the container with id a6a6a947e495dcfdebff7ad7b05ad32cb30ecbc380150d98526554cf3af68222 Mar 20 08:51:17 crc kubenswrapper[4971]: I0320 08:51:17.556874 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-cvhsb" event={"ID":"0f6bf0dd-87ab-4d12-8d00-fd3907b69533","Type":"ContainerStarted","Data":"97f8a04f474ad0f480353530a593b2a4729444e0e6021728acfaf3adcc9947da"} Mar 20 08:51:17 crc kubenswrapper[4971]: I0320 08:51:17.558694 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844dd44645-96kb5" event={"ID":"513a0a95-fc59-49c1-ba38-c859f0c74158","Type":"ContainerStarted","Data":"a6a6a947e495dcfdebff7ad7b05ad32cb30ecbc380150d98526554cf3af68222"} Mar 20 08:51:17 crc kubenswrapper[4971]: I0320 08:51:17.559977 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e54821f-4696-4669-92cf-d9af53f447fe","Type":"ContainerStarted","Data":"01b10b5994776ca558624ce0957dbe22cd31e0e975708b096ea755088b99bccb"} Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.568232 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7be898e-b88b-4778-82e2-eaa341e838eb","Type":"ContainerStarted","Data":"f57d450671465e68aca75cef9ed3629f5047433caac4afa143514cb786ac66a6"} Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.573307 4971 generic.go:334] "Generic (PLEG): container finished" podID="513a0a95-fc59-49c1-ba38-c859f0c74158" containerID="a216082ab407de137cd7d9211b7791bc94abd0bb4c336aeac51f09f626b32c0a" exitCode=0 Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.573376 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844dd44645-96kb5" event={"ID":"513a0a95-fc59-49c1-ba38-c859f0c74158","Type":"ContainerDied","Data":"a216082ab407de137cd7d9211b7791bc94abd0bb4c336aeac51f09f626b32c0a"} Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.577989 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9661ff0-007c-427d-a506-b8d44c915440","Type":"ContainerStarted","Data":"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58"} Mar 20 
08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.578022 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9661ff0-007c-427d-a506-b8d44c915440","Type":"ContainerStarted","Data":"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f"} Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.579960 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4581cde5-cdb3-4215-b51a-17504bb3fc30","Type":"ContainerStarted","Data":"4f7c12edf0bcfcac83fe9e0f80c281b7f43c8af30f095eb9ca01ba4a7b85bdc1"} Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.585300 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e54821f-4696-4669-92cf-d9af53f447fe","Type":"ContainerStarted","Data":"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da"} Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.585344 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e54821f-4696-4669-92cf-d9af53f447fe","Type":"ContainerStarted","Data":"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b"} Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.587088 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cvhsb" event={"ID":"0f6bf0dd-87ab-4d12-8d00-fd3907b69533","Type":"ContainerStarted","Data":"31dc4ef88a1409cbba7b1e7cdc7efcdd59fa94f13bfae269cce863ffd1f1b2ac"} Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.587924 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.385774014 podStartE2EDuration="4.587908725s" podCreationTimestamp="2026-03-20 08:51:14 +0000 UTC" firstStartedPulling="2026-03-20 08:51:15.544975548 +0000 UTC m=+7297.524849686" lastFinishedPulling="2026-03-20 08:51:17.747110259 +0000 UTC m=+7299.726984397" 
observedRunningTime="2026-03-20 08:51:18.582516304 +0000 UTC m=+7300.562390442" watchObservedRunningTime="2026-03-20 08:51:18.587908725 +0000 UTC m=+7300.567782863" Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.600842 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.252077978 podStartE2EDuration="4.600824543s" podCreationTimestamp="2026-03-20 08:51:14 +0000 UTC" firstStartedPulling="2026-03-20 08:51:15.424622851 +0000 UTC m=+7297.404496979" lastFinishedPulling="2026-03-20 08:51:17.773369396 +0000 UTC m=+7299.753243544" observedRunningTime="2026-03-20 08:51:18.597262539 +0000 UTC m=+7300.577136687" watchObservedRunningTime="2026-03-20 08:51:18.600824543 +0000 UTC m=+7300.580698681" Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.649516 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.635870703 podStartE2EDuration="4.649493385s" podCreationTimestamp="2026-03-20 08:51:14 +0000 UTC" firstStartedPulling="2026-03-20 08:51:15.733366874 +0000 UTC m=+7297.713241012" lastFinishedPulling="2026-03-20 08:51:17.746989566 +0000 UTC m=+7299.726863694" observedRunningTime="2026-03-20 08:51:18.641719562 +0000 UTC m=+7300.621593710" watchObservedRunningTime="2026-03-20 08:51:18.649493385 +0000 UTC m=+7300.629367523" Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.679685 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.2416939320000004 podStartE2EDuration="4.679669904s" podCreationTimestamp="2026-03-20 08:51:14 +0000 UTC" firstStartedPulling="2026-03-20 08:51:17.335474306 +0000 UTC m=+7299.315348444" lastFinishedPulling="2026-03-20 08:51:17.773450248 +0000 UTC m=+7299.753324416" observedRunningTime="2026-03-20 08:51:18.666387567 +0000 UTC m=+7300.646261705" watchObservedRunningTime="2026-03-20 08:51:18.679669904 +0000 UTC 
m=+7300.659544042" Mar 20 08:51:18 crc kubenswrapper[4971]: I0320 08:51:18.685035 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-cvhsb" podStartSLOduration=3.685027384 podStartE2EDuration="3.685027384s" podCreationTimestamp="2026-03-20 08:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:18.679194282 +0000 UTC m=+7300.659068430" watchObservedRunningTime="2026-03-20 08:51:18.685027384 +0000 UTC m=+7300.664901522" Mar 20 08:51:19 crc kubenswrapper[4971]: I0320 08:51:19.599845 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844dd44645-96kb5" event={"ID":"513a0a95-fc59-49c1-ba38-c859f0c74158","Type":"ContainerStarted","Data":"34a2307cd9ef11f84bc46fc48f75e4d7a85aaa4d588f3d6905d59c181dca1a68"} Mar 20 08:51:19 crc kubenswrapper[4971]: I0320 08:51:19.622038 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844dd44645-96kb5" podStartSLOduration=5.622016414 podStartE2EDuration="5.622016414s" podCreationTimestamp="2026-03-20 08:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:19.621427409 +0000 UTC m=+7301.601301567" watchObservedRunningTime="2026-03-20 08:51:19.622016414 +0000 UTC m=+7301.601890562" Mar 20 08:51:19 crc kubenswrapper[4971]: I0320 08:51:19.779356 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 08:51:20 crc kubenswrapper[4971]: I0320 08:51:20.138844 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:20 crc kubenswrapper[4971]: I0320 08:51:20.212276 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 
20 08:51:20 crc kubenswrapper[4971]: I0320 08:51:20.632900 4971 generic.go:334] "Generic (PLEG): container finished" podID="bedda968-4c4c-4bea-a384-4695a8169a96" containerID="06d38ec29fd6ba7b896188c16b18091b32c3a892967cc4831fb7d355a89ea790" exitCode=0 Mar 20 08:51:20 crc kubenswrapper[4971]: I0320 08:51:20.634454 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v48pn" event={"ID":"bedda968-4c4c-4bea-a384-4695a8169a96","Type":"ContainerDied","Data":"06d38ec29fd6ba7b896188c16b18091b32c3a892967cc4831fb7d355a89ea790"} Mar 20 08:51:20 crc kubenswrapper[4971]: I0320 08:51:20.733130 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:51:20 crc kubenswrapper[4971]: E0320 08:51:20.734016 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:51:21 crc kubenswrapper[4971]: I0320 08:51:21.646494 4971 generic.go:334] "Generic (PLEG): container finished" podID="0f6bf0dd-87ab-4d12-8d00-fd3907b69533" containerID="31dc4ef88a1409cbba7b1e7cdc7efcdd59fa94f13bfae269cce863ffd1f1b2ac" exitCode=0 Mar 20 08:51:21 crc kubenswrapper[4971]: I0320 08:51:21.646583 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cvhsb" event={"ID":"0f6bf0dd-87ab-4d12-8d00-fd3907b69533","Type":"ContainerDied","Data":"31dc4ef88a1409cbba7b1e7cdc7efcdd59fa94f13bfae269cce863ffd1f1b2ac"} Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.138220 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.191837 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-scripts\") pod \"bedda968-4c4c-4bea-a384-4695a8169a96\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.191883 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-config-data\") pod \"bedda968-4c4c-4bea-a384-4695a8169a96\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.191935 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-combined-ca-bundle\") pod \"bedda968-4c4c-4bea-a384-4695a8169a96\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.191972 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5blz7\" (UniqueName: \"kubernetes.io/projected/bedda968-4c4c-4bea-a384-4695a8169a96-kube-api-access-5blz7\") pod \"bedda968-4c4c-4bea-a384-4695a8169a96\" (UID: \"bedda968-4c4c-4bea-a384-4695a8169a96\") " Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.199451 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-scripts" (OuterVolumeSpecName: "scripts") pod "bedda968-4c4c-4bea-a384-4695a8169a96" (UID: "bedda968-4c4c-4bea-a384-4695a8169a96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.199562 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bedda968-4c4c-4bea-a384-4695a8169a96-kube-api-access-5blz7" (OuterVolumeSpecName: "kube-api-access-5blz7") pod "bedda968-4c4c-4bea-a384-4695a8169a96" (UID: "bedda968-4c4c-4bea-a384-4695a8169a96"). InnerVolumeSpecName "kube-api-access-5blz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.227539 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-config-data" (OuterVolumeSpecName: "config-data") pod "bedda968-4c4c-4bea-a384-4695a8169a96" (UID: "bedda968-4c4c-4bea-a384-4695a8169a96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.229761 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bedda968-4c4c-4bea-a384-4695a8169a96" (UID: "bedda968-4c4c-4bea-a384-4695a8169a96"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.295139 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.295193 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.295215 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedda968-4c4c-4bea-a384-4695a8169a96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.295236 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5blz7\" (UniqueName: \"kubernetes.io/projected/bedda968-4c4c-4bea-a384-4695a8169a96-kube-api-access-5blz7\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.660003 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v48pn" event={"ID":"bedda968-4c4c-4bea-a384-4695a8169a96","Type":"ContainerDied","Data":"a1b92677ab4720f8804547cdd8a3bc3613b9648086fb363957e29472da01857f"} Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.660083 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1b92677ab4720f8804547cdd8a3bc3613b9648086fb363957e29472da01857f" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.660018 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v48pn" Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.854498 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.854907 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f9661ff0-007c-427d-a506-b8d44c915440" containerName="nova-api-log" containerID="cri-o://be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f" gracePeriod=30 Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.855196 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f9661ff0-007c-427d-a506-b8d44c915440" containerName="nova-api-api" containerID="cri-o://6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58" gracePeriod=30 Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.972180 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.972956 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b7be898e-b88b-4778-82e2-eaa341e838eb" containerName="nova-scheduler-scheduler" containerID="cri-o://f57d450671465e68aca75cef9ed3629f5047433caac4afa143514cb786ac66a6" gracePeriod=30 Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.988197 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.988417 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" containerName="nova-metadata-log" containerID="cri-o://2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b" gracePeriod=30 Mar 20 08:51:22 crc kubenswrapper[4971]: I0320 08:51:22.988855 4971 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" containerName="nova-metadata-metadata" containerID="cri-o://5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da" gracePeriod=30 Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.261421 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.316232 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-combined-ca-bundle\") pod \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.316420 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-config-data\") pod \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.316440 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlx52\" (UniqueName: \"kubernetes.io/projected/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-kube-api-access-dlx52\") pod \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.316562 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-scripts\") pod \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\" (UID: \"0f6bf0dd-87ab-4d12-8d00-fd3907b69533\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.324304 4971 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-kube-api-access-dlx52" (OuterVolumeSpecName: "kube-api-access-dlx52") pod "0f6bf0dd-87ab-4d12-8d00-fd3907b69533" (UID: "0f6bf0dd-87ab-4d12-8d00-fd3907b69533"). InnerVolumeSpecName "kube-api-access-dlx52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.327590 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-scripts" (OuterVolumeSpecName: "scripts") pod "0f6bf0dd-87ab-4d12-8d00-fd3907b69533" (UID: "0f6bf0dd-87ab-4d12-8d00-fd3907b69533"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.352761 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f6bf0dd-87ab-4d12-8d00-fd3907b69533" (UID: "0f6bf0dd-87ab-4d12-8d00-fd3907b69533"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.375028 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-config-data" (OuterVolumeSpecName: "config-data") pod "0f6bf0dd-87ab-4d12-8d00-fd3907b69533" (UID: "0f6bf0dd-87ab-4d12-8d00-fd3907b69533"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.418864 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.419118 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.419176 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.419227 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlx52\" (UniqueName: \"kubernetes.io/projected/0f6bf0dd-87ab-4d12-8d00-fd3907b69533-kube-api-access-dlx52\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.459202 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.484102 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.520304 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-config-data\") pod \"f9661ff0-007c-427d-a506-b8d44c915440\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.574765 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-config-data" (OuterVolumeSpecName: "config-data") pod "f9661ff0-007c-427d-a506-b8d44c915440" (UID: "f9661ff0-007c-427d-a506-b8d44c915440"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.630129 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng9cx\" (UniqueName: \"kubernetes.io/projected/f9661ff0-007c-427d-a506-b8d44c915440-kube-api-access-ng9cx\") pod \"f9661ff0-007c-427d-a506-b8d44c915440\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.630197 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9661ff0-007c-427d-a506-b8d44c915440-logs\") pod \"f9661ff0-007c-427d-a506-b8d44c915440\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.630286 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-combined-ca-bundle\") pod \"f9661ff0-007c-427d-a506-b8d44c915440\" (UID: \"f9661ff0-007c-427d-a506-b8d44c915440\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.630340 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mwv6f\" (UniqueName: \"kubernetes.io/projected/5e54821f-4696-4669-92cf-d9af53f447fe-kube-api-access-mwv6f\") pod \"5e54821f-4696-4669-92cf-d9af53f447fe\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.630368 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-config-data\") pod \"5e54821f-4696-4669-92cf-d9af53f447fe\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.630450 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e54821f-4696-4669-92cf-d9af53f447fe-logs\") pod \"5e54821f-4696-4669-92cf-d9af53f447fe\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.630469 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-combined-ca-bundle\") pod \"5e54821f-4696-4669-92cf-d9af53f447fe\" (UID: \"5e54821f-4696-4669-92cf-d9af53f447fe\") " Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.630809 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.634843 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e54821f-4696-4669-92cf-d9af53f447fe-logs" (OuterVolumeSpecName: "logs") pod "5e54821f-4696-4669-92cf-d9af53f447fe" (UID: "5e54821f-4696-4669-92cf-d9af53f447fe"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.635283 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9661ff0-007c-427d-a506-b8d44c915440-logs" (OuterVolumeSpecName: "logs") pod "f9661ff0-007c-427d-a506-b8d44c915440" (UID: "f9661ff0-007c-427d-a506-b8d44c915440"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.638807 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e54821f-4696-4669-92cf-d9af53f447fe-kube-api-access-mwv6f" (OuterVolumeSpecName: "kube-api-access-mwv6f") pod "5e54821f-4696-4669-92cf-d9af53f447fe" (UID: "5e54821f-4696-4669-92cf-d9af53f447fe"). InnerVolumeSpecName "kube-api-access-mwv6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.654520 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9661ff0-007c-427d-a506-b8d44c915440-kube-api-access-ng9cx" (OuterVolumeSpecName: "kube-api-access-ng9cx") pod "f9661ff0-007c-427d-a506-b8d44c915440" (UID: "f9661ff0-007c-427d-a506-b8d44c915440"). InnerVolumeSpecName "kube-api-access-ng9cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.667088 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9661ff0-007c-427d-a506-b8d44c915440" (UID: "f9661ff0-007c-427d-a506-b8d44c915440"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.674024 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-config-data" (OuterVolumeSpecName: "config-data") pod "5e54821f-4696-4669-92cf-d9af53f447fe" (UID: "5e54821f-4696-4669-92cf-d9af53f447fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.675114 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cvhsb" event={"ID":"0f6bf0dd-87ab-4d12-8d00-fd3907b69533","Type":"ContainerDied","Data":"97f8a04f474ad0f480353530a593b2a4729444e0e6021728acfaf3adcc9947da"} Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.675205 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97f8a04f474ad0f480353530a593b2a4729444e0e6021728acfaf3adcc9947da" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.675305 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cvhsb" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.679719 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e54821f-4696-4669-92cf-d9af53f447fe" (UID: "5e54821f-4696-4669-92cf-d9af53f447fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.681473 4971 generic.go:334] "Generic (PLEG): container finished" podID="f9661ff0-007c-427d-a506-b8d44c915440" containerID="6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58" exitCode=0 Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.681549 4971 generic.go:334] "Generic (PLEG): container finished" podID="f9661ff0-007c-427d-a506-b8d44c915440" containerID="be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f" exitCode=143 Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.681674 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9661ff0-007c-427d-a506-b8d44c915440","Type":"ContainerDied","Data":"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58"} Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.681752 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9661ff0-007c-427d-a506-b8d44c915440","Type":"ContainerDied","Data":"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f"} Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.681805 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9661ff0-007c-427d-a506-b8d44c915440","Type":"ContainerDied","Data":"88eb7b4998f4291d3d70c6c2ae8efe8aaad55fd2dd11d86107cc184ae2bf561d"} Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.681860 4971 scope.go:117] "RemoveContainer" containerID="6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.682052 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.686105 4971 generic.go:334] "Generic (PLEG): container finished" podID="5e54821f-4696-4669-92cf-d9af53f447fe" containerID="5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da" exitCode=0 Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.686134 4971 generic.go:334] "Generic (PLEG): container finished" podID="5e54821f-4696-4669-92cf-d9af53f447fe" containerID="2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b" exitCode=143 Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.686153 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e54821f-4696-4669-92cf-d9af53f447fe","Type":"ContainerDied","Data":"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da"} Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.686176 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e54821f-4696-4669-92cf-d9af53f447fe","Type":"ContainerDied","Data":"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b"} Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.686186 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e54821f-4696-4669-92cf-d9af53f447fe","Type":"ContainerDied","Data":"01b10b5994776ca558624ce0957dbe22cd31e0e975708b096ea755088b99bccb"} Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.686235 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.713737 4971 scope.go:117] "RemoveContainer" containerID="be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.735857 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9661ff0-007c-427d-a506-b8d44c915440-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.735912 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwv6f\" (UniqueName: \"kubernetes.io/projected/5e54821f-4696-4669-92cf-d9af53f447fe-kube-api-access-mwv6f\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.735922 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.735936 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e54821f-4696-4669-92cf-d9af53f447fe-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.735946 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e54821f-4696-4669-92cf-d9af53f447fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.735955 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng9cx\" (UniqueName: \"kubernetes.io/projected/f9661ff0-007c-427d-a506-b8d44c915440-kube-api-access-ng9cx\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.735963 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f9661ff0-007c-427d-a506-b8d44c915440-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.776767 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.779544 4971 scope.go:117] "RemoveContainer" containerID="6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.783128 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58\": container with ID starting with 6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58 not found: ID does not exist" containerID="6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.783180 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58"} err="failed to get container status \"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58\": rpc error: code = NotFound desc = could not find container \"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58\": container with ID starting with 6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58 not found: ID does not exist" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.783209 4971 scope.go:117] "RemoveContainer" containerID="be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.784638 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f\": container with ID starting with 
be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f not found: ID does not exist" containerID="be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.784676 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f"} err="failed to get container status \"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f\": rpc error: code = NotFound desc = could not find container \"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f\": container with ID starting with be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f not found: ID does not exist" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.784697 4971 scope.go:117] "RemoveContainer" containerID="6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.785770 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.785941 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58"} err="failed to get container status \"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58\": rpc error: code = NotFound desc = could not find container \"6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58\": container with ID starting with 6ab45cd166a7dd6c2538025f44263aac39737700acc57fcf1f2fa63c7e2a9a58 not found: ID does not exist" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.786238 4971 scope.go:117] "RemoveContainer" containerID="be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.787780 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f"} err="failed to get container status \"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f\": rpc error: code = NotFound desc = could not find container \"be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f\": container with ID starting with be4b8d6be11307aff10272a0bb0b21f9d7d01b3be3a712d23daa4eeb54de7a8f not found: ID does not exist" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.787810 4971 scope.go:117] "RemoveContainer" containerID="5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.799040 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.799621 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedda968-4c4c-4bea-a384-4695a8169a96" containerName="nova-manage" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.799719 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bedda968-4c4c-4bea-a384-4695a8169a96" containerName="nova-manage" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.799852 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6bf0dd-87ab-4d12-8d00-fd3907b69533" containerName="nova-cell1-conductor-db-sync" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.799957 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6bf0dd-87ab-4d12-8d00-fd3907b69533" containerName="nova-cell1-conductor-db-sync" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.800006 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" containerName="nova-metadata-metadata" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.800052 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" 
containerName="nova-metadata-metadata" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.800099 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9661ff0-007c-427d-a506-b8d44c915440" containerName="nova-api-api" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.800162 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9661ff0-007c-427d-a506-b8d44c915440" containerName="nova-api-api" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.800244 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9661ff0-007c-427d-a506-b8d44c915440" containerName="nova-api-log" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.800293 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9661ff0-007c-427d-a506-b8d44c915440" containerName="nova-api-log" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.800342 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" containerName="nova-metadata-log" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.800564 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" containerName="nova-metadata-log" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.800830 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6bf0dd-87ab-4d12-8d00-fd3907b69533" containerName="nova-cell1-conductor-db-sync" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.800912 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" containerName="nova-metadata-log" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.800966 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedda968-4c4c-4bea-a384-4695a8169a96" containerName="nova-manage" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.801046 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f9661ff0-007c-427d-a506-b8d44c915440" containerName="nova-api-api" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.801096 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" containerName="nova-metadata-metadata" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.801209 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9661ff0-007c-427d-a506-b8d44c915440" containerName="nova-api-log" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.802133 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.811234 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.813100 4971 scope.go:117] "RemoveContainer" containerID="2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.813711 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.814227 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.815894 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.824714 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.843870 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.852851 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.853989 4971 scope.go:117] "RemoveContainer" containerID="5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.854671 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da\": container with ID starting with 5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da not found: ID does not exist" containerID="5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.854723 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da"} err="failed to get container status \"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da\": rpc error: code = NotFound desc = could not find container \"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da\": container with ID starting with 5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da not found: ID does not exist" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.854744 4971 scope.go:117] 
"RemoveContainer" containerID="2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b" Mar 20 08:51:23 crc kubenswrapper[4971]: E0320 08:51:23.855172 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b\": container with ID starting with 2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b not found: ID does not exist" containerID="2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.855217 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b"} err="failed to get container status \"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b\": rpc error: code = NotFound desc = could not find container \"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b\": container with ID starting with 2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b not found: ID does not exist" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.855248 4971 scope.go:117] "RemoveContainer" containerID="5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.855710 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da"} err="failed to get container status \"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da\": rpc error: code = NotFound desc = could not find container \"5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da\": container with ID starting with 5f27e62e9db7779f9c687450c3ba9d29b63a65c435f01096e905df1c846b25da not found: ID does not exist" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.855732 4971 
scope.go:117] "RemoveContainer" containerID="2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.855980 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b"} err="failed to get container status \"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b\": rpc error: code = NotFound desc = could not find container \"2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b\": container with ID starting with 2a29dca8e0ba292edee04cd306de740683db4f1921ee5dadc8dd0813984d0a3b not found: ID does not exist" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.859876 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.861247 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.864172 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.871249 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.886453 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.941400 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zr5j\" (UniqueName: \"kubernetes.io/projected/18367358-fcaa-4258-8420-b842bce69483-kube-api-access-6zr5j\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.941462 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67c5f\" (UniqueName: \"kubernetes.io/projected/23472bad-7f81-42e7-9878-b2cbc945508a-kube-api-access-67c5f\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.941515 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.941718 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.941853 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18367358-fcaa-4258-8420-b842bce69483-logs\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.942003 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:23 crc kubenswrapper[4971]: I0320 08:51:23.942115 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-config-data\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.044640 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.045016 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-logs\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.045222 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18367358-fcaa-4258-8420-b842bce69483-logs\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.045588 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.045823 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-config-data\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " 
pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.045959 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.046151 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wdvj\" (UniqueName: \"kubernetes.io/projected/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-kube-api-access-5wdvj\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.046347 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zr5j\" (UniqueName: \"kubernetes.io/projected/18367358-fcaa-4258-8420-b842bce69483-kube-api-access-6zr5j\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.046029 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18367358-fcaa-4258-8420-b842bce69483-logs\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.046534 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67c5f\" (UniqueName: \"kubernetes.io/projected/23472bad-7f81-42e7-9878-b2cbc945508a-kube-api-access-67c5f\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.046878 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.046982 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-config-data\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.051263 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.051397 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-config-data\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.052859 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.052877 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.064893 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zr5j\" (UniqueName: \"kubernetes.io/projected/18367358-fcaa-4258-8420-b842bce69483-kube-api-access-6zr5j\") pod \"nova-metadata-0\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") " pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.075687 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67c5f\" (UniqueName: \"kubernetes.io/projected/23472bad-7f81-42e7-9878-b2cbc945508a-kube-api-access-67c5f\") pod \"nova-cell1-conductor-0\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.135716 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.148889 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.148945 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wdvj\" (UniqueName: \"kubernetes.io/projected/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-kube-api-access-5wdvj\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.148998 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-config-data\") pod 
\"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.149031 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-logs\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.149352 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-logs\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.151194 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.152865 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-config-data\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.166516 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.171648 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wdvj\" (UniqueName: \"kubernetes.io/projected/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-kube-api-access-5wdvj\") pod \"nova-api-0\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") " pod="openstack/nova-api-0" Mar 20 08:51:24 crc 
kubenswrapper[4971]: I0320 08:51:24.175244 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:51:24 crc kubenswrapper[4971]: W0320 08:51:24.669460 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18367358_fcaa_4258_8420_b842bce69483.slice/crio-a1f9b16460b577710ca8e046bf132893955aa152d8691ce41684e904193994fd WatchSource:0}: Error finding container a1f9b16460b577710ca8e046bf132893955aa152d8691ce41684e904193994fd: Status 404 returned error can't find the container with id a1f9b16460b577710ca8e046bf132893955aa152d8691ce41684e904193994fd Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.673382 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.723237 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18367358-fcaa-4258-8420-b842bce69483","Type":"ContainerStarted","Data":"a1f9b16460b577710ca8e046bf132893955aa152d8691ce41684e904193994fd"} Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.751896 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e54821f-4696-4669-92cf-d9af53f447fe" path="/var/lib/kubelet/pods/5e54821f-4696-4669-92cf-d9af53f447fe/volumes" Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.753190 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9661ff0-007c-427d-a506-b8d44c915440" path="/var/lib/kubelet/pods/f9661ff0-007c-427d-a506-b8d44c915440/volumes" Mar 20 08:51:24 crc kubenswrapper[4971]: W0320 08:51:24.758899 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cfd209c_cac4_4e7d_8c80_6ee1c3b7484c.slice/crio-8ae2785e0ce8607647b159efe3c56e7d0a7d92621e15cf7c8320b7c7314b7ed6 WatchSource:0}: Error finding container 
8ae2785e0ce8607647b159efe3c56e7d0a7d92621e15cf7c8320b7c7314b7ed6: Status 404 returned error can't find the container with id 8ae2785e0ce8607647b159efe3c56e7d0a7d92621e15cf7c8320b7c7314b7ed6 Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.765740 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:51:24 crc kubenswrapper[4971]: I0320 08:51:24.800294 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.139036 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.157398 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.193317 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.263673 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6649d5b6d7-wcsd7"] Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.271081 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" podUID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" containerName="dnsmasq-dns" containerID="cri-o://c09b73a25970cbbb8d7b4a4f2b3fb52cd2863104d696e951d58c0e5331ec2519" gracePeriod=10 Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.738827 4971 generic.go:334] "Generic (PLEG): container finished" podID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" containerID="c09b73a25970cbbb8d7b4a4f2b3fb52cd2863104d696e951d58c0e5331ec2519" exitCode=0 Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.738902 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" 
event={"ID":"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9","Type":"ContainerDied","Data":"c09b73a25970cbbb8d7b4a4f2b3fb52cd2863104d696e951d58c0e5331ec2519"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.741323 4971 generic.go:334] "Generic (PLEG): container finished" podID="b7be898e-b88b-4778-82e2-eaa341e838eb" containerID="f57d450671465e68aca75cef9ed3629f5047433caac4afa143514cb786ac66a6" exitCode=0 Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.741391 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7be898e-b88b-4778-82e2-eaa341e838eb","Type":"ContainerDied","Data":"f57d450671465e68aca75cef9ed3629f5047433caac4afa143514cb786ac66a6"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.741421 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7be898e-b88b-4778-82e2-eaa341e838eb","Type":"ContainerDied","Data":"45697abc7f2742de866eec8501dab5b2de1826cc06dcfcfa6a77e99c0bc62edf"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.741433 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45697abc7f2742de866eec8501dab5b2de1826cc06dcfcfa6a77e99c0bc62edf" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.743229 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c","Type":"ContainerStarted","Data":"31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.743259 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c","Type":"ContainerStarted","Data":"1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.743270 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c","Type":"ContainerStarted","Data":"8ae2785e0ce8607647b159efe3c56e7d0a7d92621e15cf7c8320b7c7314b7ed6"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.747814 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23472bad-7f81-42e7-9878-b2cbc945508a","Type":"ContainerStarted","Data":"63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.747858 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23472bad-7f81-42e7-9878-b2cbc945508a","Type":"ContainerStarted","Data":"691802c63e6a6fb32b88ee4a533c29858d2c9c515ed64af9d7b607e1a0e2d695"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.748492 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.759727 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18367358-fcaa-4258-8420-b842bce69483","Type":"ContainerStarted","Data":"873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.759766 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18367358-fcaa-4258-8420-b842bce69483","Type":"ContainerStarted","Data":"1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba"} Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.773172 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.773149856 podStartE2EDuration="2.773149856s" podCreationTimestamp="2026-03-20 08:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:25.765008833 +0000 UTC 
m=+7307.744882971" watchObservedRunningTime="2026-03-20 08:51:25.773149856 +0000 UTC m=+7307.753023994" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.773651 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.782773 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.783286 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.7832691 podStartE2EDuration="2.7832691s" podCreationTimestamp="2026-03-20 08:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:25.779542233 +0000 UTC m=+7307.759416371" watchObservedRunningTime="2026-03-20 08:51:25.7832691 +0000 UTC m=+7307.763143238" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.794360 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.825426 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.825404862 podStartE2EDuration="2.825404862s" podCreationTimestamp="2026-03-20 08:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:25.80505 +0000 UTC m=+7307.784924138" watchObservedRunningTime="2026-03-20 08:51:25.825404862 +0000 UTC m=+7307.805279000" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.886913 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-dns-svc\") pod \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.886969 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn9bn\" (UniqueName: \"kubernetes.io/projected/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-kube-api-access-bn9bn\") pod \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.887064 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-sb\") pod \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.887136 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-combined-ca-bundle\") pod \"b7be898e-b88b-4778-82e2-eaa341e838eb\" (UID: 
\"b7be898e-b88b-4778-82e2-eaa341e838eb\") " Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.887217 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-config-data\") pod \"b7be898e-b88b-4778-82e2-eaa341e838eb\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.887254 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-config\") pod \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.887295 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-nb\") pod \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\" (UID: \"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9\") " Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.887321 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn28h\" (UniqueName: \"kubernetes.io/projected/b7be898e-b88b-4778-82e2-eaa341e838eb-kube-api-access-sn28h\") pod \"b7be898e-b88b-4778-82e2-eaa341e838eb\" (UID: \"b7be898e-b88b-4778-82e2-eaa341e838eb\") " Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.893924 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-kube-api-access-bn9bn" (OuterVolumeSpecName: "kube-api-access-bn9bn") pod "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" (UID: "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9"). InnerVolumeSpecName "kube-api-access-bn9bn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.896960 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7be898e-b88b-4778-82e2-eaa341e838eb-kube-api-access-sn28h" (OuterVolumeSpecName: "kube-api-access-sn28h") pod "b7be898e-b88b-4778-82e2-eaa341e838eb" (UID: "b7be898e-b88b-4778-82e2-eaa341e838eb"). InnerVolumeSpecName "kube-api-access-sn28h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.940746 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-config-data" (OuterVolumeSpecName: "config-data") pod "b7be898e-b88b-4778-82e2-eaa341e838eb" (UID: "b7be898e-b88b-4778-82e2-eaa341e838eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.941647 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7be898e-b88b-4778-82e2-eaa341e838eb" (UID: "b7be898e-b88b-4778-82e2-eaa341e838eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.960222 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-config" (OuterVolumeSpecName: "config") pod "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" (UID: "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.960373 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" (UID: "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.968047 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" (UID: "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.978081 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" (UID: "b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.989778 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.989822 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.989835 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be898e-b88b-4778-82e2-eaa341e838eb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.989844 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.989854 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.989863 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn28h\" (UniqueName: \"kubernetes.io/projected/b7be898e-b88b-4778-82e2-eaa341e838eb-kube-api-access-sn28h\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.989876 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:25 crc kubenswrapper[4971]: I0320 08:51:25.989884 4971 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn9bn\" (UniqueName: \"kubernetes.io/projected/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9-kube-api-access-bn9bn\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.773074 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.773457 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" event={"ID":"b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9","Type":"ContainerDied","Data":"e24cd1ea7a7baeb09554130fcd6dfe044aad870a2f4186bdbcea5ccf7addda21"} Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.773474 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6649d5b6d7-wcsd7" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.773509 4971 scope.go:117] "RemoveContainer" containerID="c09b73a25970cbbb8d7b4a4f2b3fb52cd2863104d696e951d58c0e5331ec2519" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.817006 4971 scope.go:117] "RemoveContainer" containerID="276f47e60f666794f2d37d4f83e2e8566c68481aa03c94c594ab69dbe7bfa9d2" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.821305 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.838534 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.855245 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6649d5b6d7-wcsd7"] Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.865730 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6649d5b6d7-wcsd7"] Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.887974 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 20 08:51:26 crc kubenswrapper[4971]: E0320 08:51:26.888387 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" containerName="init" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.888404 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" containerName="init" Mar 20 08:51:26 crc kubenswrapper[4971]: E0320 08:51:26.888433 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" containerName="dnsmasq-dns" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.888446 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" containerName="dnsmasq-dns" Mar 20 08:51:26 crc kubenswrapper[4971]: E0320 08:51:26.888462 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7be898e-b88b-4778-82e2-eaa341e838eb" containerName="nova-scheduler-scheduler" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.888470 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7be898e-b88b-4778-82e2-eaa341e838eb" containerName="nova-scheduler-scheduler" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.888694 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" containerName="dnsmasq-dns" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.888720 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7be898e-b88b-4778-82e2-eaa341e838eb" containerName="nova-scheduler-scheduler" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.889392 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.893320 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 08:51:26 crc kubenswrapper[4971]: I0320 08:51:26.899855 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.009877 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-config-data\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.010205 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.010540 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh4ck\" (UniqueName: \"kubernetes.io/projected/abaad3d4-9e4c-4322-a544-07434f720dd6-kube-api-access-dh4ck\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.112815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.113013 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh4ck\" (UniqueName: \"kubernetes.io/projected/abaad3d4-9e4c-4322-a544-07434f720dd6-kube-api-access-dh4ck\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.113096 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-config-data\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.122193 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-config-data\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.126295 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.132029 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh4ck\" (UniqueName: \"kubernetes.io/projected/abaad3d4-9e4c-4322-a544-07434f720dd6-kube-api-access-dh4ck\") pod \"nova-scheduler-0\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.211648 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.703155 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:27 crc kubenswrapper[4971]: W0320 08:51:27.708879 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabaad3d4_9e4c_4322_a544_07434f720dd6.slice/crio-1b99920d02b692e93ad31c4d8184dce3e031a06f2ea5cee663ef3f1b4265187d WatchSource:0}: Error finding container 1b99920d02b692e93ad31c4d8184dce3e031a06f2ea5cee663ef3f1b4265187d: Status 404 returned error can't find the container with id 1b99920d02b692e93ad31c4d8184dce3e031a06f2ea5cee663ef3f1b4265187d Mar 20 08:51:27 crc kubenswrapper[4971]: I0320 08:51:27.783144 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abaad3d4-9e4c-4322-a544-07434f720dd6","Type":"ContainerStarted","Data":"1b99920d02b692e93ad31c4d8184dce3e031a06f2ea5cee663ef3f1b4265187d"} Mar 20 08:51:28 crc kubenswrapper[4971]: I0320 08:51:28.767290 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9" path="/var/lib/kubelet/pods/b176ec8e-03eb-4d7a-b10a-9b8b43ad2ef9/volumes" Mar 20 08:51:28 crc kubenswrapper[4971]: I0320 08:51:28.769166 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7be898e-b88b-4778-82e2-eaa341e838eb" path="/var/lib/kubelet/pods/b7be898e-b88b-4778-82e2-eaa341e838eb/volumes" Mar 20 08:51:28 crc kubenswrapper[4971]: I0320 08:51:28.799288 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abaad3d4-9e4c-4322-a544-07434f720dd6","Type":"ContainerStarted","Data":"6071165ef8838374e606949f925529455ddb71af42f07027d63afadd2aa41356"} Mar 20 08:51:28 crc kubenswrapper[4971]: I0320 08:51:28.840421 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.840395748 podStartE2EDuration="2.840395748s" podCreationTimestamp="2026-03-20 08:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:28.828258951 +0000 UTC m=+7310.808133129" watchObservedRunningTime="2026-03-20 08:51:28.840395748 +0000 UTC m=+7310.820269886" Mar 20 08:51:32 crc kubenswrapper[4971]: I0320 08:51:32.211778 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.152257 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.154306 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.175836 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.175899 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.178466 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.733031 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:51:34 crc kubenswrapper[4971]: E0320 08:51:34.734100 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.799849 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-9lwb2"] Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.801503 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.805921 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.806846 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.809154 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-9lwb2"] Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.962826 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-config-data\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.963166 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.963226 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-scripts\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:34 crc kubenswrapper[4971]: I0320 08:51:34.963245 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzlm2\" (UniqueName: \"kubernetes.io/projected/949d0358-d7fd-45a3-9c86-50b430830c78-kube-api-access-hzlm2\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.064959 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-config-data\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.065203 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.067082 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-scripts\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.067176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzlm2\" (UniqueName: 
\"kubernetes.io/projected/949d0358-d7fd-45a3-9c86-50b430830c78-kube-api-access-hzlm2\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.073235 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-config-data\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.074070 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-scripts\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.083990 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.084099 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzlm2\" (UniqueName: \"kubernetes.io/projected/949d0358-d7fd-45a3-9c86-50b430830c78-kube-api-access-hzlm2\") pod \"nova-cell1-cell-mapping-9lwb2\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.131975 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.240288 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.126:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.324015 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.127:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.324411 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.126:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.324591 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.127:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:51:35 crc kubenswrapper[4971]: W0320 08:51:35.532291 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod949d0358_d7fd_45a3_9c86_50b430830c78.slice/crio-1ed6963d6c8d5927881e35671027542f54391b04f31d5cd8796fe3af8ba8a686 WatchSource:0}: Error finding container 1ed6963d6c8d5927881e35671027542f54391b04f31d5cd8796fe3af8ba8a686: Status 404 returned error can't find the 
container with id 1ed6963d6c8d5927881e35671027542f54391b04f31d5cd8796fe3af8ba8a686 Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.545066 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-9lwb2"] Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.878425 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9lwb2" event={"ID":"949d0358-d7fd-45a3-9c86-50b430830c78","Type":"ContainerStarted","Data":"aa5baf9afd5b7d1411c3fc977d9855538f13f336bb7e4b82ac9bc989cc27b547"} Mar 20 08:51:35 crc kubenswrapper[4971]: I0320 08:51:35.878791 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9lwb2" event={"ID":"949d0358-d7fd-45a3-9c86-50b430830c78","Type":"ContainerStarted","Data":"1ed6963d6c8d5927881e35671027542f54391b04f31d5cd8796fe3af8ba8a686"} Mar 20 08:51:37 crc kubenswrapper[4971]: I0320 08:51:37.213137 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 08:51:37 crc kubenswrapper[4971]: I0320 08:51:37.243754 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 08:51:37 crc kubenswrapper[4971]: I0320 08:51:37.268228 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-9lwb2" podStartSLOduration=3.2682034189999998 podStartE2EDuration="3.268203419s" podCreationTimestamp="2026-03-20 08:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:35.905487697 +0000 UTC m=+7317.885361835" watchObservedRunningTime="2026-03-20 08:51:37.268203419 +0000 UTC m=+7319.248077567" Mar 20 08:51:37 crc kubenswrapper[4971]: I0320 08:51:37.946531 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 08:51:40 crc 
kubenswrapper[4971]: I0320 08:51:40.960336 4971 generic.go:334] "Generic (PLEG): container finished" podID="949d0358-d7fd-45a3-9c86-50b430830c78" containerID="aa5baf9afd5b7d1411c3fc977d9855538f13f336bb7e4b82ac9bc989cc27b547" exitCode=0 Mar 20 08:51:40 crc kubenswrapper[4971]: I0320 08:51:40.960452 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9lwb2" event={"ID":"949d0358-d7fd-45a3-9c86-50b430830c78","Type":"ContainerDied","Data":"aa5baf9afd5b7d1411c3fc977d9855538f13f336bb7e4b82ac9bc989cc27b547"} Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.151839 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.152633 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.175947 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.176322 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.314820 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.420245 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-config-data\") pod \"949d0358-d7fd-45a3-9c86-50b430830c78\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.420338 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzlm2\" (UniqueName: \"kubernetes.io/projected/949d0358-d7fd-45a3-9c86-50b430830c78-kube-api-access-hzlm2\") pod \"949d0358-d7fd-45a3-9c86-50b430830c78\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.420366 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-scripts\") pod \"949d0358-d7fd-45a3-9c86-50b430830c78\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.420421 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-combined-ca-bundle\") pod \"949d0358-d7fd-45a3-9c86-50b430830c78\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.629981 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-scripts" (OuterVolumeSpecName: "scripts") pod "949d0358-d7fd-45a3-9c86-50b430830c78" (UID: "949d0358-d7fd-45a3-9c86-50b430830c78"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.630055 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d0358-d7fd-45a3-9c86-50b430830c78-kube-api-access-hzlm2" (OuterVolumeSpecName: "kube-api-access-hzlm2") pod "949d0358-d7fd-45a3-9c86-50b430830c78" (UID: "949d0358-d7fd-45a3-9c86-50b430830c78"). InnerVolumeSpecName "kube-api-access-hzlm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.630335 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzlm2\" (UniqueName: \"kubernetes.io/projected/949d0358-d7fd-45a3-9c86-50b430830c78-kube-api-access-hzlm2\") pod \"949d0358-d7fd-45a3-9c86-50b430830c78\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.630375 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-scripts\") pod \"949d0358-d7fd-45a3-9c86-50b430830c78\" (UID: \"949d0358-d7fd-45a3-9c86-50b430830c78\") " Mar 20 08:51:42 crc kubenswrapper[4971]: W0320 08:51:42.630447 4971 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/949d0358-d7fd-45a3-9c86-50b430830c78/volumes/kubernetes.io~projected/kube-api-access-hzlm2 Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.630474 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949d0358-d7fd-45a3-9c86-50b430830c78-kube-api-access-hzlm2" (OuterVolumeSpecName: "kube-api-access-hzlm2") pod "949d0358-d7fd-45a3-9c86-50b430830c78" (UID: "949d0358-d7fd-45a3-9c86-50b430830c78"). InnerVolumeSpecName "kube-api-access-hzlm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:42 crc kubenswrapper[4971]: W0320 08:51:42.630618 4971 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/949d0358-d7fd-45a3-9c86-50b430830c78/volumes/kubernetes.io~secret/scripts Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.630642 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-scripts" (OuterVolumeSpecName: "scripts") pod "949d0358-d7fd-45a3-9c86-50b430830c78" (UID: "949d0358-d7fd-45a3-9c86-50b430830c78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.630878 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzlm2\" (UniqueName: \"kubernetes.io/projected/949d0358-d7fd-45a3-9c86-50b430830c78-kube-api-access-hzlm2\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.630895 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.639797 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-config-data" (OuterVolumeSpecName: "config-data") pod "949d0358-d7fd-45a3-9c86-50b430830c78" (UID: "949d0358-d7fd-45a3-9c86-50b430830c78"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.640340 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "949d0358-d7fd-45a3-9c86-50b430830c78" (UID: "949d0358-d7fd-45a3-9c86-50b430830c78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.733294 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.733323 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949d0358-d7fd-45a3-9c86-50b430830c78-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.984692 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9lwb2" event={"ID":"949d0358-d7fd-45a3-9c86-50b430830c78","Type":"ContainerDied","Data":"1ed6963d6c8d5927881e35671027542f54391b04f31d5cd8796fe3af8ba8a686"} Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.984734 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9lwb2" Mar 20 08:51:42 crc kubenswrapper[4971]: I0320 08:51:42.984749 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ed6963d6c8d5927881e35671027542f54391b04f31d5cd8796fe3af8ba8a686" Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.208215 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.242813 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.243031 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="abaad3d4-9e4c-4322-a544-07434f720dd6" containerName="nova-scheduler-scheduler" containerID="cri-o://6071165ef8838374e606949f925529455ddb71af42f07027d63afadd2aa41356" gracePeriod=30 Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.342072 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.995067 4971 generic.go:334] "Generic (PLEG): container finished" podID="abaad3d4-9e4c-4322-a544-07434f720dd6" containerID="6071165ef8838374e606949f925529455ddb71af42f07027d63afadd2aa41356" exitCode=0 Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.995142 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abaad3d4-9e4c-4322-a544-07434f720dd6","Type":"ContainerDied","Data":"6071165ef8838374e606949f925529455ddb71af42f07027d63afadd2aa41356"} Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.995569 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-log" containerID="cri-o://1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11" gracePeriod=30 Mar 
20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.995666 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-api" containerID="cri-o://31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31" gracePeriod=30 Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.995736 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-log" containerID="cri-o://1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba" gracePeriod=30 Mar 20 08:51:43 crc kubenswrapper[4971]: I0320 08:51:43.995846 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-metadata" containerID="cri-o://873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177" gracePeriod=30 Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.191059 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.259688 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-config-data\") pod \"abaad3d4-9e4c-4322-a544-07434f720dd6\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.259760 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-combined-ca-bundle\") pod \"abaad3d4-9e4c-4322-a544-07434f720dd6\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.259863 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh4ck\" (UniqueName: \"kubernetes.io/projected/abaad3d4-9e4c-4322-a544-07434f720dd6-kube-api-access-dh4ck\") pod \"abaad3d4-9e4c-4322-a544-07434f720dd6\" (UID: \"abaad3d4-9e4c-4322-a544-07434f720dd6\") " Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.265137 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abaad3d4-9e4c-4322-a544-07434f720dd6-kube-api-access-dh4ck" (OuterVolumeSpecName: "kube-api-access-dh4ck") pod "abaad3d4-9e4c-4322-a544-07434f720dd6" (UID: "abaad3d4-9e4c-4322-a544-07434f720dd6"). InnerVolumeSpecName "kube-api-access-dh4ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.285645 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-config-data" (OuterVolumeSpecName: "config-data") pod "abaad3d4-9e4c-4322-a544-07434f720dd6" (UID: "abaad3d4-9e4c-4322-a544-07434f720dd6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.286979 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abaad3d4-9e4c-4322-a544-07434f720dd6" (UID: "abaad3d4-9e4c-4322-a544-07434f720dd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.364007 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.364348 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abaad3d4-9e4c-4322-a544-07434f720dd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:44 crc kubenswrapper[4971]: I0320 08:51:44.364363 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh4ck\" (UniqueName: \"kubernetes.io/projected/abaad3d4-9e4c-4322-a544-07434f720dd6-kube-api-access-dh4ck\") on node \"crc\" DevicePath \"\"" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.006304 4971 generic.go:334] "Generic (PLEG): container finished" podID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerID="1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11" exitCode=143 Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.006371 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c","Type":"ContainerDied","Data":"1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11"} Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.008113 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.008129 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abaad3d4-9e4c-4322-a544-07434f720dd6","Type":"ContainerDied","Data":"1b99920d02b692e93ad31c4d8184dce3e031a06f2ea5cee663ef3f1b4265187d"} Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.008206 4971 scope.go:117] "RemoveContainer" containerID="6071165ef8838374e606949f925529455ddb71af42f07027d63afadd2aa41356" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.010899 4971 generic.go:334] "Generic (PLEG): container finished" podID="18367358-fcaa-4258-8420-b842bce69483" containerID="1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba" exitCode=143 Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.010937 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18367358-fcaa-4258-8420-b842bce69483","Type":"ContainerDied","Data":"1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba"} Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.038473 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.060183 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.077441 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:45 crc kubenswrapper[4971]: E0320 08:51:45.077945 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949d0358-d7fd-45a3-9c86-50b430830c78" containerName="nova-manage" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.077968 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="949d0358-d7fd-45a3-9c86-50b430830c78" containerName="nova-manage" Mar 20 08:51:45 crc kubenswrapper[4971]: E0320 08:51:45.078005 
4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abaad3d4-9e4c-4322-a544-07434f720dd6" containerName="nova-scheduler-scheduler" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.078013 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="abaad3d4-9e4c-4322-a544-07434f720dd6" containerName="nova-scheduler-scheduler" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.078244 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="abaad3d4-9e4c-4322-a544-07434f720dd6" containerName="nova-scheduler-scheduler" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.078274 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="949d0358-d7fd-45a3-9c86-50b430830c78" containerName="nova-manage" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.079134 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.083925 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.089481 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.179211 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj92n\" (UniqueName: \"kubernetes.io/projected/1448e032-bee2-4ebb-b6bc-9b62539fea32-kube-api-access-pj92n\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.179446 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-config-data\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " 
pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.179482 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.291586 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj92n\" (UniqueName: \"kubernetes.io/projected/1448e032-bee2-4ebb-b6bc-9b62539fea32-kube-api-access-pj92n\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.292250 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-config-data\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.292341 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.297849 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " pod="openstack/nova-scheduler-0" Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.302379 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-config-data\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " pod="openstack/nova-scheduler-0"
Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.313013 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj92n\" (UniqueName: \"kubernetes.io/projected/1448e032-bee2-4ebb-b6bc-9b62539fea32-kube-api-access-pj92n\") pod \"nova-scheduler-0\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " pod="openstack/nova-scheduler-0"
Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.406477 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 08:51:45 crc kubenswrapper[4971]: I0320 08:51:45.925481 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 08:51:45 crc kubenswrapper[4971]: W0320 08:51:45.930923 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1448e032_bee2_4ebb_b6bc_9b62539fea32.slice/crio-2cb6ec783813ce6a0dc9aa7d47c53ee959ccd88021583a73df9c07cb9b8bbcfd WatchSource:0}: Error finding container 2cb6ec783813ce6a0dc9aa7d47c53ee959ccd88021583a73df9c07cb9b8bbcfd: Status 404 returned error can't find the container with id 2cb6ec783813ce6a0dc9aa7d47c53ee959ccd88021583a73df9c07cb9b8bbcfd
Mar 20 08:51:46 crc kubenswrapper[4971]: I0320 08:51:46.023826 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1448e032-bee2-4ebb-b6bc-9b62539fea32","Type":"ContainerStarted","Data":"2cb6ec783813ce6a0dc9aa7d47c53ee959ccd88021583a73df9c07cb9b8bbcfd"}
Mar 20 08:51:46 crc kubenswrapper[4971]: I0320 08:51:46.752134 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abaad3d4-9e4c-4322-a544-07434f720dd6" path="/var/lib/kubelet/pods/abaad3d4-9e4c-4322-a544-07434f720dd6/volumes"
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.035082 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1448e032-bee2-4ebb-b6bc-9b62539fea32","Type":"ContainerStarted","Data":"1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51"}
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.056095 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.056075764 podStartE2EDuration="2.056075764s" podCreationTimestamp="2026-03-20 08:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:47.054361699 +0000 UTC m=+7329.034235857" watchObservedRunningTime="2026-03-20 08:51:47.056075764 +0000 UTC m=+7329.035949902"
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.593381 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.604421 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.740470 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-config-data\") pod \"18367358-fcaa-4258-8420-b842bce69483\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") "
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.740625 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18367358-fcaa-4258-8420-b842bce69483-logs\") pod \"18367358-fcaa-4258-8420-b842bce69483\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") "
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.740649 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zr5j\" (UniqueName: \"kubernetes.io/projected/18367358-fcaa-4258-8420-b842bce69483-kube-api-access-6zr5j\") pod \"18367358-fcaa-4258-8420-b842bce69483\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") "
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.740665 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-combined-ca-bundle\") pod \"18367358-fcaa-4258-8420-b842bce69483\" (UID: \"18367358-fcaa-4258-8420-b842bce69483\") "
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.740727 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-logs\") pod \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") "
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.740745 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wdvj\" (UniqueName: \"kubernetes.io/projected/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-kube-api-access-5wdvj\") pod \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") "
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.740758 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-config-data\") pod \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") "
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.740785 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-combined-ca-bundle\") pod \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\" (UID: \"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c\") "
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.743068 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18367358-fcaa-4258-8420-b842bce69483-logs" (OuterVolumeSpecName: "logs") pod "18367358-fcaa-4258-8420-b842bce69483" (UID: "18367358-fcaa-4258-8420-b842bce69483"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.743165 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-logs" (OuterVolumeSpecName: "logs") pod "5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" (UID: "5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.746668 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18367358-fcaa-4258-8420-b842bce69483-kube-api-access-6zr5j" (OuterVolumeSpecName: "kube-api-access-6zr5j") pod "18367358-fcaa-4258-8420-b842bce69483" (UID: "18367358-fcaa-4258-8420-b842bce69483"). InnerVolumeSpecName "kube-api-access-6zr5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.761104 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-kube-api-access-5wdvj" (OuterVolumeSpecName: "kube-api-access-5wdvj") pod "5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" (UID: "5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c"). InnerVolumeSpecName "kube-api-access-5wdvj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.767762 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18367358-fcaa-4258-8420-b842bce69483" (UID: "18367358-fcaa-4258-8420-b842bce69483"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.768230 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-config-data" (OuterVolumeSpecName: "config-data") pod "5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" (UID: "5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.769425 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" (UID: "5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.797766 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-config-data" (OuterVolumeSpecName: "config-data") pod "18367358-fcaa-4258-8420-b842bce69483" (UID: "18367358-fcaa-4258-8420-b842bce69483"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.842994 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18367358-fcaa-4258-8420-b842bce69483-logs\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.843038 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zr5j\" (UniqueName: \"kubernetes.io/projected/18367358-fcaa-4258-8420-b842bce69483-kube-api-access-6zr5j\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.843052 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.843066 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-logs\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.843079 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wdvj\" (UniqueName: \"kubernetes.io/projected/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-kube-api-access-5wdvj\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.843092 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.843106 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:47 crc kubenswrapper[4971]: I0320 08:51:47.843120 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18367358-fcaa-4258-8420-b842bce69483-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.052399 4971 generic.go:334] "Generic (PLEG): container finished" podID="18367358-fcaa-4258-8420-b842bce69483" containerID="873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177" exitCode=0
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.052490 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18367358-fcaa-4258-8420-b842bce69483","Type":"ContainerDied","Data":"873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177"}
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.052554 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.052974 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"18367358-fcaa-4258-8420-b842bce69483","Type":"ContainerDied","Data":"a1f9b16460b577710ca8e046bf132893955aa152d8691ce41684e904193994fd"}
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.053135 4971 scope.go:117] "RemoveContainer" containerID="873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.057766 4971 generic.go:334] "Generic (PLEG): container finished" podID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerID="31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31" exitCode=0
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.057835 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c","Type":"ContainerDied","Data":"31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31"}
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.057919 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c","Type":"ContainerDied","Data":"8ae2785e0ce8607647b159efe3c56e7d0a7d92621e15cf7c8320b7c7314b7ed6"}
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.057930 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.092988 4971 scope.go:117] "RemoveContainer" containerID="1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.132225 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.147929 4971 scope.go:117] "RemoveContainer" containerID="873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177"
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.150080 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177\": container with ID starting with 873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177 not found: ID does not exist" containerID="873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.150144 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177"} err="failed to get container status \"873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177\": rpc error: code = NotFound desc = could not find container \"873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177\": container with ID starting with 873782db7ee53b6c5c4609f01795e7157343a4970c004ee32964a53e04f77177 not found: ID does not exist"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.150180 4971 scope.go:117] "RemoveContainer" containerID="1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.155710 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.155759 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba\": container with ID starting with 1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba not found: ID does not exist" containerID="1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.155793 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba"} err="failed to get container status \"1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba\": rpc error: code = NotFound desc = could not find container \"1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba\": container with ID starting with 1fc4eb3d2d3a0e0e836c1f6b2a279ed404addf03efd01c700cf55dccb270afba not found: ID does not exist"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.155822 4971 scope.go:117] "RemoveContainer" containerID="31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.175454 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.188187 4971 scope.go:117] "RemoveContainer" containerID="1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.196822 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.197471 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-api"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.197485 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-api"
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.197527 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-log"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.197534 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-log"
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.197552 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-log"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.197558 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-log"
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.197625 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-metadata"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.197633 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-metadata"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.197785 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-log"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.197801 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" containerName="nova-api-api"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.197817 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-log"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.197828 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="18367358-fcaa-4258-8420-b842bce69483" containerName="nova-metadata-metadata"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.198848 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.201202 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.209800 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.213204 4971 scope.go:117] "RemoveContainer" containerID="31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31"
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.218346 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31\": container with ID starting with 31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31 not found: ID does not exist" containerID="31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.218397 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31"} err="failed to get container status \"31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31\": rpc error: code = NotFound desc = could not find container \"31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31\": container with ID starting with 31f01dbdcd12c71cea8619d50ff64284549a31f4943f10e0852af30b15238a31 not found: ID does not exist"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.218443 4971 scope.go:117] "RemoveContainer" containerID="1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11"
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.220288 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11\": container with ID starting with 1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11 not found: ID does not exist" containerID="1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.220320 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11"} err="failed to get container status \"1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11\": rpc error: code = NotFound desc = could not find container \"1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11\": container with ID starting with 1e15f0b9e557e8147a67e4bdaef7e94f1d374d5267f40d9cd6478452de50ca11 not found: ID does not exist"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.220506 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.229650 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.231933 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.234126 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.237393 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.258036 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-config-data\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.258076 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.258124 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5l6l\" (UniqueName: \"kubernetes.io/projected/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-kube-api-access-z5l6l\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.258420 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-logs\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.360478 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-config-data\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.360540 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.360596 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b74jw\" (UniqueName: \"kubernetes.io/projected/42569610-bca3-4909-ad7c-0ad057f7cfe7-kube-api-access-b74jw\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.360640 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5l6l\" (UniqueName: \"kubernetes.io/projected/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-kube-api-access-z5l6l\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.360675 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.360728 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-config-data\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.360841 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-logs\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.360880 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42569610-bca3-4909-ad7c-0ad057f7cfe7-logs\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.361429 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-logs\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.375345 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.375400 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-config-data\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.390082 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5l6l\" (UniqueName: \"kubernetes.io/projected/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-kube-api-access-z5l6l\") pod \"nova-api-0\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.463480 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42569610-bca3-4909-ad7c-0ad057f7cfe7-logs\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.464221 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42569610-bca3-4909-ad7c-0ad057f7cfe7-logs\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.464329 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b74jw\" (UniqueName: \"kubernetes.io/projected/42569610-bca3-4909-ad7c-0ad057f7cfe7-kube-api-access-b74jw\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.464431 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.464538 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-config-data\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.470056 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-config-data\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.470701 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.490712 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b74jw\" (UniqueName: \"kubernetes.io/projected/42569610-bca3-4909-ad7c-0ad057f7cfe7-kube-api-access-b74jw\") pod \"nova-metadata-0\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.519478 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.551232 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.737145 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4"
Mar 20 08:51:48 crc kubenswrapper[4971]: E0320 08:51:48.737475 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.744692 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18367358-fcaa-4258-8420-b842bce69483" path="/var/lib/kubelet/pods/18367358-fcaa-4258-8420-b842bce69483/volumes"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.745579 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c" path="/var/lib/kubelet/pods/5cfd209c-cac4-4e7d-8c80-6ee1c3b7484c/volumes"
Mar 20 08:51:48 crc kubenswrapper[4971]: I0320 08:51:48.983402 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 08:51:48 crc kubenswrapper[4971]: W0320 08:51:48.983765 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b8ee1bc_f59c_49b5_8679_3411e0d64db3.slice/crio-a9d07ff0db1ee1f9121dd29989916e97980c543bea6fc97d9582258fc996aa1b WatchSource:0}: Error finding container a9d07ff0db1ee1f9121dd29989916e97980c543bea6fc97d9582258fc996aa1b: Status 404 returned error can't find the container with id a9d07ff0db1ee1f9121dd29989916e97980c543bea6fc97d9582258fc996aa1b
Mar 20 08:51:49 crc kubenswrapper[4971]: I0320 08:51:49.065646 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 08:51:49 crc kubenswrapper[4971]: I0320 08:51:49.073965 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b8ee1bc-f59c-49b5-8679-3411e0d64db3","Type":"ContainerStarted","Data":"a9d07ff0db1ee1f9121dd29989916e97980c543bea6fc97d9582258fc996aa1b"}
Mar 20 08:51:49 crc kubenswrapper[4971]: W0320 08:51:49.081508 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42569610_bca3_4909_ad7c_0ad057f7cfe7.slice/crio-61bd637926349633be742b9ee719ea7a13792d9fb8c467ddd91cce705de359b7 WatchSource:0}: Error finding container 61bd637926349633be742b9ee719ea7a13792d9fb8c467ddd91cce705de359b7: Status 404 returned error can't find the container with id 61bd637926349633be742b9ee719ea7a13792d9fb8c467ddd91cce705de359b7
Mar 20 08:51:50 crc kubenswrapper[4971]: I0320 08:51:50.087996 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b8ee1bc-f59c-49b5-8679-3411e0d64db3","Type":"ContainerStarted","Data":"d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c"}
Mar 20 08:51:50 crc kubenswrapper[4971]: I0320 08:51:50.088438 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b8ee1bc-f59c-49b5-8679-3411e0d64db3","Type":"ContainerStarted","Data":"002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562"}
Mar 20 08:51:50 crc kubenswrapper[4971]: I0320 08:51:50.093240 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42569610-bca3-4909-ad7c-0ad057f7cfe7","Type":"ContainerStarted","Data":"248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef"}
Mar 20 08:51:50 crc kubenswrapper[4971]: I0320 08:51:50.093277 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42569610-bca3-4909-ad7c-0ad057f7cfe7","Type":"ContainerStarted","Data":"18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472"}
Mar 20 08:51:50 crc kubenswrapper[4971]: I0320 08:51:50.093287 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42569610-bca3-4909-ad7c-0ad057f7cfe7","Type":"ContainerStarted","Data":"61bd637926349633be742b9ee719ea7a13792d9fb8c467ddd91cce705de359b7"}
Mar 20 08:51:50 crc kubenswrapper[4971]: I0320 08:51:50.116340 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.116317724 podStartE2EDuration="2.116317724s" podCreationTimestamp="2026-03-20 08:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:50.11502506 +0000 UTC m=+7332.094899208" watchObservedRunningTime="2026-03-20 08:51:50.116317724 +0000 UTC m=+7332.096191862"
Mar 20 08:51:50 crc kubenswrapper[4971]: I0320 08:51:50.144936 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.144916002 podStartE2EDuration="2.144916002s" podCreationTimestamp="2026-03-20 08:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:50.137477277 +0000 UTC m=+7332.117351415" watchObservedRunningTime="2026-03-20 08:51:50.144916002 +0000 UTC m=+7332.124790140"
Mar 20 08:51:50 crc kubenswrapper[4971]: I0320 08:51:50.406963 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 08:51:55 crc kubenswrapper[4971]: I0320 08:51:55.407336 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 08:51:55 crc kubenswrapper[4971]: I0320 08:51:55.433319 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 08:51:56 crc kubenswrapper[4971]: I0320 08:51:56.198698 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 08:51:58 crc kubenswrapper[4971]: I0320 08:51:58.520759 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 08:51:58 crc kubenswrapper[4971]: I0320 08:51:58.520809 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 08:51:58 crc kubenswrapper[4971]: I0320 08:51:58.552364 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 08:51:58 crc kubenswrapper[4971]: I0320 08:51:58.552526 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 08:51:59 crc kubenswrapper[4971]: I0320 08:51:59.684917 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.132:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 08:51:59 crc kubenswrapper[4971]: I0320 08:51:59.684975 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.131:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 08:51:59 crc kubenswrapper[4971]: I0320 08:51:59.685042 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.132:8775/\": context deadline exceeded (Client.Timeout
exceeded while awaiting headers)" Mar 20 08:51:59 crc kubenswrapper[4971]: I0320 08:51:59.685047 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.131:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.165288 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566612-z2km9"] Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.166685 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566612-z2km9" Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.169879 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.170401 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.170908 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.183741 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566612-z2km9"] Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.204418 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fdxt\" (UniqueName: \"kubernetes.io/projected/c0c478ce-43da-4806-b7ad-29e39c5dcc3d-kube-api-access-4fdxt\") pod \"auto-csr-approver-29566612-z2km9\" (UID: \"c0c478ce-43da-4806-b7ad-29e39c5dcc3d\") " pod="openshift-infra/auto-csr-approver-29566612-z2km9" Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.305502 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4fdxt\" (UniqueName: \"kubernetes.io/projected/c0c478ce-43da-4806-b7ad-29e39c5dcc3d-kube-api-access-4fdxt\") pod \"auto-csr-approver-29566612-z2km9\" (UID: \"c0c478ce-43da-4806-b7ad-29e39c5dcc3d\") " pod="openshift-infra/auto-csr-approver-29566612-z2km9" Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.343199 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fdxt\" (UniqueName: \"kubernetes.io/projected/c0c478ce-43da-4806-b7ad-29e39c5dcc3d-kube-api-access-4fdxt\") pod \"auto-csr-approver-29566612-z2km9\" (UID: \"c0c478ce-43da-4806-b7ad-29e39c5dcc3d\") " pod="openshift-infra/auto-csr-approver-29566612-z2km9" Mar 20 08:52:00 crc kubenswrapper[4971]: I0320 08:52:00.516832 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566612-z2km9" Mar 20 08:52:01 crc kubenswrapper[4971]: I0320 08:52:01.002256 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566612-z2km9"] Mar 20 08:52:01 crc kubenswrapper[4971]: I0320 08:52:01.225475 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566612-z2km9" event={"ID":"c0c478ce-43da-4806-b7ad-29e39c5dcc3d","Type":"ContainerStarted","Data":"019a0bfa04ff9512e4c1f223c952303a9d65e4a04d4f10b1159ca441c17a28ac"} Mar 20 08:52:02 crc kubenswrapper[4971]: I0320 08:52:02.732452 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:52:03 crc kubenswrapper[4971]: I0320 08:52:03.243879 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"d3a14bb739a0b2e954975f7532ea81560227f071f1b801e07dff179ccac02bd7"} Mar 20 08:52:03 crc kubenswrapper[4971]: I0320 
08:52:03.248380 4971 generic.go:334] "Generic (PLEG): container finished" podID="c0c478ce-43da-4806-b7ad-29e39c5dcc3d" containerID="97987dcf1e2645cd3642dee79f84a45fc99d268530b4ae0b4e140fcf3cce45b8" exitCode=0 Mar 20 08:52:03 crc kubenswrapper[4971]: I0320 08:52:03.248435 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566612-z2km9" event={"ID":"c0c478ce-43da-4806-b7ad-29e39c5dcc3d","Type":"ContainerDied","Data":"97987dcf1e2645cd3642dee79f84a45fc99d268530b4ae0b4e140fcf3cce45b8"} Mar 20 08:52:04 crc kubenswrapper[4971]: I0320 08:52:04.598310 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566612-z2km9" Mar 20 08:52:04 crc kubenswrapper[4971]: I0320 08:52:04.694522 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fdxt\" (UniqueName: \"kubernetes.io/projected/c0c478ce-43da-4806-b7ad-29e39c5dcc3d-kube-api-access-4fdxt\") pod \"c0c478ce-43da-4806-b7ad-29e39c5dcc3d\" (UID: \"c0c478ce-43da-4806-b7ad-29e39c5dcc3d\") " Mar 20 08:52:04 crc kubenswrapper[4971]: I0320 08:52:04.699939 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c478ce-43da-4806-b7ad-29e39c5dcc3d-kube-api-access-4fdxt" (OuterVolumeSpecName: "kube-api-access-4fdxt") pod "c0c478ce-43da-4806-b7ad-29e39c5dcc3d" (UID: "c0c478ce-43da-4806-b7ad-29e39c5dcc3d"). InnerVolumeSpecName "kube-api-access-4fdxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:04 crc kubenswrapper[4971]: I0320 08:52:04.797155 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fdxt\" (UniqueName: \"kubernetes.io/projected/c0c478ce-43da-4806-b7ad-29e39c5dcc3d-kube-api-access-4fdxt\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:05 crc kubenswrapper[4971]: I0320 08:52:05.267813 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566612-z2km9" event={"ID":"c0c478ce-43da-4806-b7ad-29e39c5dcc3d","Type":"ContainerDied","Data":"019a0bfa04ff9512e4c1f223c952303a9d65e4a04d4f10b1159ca441c17a28ac"} Mar 20 08:52:05 crc kubenswrapper[4971]: I0320 08:52:05.268074 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="019a0bfa04ff9512e4c1f223c952303a9d65e4a04d4f10b1159ca441c17a28ac" Mar 20 08:52:05 crc kubenswrapper[4971]: I0320 08:52:05.267907 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566612-z2km9" Mar 20 08:52:05 crc kubenswrapper[4971]: I0320 08:52:05.684538 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566606-5k4mc"] Mar 20 08:52:05 crc kubenswrapper[4971]: I0320 08:52:05.691437 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566606-5k4mc"] Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.520762 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.520860 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.551281 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.551552 4971 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.648373 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xncqk"] Mar 20 08:52:06 crc kubenswrapper[4971]: E0320 08:52:06.648847 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c478ce-43da-4806-b7ad-29e39c5dcc3d" containerName="oc" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.648865 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c478ce-43da-4806-b7ad-29e39c5dcc3d" containerName="oc" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.649057 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c478ce-43da-4806-b7ad-29e39c5dcc3d" containerName="oc" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.650454 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.686210 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xncqk"] Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.731319 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-catalog-content\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.731440 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-utilities\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 
20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.731553 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np4xp\" (UniqueName: \"kubernetes.io/projected/d5e692dd-45d1-4384-b888-3958c4e970c9-kube-api-access-np4xp\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.743191 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13baff6-c4ad-4c29-bb53-d0931ca962b6" path="/var/lib/kubelet/pods/b13baff6-c4ad-4c29-bb53-d0931ca962b6/volumes" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.833498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-utilities\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.833635 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np4xp\" (UniqueName: \"kubernetes.io/projected/d5e692dd-45d1-4384-b888-3958c4e970c9-kube-api-access-np4xp\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.833683 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-catalog-content\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.834115 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-utilities\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.834333 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-catalog-content\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.854087 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np4xp\" (UniqueName: \"kubernetes.io/projected/d5e692dd-45d1-4384-b888-3958c4e970c9-kube-api-access-np4xp\") pod \"community-operators-xncqk\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:06 crc kubenswrapper[4971]: I0320 08:52:06.976499 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:07 crc kubenswrapper[4971]: I0320 08:52:07.273127 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xncqk"] Mar 20 08:52:07 crc kubenswrapper[4971]: W0320 08:52:07.289763 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5e692dd_45d1_4384_b888_3958c4e970c9.slice/crio-13d600120a2b38064bbf85b14e2ec1b7a83214d2956e1a65cc02d07aa3119ec9 WatchSource:0}: Error finding container 13d600120a2b38064bbf85b14e2ec1b7a83214d2956e1a65cc02d07aa3119ec9: Status 404 returned error can't find the container with id 13d600120a2b38064bbf85b14e2ec1b7a83214d2956e1a65cc02d07aa3119ec9 Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.310570 4971 generic.go:334] "Generic (PLEG): container finished" podID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerID="3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee" exitCode=0 Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.310683 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xncqk" event={"ID":"d5e692dd-45d1-4384-b888-3958c4e970c9","Type":"ContainerDied","Data":"3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee"} Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.310745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xncqk" event={"ID":"d5e692dd-45d1-4384-b888-3958c4e970c9","Type":"ContainerStarted","Data":"13d600120a2b38064bbf85b14e2ec1b7a83214d2956e1a65cc02d07aa3119ec9"} Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.525740 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.526141 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.529865 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.529932 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.555255 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.563479 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.563589 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.709648 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d9d566f97-dtk5g"] Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.711648 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.728334 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9d566f97-dtk5g"] Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.787930 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.788006 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-config\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.788093 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-dns-svc\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.788118 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8d6z\" (UniqueName: \"kubernetes.io/projected/f112ece8-c233-4f7d-87bb-4634c0bc3582-kube-api-access-m8d6z\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.788136 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.889516 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.889581 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-config\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.889690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-dns-svc\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.889717 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8d6z\" (UniqueName: \"kubernetes.io/projected/f112ece8-c233-4f7d-87bb-4634c0bc3582-kube-api-access-m8d6z\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.889746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.890549 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.891151 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.891788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-config\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.892333 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-dns-svc\") pod \"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:08 crc kubenswrapper[4971]: I0320 08:52:08.912632 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8d6z\" (UniqueName: \"kubernetes.io/projected/f112ece8-c233-4f7d-87bb-4634c0bc3582-kube-api-access-m8d6z\") pod 
\"dnsmasq-dns-7d9d566f97-dtk5g\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:09 crc kubenswrapper[4971]: I0320 08:52:09.041090 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:09 crc kubenswrapper[4971]: I0320 08:52:09.135112 4971 scope.go:117] "RemoveContainer" containerID="73bfc543c6cba52262ea1124324c6c8eceb32bd6201dc4c8dcbbc4e410c2b49c" Mar 20 08:52:09 crc kubenswrapper[4971]: I0320 08:52:09.161317 4971 scope.go:117] "RemoveContainer" containerID="96083d13836e22a902d07819daa2aba9e77e25137eb26268bbea69183fd009e0" Mar 20 08:52:09 crc kubenswrapper[4971]: I0320 08:52:09.228198 4971 scope.go:117] "RemoveContainer" containerID="a7d5495118bd04c96bc8f5e3374e63550c006d70eda39ab9e7431d15190146f8" Mar 20 08:52:09 crc kubenswrapper[4971]: I0320 08:52:09.256774 4971 scope.go:117] "RemoveContainer" containerID="b2d012c332ffad8e91b3e921fa80377d0620c04da62ce6058386366fd408deba" Mar 20 08:52:09 crc kubenswrapper[4971]: I0320 08:52:09.327262 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 08:52:09 crc kubenswrapper[4971]: I0320 08:52:09.489964 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9d566f97-dtk5g"] Mar 20 08:52:10 crc kubenswrapper[4971]: I0320 08:52:10.343458 4971 generic.go:334] "Generic (PLEG): container finished" podID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerID="ca2565998fcf3679734c3ac012587b4b187a5c511b9af022d5a0693095c51c31" exitCode=0 Mar 20 08:52:10 crc kubenswrapper[4971]: I0320 08:52:10.343582 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" event={"ID":"f112ece8-c233-4f7d-87bb-4634c0bc3582","Type":"ContainerDied","Data":"ca2565998fcf3679734c3ac012587b4b187a5c511b9af022d5a0693095c51c31"} Mar 20 08:52:10 crc kubenswrapper[4971]: I0320 08:52:10.343842 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" event={"ID":"f112ece8-c233-4f7d-87bb-4634c0bc3582","Type":"ContainerStarted","Data":"929954f2356f3e073ecf58ac034bfb3fa8f20081f5efdf108c4fae3de5d12734"} Mar 20 08:52:11 crc kubenswrapper[4971]: I0320 08:52:11.352356 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" event={"ID":"f112ece8-c233-4f7d-87bb-4634c0bc3582","Type":"ContainerStarted","Data":"b76a430ee503de1488cbd512a2d5fd22e4fa28a3bf88d405161d6a91caf45db2"} Mar 20 08:52:11 crc kubenswrapper[4971]: I0320 08:52:11.352674 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:11 crc kubenswrapper[4971]: I0320 08:52:11.373183 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" podStartSLOduration=3.3731621929999998 podStartE2EDuration="3.373162193s" podCreationTimestamp="2026-03-20 08:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:52:11.367377421 +0000 UTC m=+7353.347251559" watchObservedRunningTime="2026-03-20 08:52:11.373162193 +0000 UTC m=+7353.353036331" Mar 20 08:52:14 crc kubenswrapper[4971]: I0320 08:52:14.391527 4971 generic.go:334] "Generic (PLEG): container finished" podID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerID="6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1" exitCode=0 Mar 20 08:52:14 crc kubenswrapper[4971]: I0320 08:52:14.392156 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xncqk" event={"ID":"d5e692dd-45d1-4384-b888-3958c4e970c9","Type":"ContainerDied","Data":"6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1"} Mar 20 08:52:15 crc kubenswrapper[4971]: I0320 08:52:15.447093 4971 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-xncqk" event={"ID":"d5e692dd-45d1-4384-b888-3958c4e970c9","Type":"ContainerStarted","Data":"70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861"} Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.898065 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xncqk" podStartSLOduration=4.222260658 podStartE2EDuration="10.898043037s" podCreationTimestamp="2026-03-20 08:52:06 +0000 UTC" firstStartedPulling="2026-03-20 08:52:08.313218861 +0000 UTC m=+7350.293093029" lastFinishedPulling="2026-03-20 08:52:14.98900127 +0000 UTC m=+7356.968875408" observedRunningTime="2026-03-20 08:52:15.469818152 +0000 UTC m=+7357.449692310" watchObservedRunningTime="2026-03-20 08:52:16.898043037 +0000 UTC m=+7358.877917175" Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.900018 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q2x2f"] Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.902158 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.907839 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2x2f"] Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.959808 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-utilities\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.960278 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfcdn\" (UniqueName: \"kubernetes.io/projected/2e06868f-58f4-4643-805d-ba92c179a628-kube-api-access-zfcdn\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.960396 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-catalog-content\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.977328 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:16 crc kubenswrapper[4971]: I0320 08:52:16.977378 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:17 crc kubenswrapper[4971]: I0320 08:52:17.062274 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-catalog-content\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:17 crc kubenswrapper[4971]: I0320 08:52:17.062349 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-utilities\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:17 crc kubenswrapper[4971]: I0320 08:52:17.062437 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfcdn\" (UniqueName: \"kubernetes.io/projected/2e06868f-58f4-4643-805d-ba92c179a628-kube-api-access-zfcdn\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:17 crc kubenswrapper[4971]: I0320 08:52:17.063283 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-catalog-content\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:17 crc kubenswrapper[4971]: I0320 08:52:17.063550 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-utilities\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:17 crc kubenswrapper[4971]: I0320 08:52:17.081013 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfcdn\" (UniqueName: 
\"kubernetes.io/projected/2e06868f-58f4-4643-805d-ba92c179a628-kube-api-access-zfcdn\") pod \"redhat-marketplace-q2x2f\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:17 crc kubenswrapper[4971]: I0320 08:52:17.231564 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:17 crc kubenswrapper[4971]: I0320 08:52:17.768131 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2x2f"] Mar 20 08:52:18 crc kubenswrapper[4971]: I0320 08:52:18.018577 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-xncqk" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="registry-server" probeResult="failure" output=< Mar 20 08:52:18 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 08:52:18 crc kubenswrapper[4971]: > Mar 20 08:52:18 crc kubenswrapper[4971]: I0320 08:52:18.473278 4971 generic.go:334] "Generic (PLEG): container finished" podID="2e06868f-58f4-4643-805d-ba92c179a628" containerID="5e35c76b8104972b4f89e94d9485586d4b30a807720c178a70667750ed3683e8" exitCode=0 Mar 20 08:52:18 crc kubenswrapper[4971]: I0320 08:52:18.473322 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2x2f" event={"ID":"2e06868f-58f4-4643-805d-ba92c179a628","Type":"ContainerDied","Data":"5e35c76b8104972b4f89e94d9485586d4b30a807720c178a70667750ed3683e8"} Mar 20 08:52:18 crc kubenswrapper[4971]: I0320 08:52:18.473349 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2x2f" event={"ID":"2e06868f-58f4-4643-805d-ba92c179a628","Type":"ContainerStarted","Data":"cde6c3a5f41d50d28fd93544f02eeac26a713a8f9557df947f4a3c32a504c7b2"} Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.042741 4971 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.116527 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844dd44645-96kb5"] Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.116773 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844dd44645-96kb5" podUID="513a0a95-fc59-49c1-ba38-c859f0c74158" containerName="dnsmasq-dns" containerID="cri-o://34a2307cd9ef11f84bc46fc48f75e4d7a85aaa4d588f3d6905d59c181dca1a68" gracePeriod=10 Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.296343 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-58zhp"] Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.299691 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.309405 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58zhp"] Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.402908 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-catalog-content\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.402958 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-utilities\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc 
kubenswrapper[4971]: I0320 08:52:19.403138 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q76sx\" (UniqueName: \"kubernetes.io/projected/9703312b-200c-49c1-8b17-7fbde4f59c14-kube-api-access-q76sx\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.484756 4971 generic.go:334] "Generic (PLEG): container finished" podID="513a0a95-fc59-49c1-ba38-c859f0c74158" containerID="34a2307cd9ef11f84bc46fc48f75e4d7a85aaa4d588f3d6905d59c181dca1a68" exitCode=0 Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.484799 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844dd44645-96kb5" event={"ID":"513a0a95-fc59-49c1-ba38-c859f0c74158","Type":"ContainerDied","Data":"34a2307cd9ef11f84bc46fc48f75e4d7a85aaa4d588f3d6905d59c181dca1a68"} Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.504207 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q76sx\" (UniqueName: \"kubernetes.io/projected/9703312b-200c-49c1-8b17-7fbde4f59c14-kube-api-access-q76sx\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.504271 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-catalog-content\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.504297 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-utilities\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.504763 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-utilities\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.505262 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-catalog-content\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.527590 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q76sx\" (UniqueName: \"kubernetes.io/projected/9703312b-200c-49c1-8b17-7fbde4f59c14-kube-api-access-q76sx\") pod \"certified-operators-58zhp\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.624636 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.656887 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.713307 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-nb\") pod \"513a0a95-fc59-49c1-ba38-c859f0c74158\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.713432 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-config\") pod \"513a0a95-fc59-49c1-ba38-c859f0c74158\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.713481 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-dns-svc\") pod \"513a0a95-fc59-49c1-ba38-c859f0c74158\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.713508 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-sb\") pod \"513a0a95-fc59-49c1-ba38-c859f0c74158\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.713551 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4sb\" (UniqueName: \"kubernetes.io/projected/513a0a95-fc59-49c1-ba38-c859f0c74158-kube-api-access-4d4sb\") pod \"513a0a95-fc59-49c1-ba38-c859f0c74158\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.722562 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/513a0a95-fc59-49c1-ba38-c859f0c74158-kube-api-access-4d4sb" (OuterVolumeSpecName: "kube-api-access-4d4sb") pod "513a0a95-fc59-49c1-ba38-c859f0c74158" (UID: "513a0a95-fc59-49c1-ba38-c859f0c74158"). InnerVolumeSpecName "kube-api-access-4d4sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.817103 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-config" (OuterVolumeSpecName: "config") pod "513a0a95-fc59-49c1-ba38-c859f0c74158" (UID: "513a0a95-fc59-49c1-ba38-c859f0c74158"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.817788 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-config\") pod \"513a0a95-fc59-49c1-ba38-c859f0c74158\" (UID: \"513a0a95-fc59-49c1-ba38-c859f0c74158\") " Mar 20 08:52:19 crc kubenswrapper[4971]: W0320 08:52:19.817989 4971 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/513a0a95-fc59-49c1-ba38-c859f0c74158/volumes/kubernetes.io~configmap/config Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.818029 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-config" (OuterVolumeSpecName: "config") pod "513a0a95-fc59-49c1-ba38-c859f0c74158" (UID: "513a0a95-fc59-49c1-ba38-c859f0c74158"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.827359 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4sb\" (UniqueName: \"kubernetes.io/projected/513a0a95-fc59-49c1-ba38-c859f0c74158-kube-api-access-4d4sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.827400 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.827900 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "513a0a95-fc59-49c1-ba38-c859f0c74158" (UID: "513a0a95-fc59-49c1-ba38-c859f0c74158"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.852251 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "513a0a95-fc59-49c1-ba38-c859f0c74158" (UID: "513a0a95-fc59-49c1-ba38-c859f0c74158"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.876887 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "513a0a95-fc59-49c1-ba38-c859f0c74158" (UID: "513a0a95-fc59-49c1-ba38-c859f0c74158"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.933880 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.933918 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:19 crc kubenswrapper[4971]: I0320 08:52:19.933929 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/513a0a95-fc59-49c1-ba38-c859f0c74158-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.195286 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58zhp"] Mar 20 08:52:20 crc kubenswrapper[4971]: W0320 08:52:20.204780 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9703312b_200c_49c1_8b17_7fbde4f59c14.slice/crio-d7601f9545ab66fe120c34c787f531ed611a7619acb4986b1f3cdb381c3a83ba WatchSource:0}: Error finding container d7601f9545ab66fe120c34c787f531ed611a7619acb4986b1f3cdb381c3a83ba: Status 404 returned error can't find the container with id d7601f9545ab66fe120c34c787f531ed611a7619acb4986b1f3cdb381c3a83ba Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.494472 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58zhp" event={"ID":"9703312b-200c-49c1-8b17-7fbde4f59c14","Type":"ContainerStarted","Data":"d7601f9545ab66fe120c34c787f531ed611a7619acb4986b1f3cdb381c3a83ba"} Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.496959 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="2e06868f-58f4-4643-805d-ba92c179a628" containerID="95189c2ed1eb1873f6ce3fc4b54cdc1eff6191dd396dc68ac2b2dba57c5674eb" exitCode=0 Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.497037 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2x2f" event={"ID":"2e06868f-58f4-4643-805d-ba92c179a628","Type":"ContainerDied","Data":"95189c2ed1eb1873f6ce3fc4b54cdc1eff6191dd396dc68ac2b2dba57c5674eb"} Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.499029 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844dd44645-96kb5" event={"ID":"513a0a95-fc59-49c1-ba38-c859f0c74158","Type":"ContainerDied","Data":"a6a6a947e495dcfdebff7ad7b05ad32cb30ecbc380150d98526554cf3af68222"} Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.499112 4971 scope.go:117] "RemoveContainer" containerID="34a2307cd9ef11f84bc46fc48f75e4d7a85aaa4d588f3d6905d59c181dca1a68" Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.499233 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844dd44645-96kb5" Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.540380 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844dd44645-96kb5"] Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.549129 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844dd44645-96kb5"] Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.750684 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="513a0a95-fc59-49c1-ba38-c859f0c74158" path="/var/lib/kubelet/pods/513a0a95-fc59-49c1-ba38-c859f0c74158/volumes" Mar 20 08:52:20 crc kubenswrapper[4971]: I0320 08:52:20.778960 4971 scope.go:117] "RemoveContainer" containerID="a216082ab407de137cd7d9211b7791bc94abd0bb4c336aeac51f09f626b32c0a" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.507965 4971 generic.go:334] "Generic (PLEG): container finished" podID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerID="e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea" exitCode=0 Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.508057 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58zhp" event={"ID":"9703312b-200c-49c1-8b17-7fbde4f59c14","Type":"ContainerDied","Data":"e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea"} Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.516316 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2x2f" event={"ID":"2e06868f-58f4-4643-805d-ba92c179a628","Type":"ContainerStarted","Data":"460cccca5f90deb6869850639e36eb52cd63ed61cc173b827502b4d182a0d4d3"} Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.556357 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q2x2f" podStartSLOduration=2.863040179 podStartE2EDuration="5.556328493s" 
podCreationTimestamp="2026-03-20 08:52:16 +0000 UTC" firstStartedPulling="2026-03-20 08:52:18.475027253 +0000 UTC m=+7360.454901381" lastFinishedPulling="2026-03-20 08:52:21.168315557 +0000 UTC m=+7363.148189695" observedRunningTime="2026-03-20 08:52:21.555147522 +0000 UTC m=+7363.535021680" watchObservedRunningTime="2026-03-20 08:52:21.556328493 +0000 UTC m=+7363.536202641" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.576722 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nh6zs"] Mar 20 08:52:21 crc kubenswrapper[4971]: E0320 08:52:21.577216 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513a0a95-fc59-49c1-ba38-c859f0c74158" containerName="dnsmasq-dns" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.577241 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="513a0a95-fc59-49c1-ba38-c859f0c74158" containerName="dnsmasq-dns" Mar 20 08:52:21 crc kubenswrapper[4971]: E0320 08:52:21.577256 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513a0a95-fc59-49c1-ba38-c859f0c74158" containerName="init" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.577267 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="513a0a95-fc59-49c1-ba38-c859f0c74158" containerName="init" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.577519 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="513a0a95-fc59-49c1-ba38-c859f0c74158" containerName="dnsmasq-dns" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.578215 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.585714 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nh6zs"] Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.665048 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca1bba8-bc6c-4454-afab-398104d09e1e-operator-scripts\") pod \"cinder-db-create-nh6zs\" (UID: \"eca1bba8-bc6c-4454-afab-398104d09e1e\") " pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.665098 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26www\" (UniqueName: \"kubernetes.io/projected/eca1bba8-bc6c-4454-afab-398104d09e1e-kube-api-access-26www\") pod \"cinder-db-create-nh6zs\" (UID: \"eca1bba8-bc6c-4454-afab-398104d09e1e\") " pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.761694 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9f82-account-create-update-x85jj"] Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.763298 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.765583 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.766407 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca1bba8-bc6c-4454-afab-398104d09e1e-operator-scripts\") pod \"cinder-db-create-nh6zs\" (UID: \"eca1bba8-bc6c-4454-afab-398104d09e1e\") " pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.766437 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26www\" (UniqueName: \"kubernetes.io/projected/eca1bba8-bc6c-4454-afab-398104d09e1e-kube-api-access-26www\") pod \"cinder-db-create-nh6zs\" (UID: \"eca1bba8-bc6c-4454-afab-398104d09e1e\") " pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.767289 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca1bba8-bc6c-4454-afab-398104d09e1e-operator-scripts\") pod \"cinder-db-create-nh6zs\" (UID: \"eca1bba8-bc6c-4454-afab-398104d09e1e\") " pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.785798 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f82-account-create-update-x85jj"] Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.787241 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26www\" (UniqueName: \"kubernetes.io/projected/eca1bba8-bc6c-4454-afab-398104d09e1e-kube-api-access-26www\") pod \"cinder-db-create-nh6zs\" (UID: \"eca1bba8-bc6c-4454-afab-398104d09e1e\") " pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 
08:52:21.868436 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fbb5e33-a994-47a2-8574-a19b8220e0be-operator-scripts\") pod \"cinder-9f82-account-create-update-x85jj\" (UID: \"8fbb5e33-a994-47a2-8574-a19b8220e0be\") " pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.868539 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpc5n\" (UniqueName: \"kubernetes.io/projected/8fbb5e33-a994-47a2-8574-a19b8220e0be-kube-api-access-zpc5n\") pod \"cinder-9f82-account-create-update-x85jj\" (UID: \"8fbb5e33-a994-47a2-8574-a19b8220e0be\") " pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.908727 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.970156 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fbb5e33-a994-47a2-8574-a19b8220e0be-operator-scripts\") pod \"cinder-9f82-account-create-update-x85jj\" (UID: \"8fbb5e33-a994-47a2-8574-a19b8220e0be\") " pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.970233 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpc5n\" (UniqueName: \"kubernetes.io/projected/8fbb5e33-a994-47a2-8574-a19b8220e0be-kube-api-access-zpc5n\") pod \"cinder-9f82-account-create-update-x85jj\" (UID: \"8fbb5e33-a994-47a2-8574-a19b8220e0be\") " pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.970994 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fbb5e33-a994-47a2-8574-a19b8220e0be-operator-scripts\") pod \"cinder-9f82-account-create-update-x85jj\" (UID: \"8fbb5e33-a994-47a2-8574-a19b8220e0be\") " pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:21 crc kubenswrapper[4971]: I0320 08:52:21.986861 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpc5n\" (UniqueName: \"kubernetes.io/projected/8fbb5e33-a994-47a2-8574-a19b8220e0be-kube-api-access-zpc5n\") pod \"cinder-9f82-account-create-update-x85jj\" (UID: \"8fbb5e33-a994-47a2-8574-a19b8220e0be\") " pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:22 crc kubenswrapper[4971]: I0320 08:52:22.089990 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:22 crc kubenswrapper[4971]: I0320 08:52:22.475764 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nh6zs"] Mar 20 08:52:22 crc kubenswrapper[4971]: W0320 08:52:22.477202 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeca1bba8_bc6c_4454_afab_398104d09e1e.slice/crio-ff7c0441008abe6b1723ec3a200d2d8d7b4d059914fbf4411fef959862e60963 WatchSource:0}: Error finding container ff7c0441008abe6b1723ec3a200d2d8d7b4d059914fbf4411fef959862e60963: Status 404 returned error can't find the container with id ff7c0441008abe6b1723ec3a200d2d8d7b4d059914fbf4411fef959862e60963 Mar 20 08:52:22 crc kubenswrapper[4971]: I0320 08:52:22.537326 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58zhp" event={"ID":"9703312b-200c-49c1-8b17-7fbde4f59c14","Type":"ContainerStarted","Data":"3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87"} Mar 20 08:52:22 crc kubenswrapper[4971]: I0320 08:52:22.539852 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nh6zs" event={"ID":"eca1bba8-bc6c-4454-afab-398104d09e1e","Type":"ContainerStarted","Data":"ff7c0441008abe6b1723ec3a200d2d8d7b4d059914fbf4411fef959862e60963"} Mar 20 08:52:22 crc kubenswrapper[4971]: I0320 08:52:22.653277 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f82-account-create-update-x85jj"] Mar 20 08:52:22 crc kubenswrapper[4971]: W0320 08:52:22.658260 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fbb5e33_a994_47a2_8574_a19b8220e0be.slice/crio-f698a3dcc65c2796d20becfccc4ed843e94a38b2813c9fed87227e5b92af805c WatchSource:0}: Error finding container f698a3dcc65c2796d20becfccc4ed843e94a38b2813c9fed87227e5b92af805c: Status 404 returned error can't find the container with id f698a3dcc65c2796d20becfccc4ed843e94a38b2813c9fed87227e5b92af805c Mar 20 08:52:23 crc kubenswrapper[4971]: I0320 08:52:23.552454 4971 generic.go:334] "Generic (PLEG): container finished" podID="8fbb5e33-a994-47a2-8574-a19b8220e0be" containerID="77ba289c4840ef807399fc320c939664efefc46aa001ddb3e09a65c214ae7370" exitCode=0 Mar 20 08:52:23 crc kubenswrapper[4971]: I0320 08:52:23.552573 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f82-account-create-update-x85jj" event={"ID":"8fbb5e33-a994-47a2-8574-a19b8220e0be","Type":"ContainerDied","Data":"77ba289c4840ef807399fc320c939664efefc46aa001ddb3e09a65c214ae7370"} Mar 20 08:52:23 crc kubenswrapper[4971]: I0320 08:52:23.552694 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f82-account-create-update-x85jj" event={"ID":"8fbb5e33-a994-47a2-8574-a19b8220e0be","Type":"ContainerStarted","Data":"f698a3dcc65c2796d20becfccc4ed843e94a38b2813c9fed87227e5b92af805c"} Mar 20 08:52:23 crc kubenswrapper[4971]: I0320 08:52:23.554589 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="eca1bba8-bc6c-4454-afab-398104d09e1e" containerID="81b8470c91cb51a284777f066b129acbd03ea2d7d0a225be6953968a1de86fdb" exitCode=0 Mar 20 08:52:23 crc kubenswrapper[4971]: I0320 08:52:23.554734 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nh6zs" event={"ID":"eca1bba8-bc6c-4454-afab-398104d09e1e","Type":"ContainerDied","Data":"81b8470c91cb51a284777f066b129acbd03ea2d7d0a225be6953968a1de86fdb"} Mar 20 08:52:23 crc kubenswrapper[4971]: I0320 08:52:23.557115 4971 generic.go:334] "Generic (PLEG): container finished" podID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerID="3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87" exitCode=0 Mar 20 08:52:23 crc kubenswrapper[4971]: I0320 08:52:23.557206 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58zhp" event={"ID":"9703312b-200c-49c1-8b17-7fbde4f59c14","Type":"ContainerDied","Data":"3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87"} Mar 20 08:52:24 crc kubenswrapper[4971]: I0320 08:52:24.567725 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58zhp" event={"ID":"9703312b-200c-49c1-8b17-7fbde4f59c14","Type":"ContainerStarted","Data":"b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0"} Mar 20 08:52:24 crc kubenswrapper[4971]: I0320 08:52:24.591180 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-58zhp" podStartSLOduration=3.123034326 podStartE2EDuration="5.591161356s" podCreationTimestamp="2026-03-20 08:52:19 +0000 UTC" firstStartedPulling="2026-03-20 08:52:21.509653782 +0000 UTC m=+7363.489527920" lastFinishedPulling="2026-03-20 08:52:23.977780812 +0000 UTC m=+7365.957654950" observedRunningTime="2026-03-20 08:52:24.582954351 +0000 UTC m=+7366.562828499" watchObservedRunningTime="2026-03-20 08:52:24.591161356 +0000 UTC m=+7366.571035494" Mar 20 08:52:25 crc 
kubenswrapper[4971]: I0320 08:52:25.020347 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.024036 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.145863 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpc5n\" (UniqueName: \"kubernetes.io/projected/8fbb5e33-a994-47a2-8574-a19b8220e0be-kube-api-access-zpc5n\") pod \"8fbb5e33-a994-47a2-8574-a19b8220e0be\" (UID: \"8fbb5e33-a994-47a2-8574-a19b8220e0be\") " Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.145933 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fbb5e33-a994-47a2-8574-a19b8220e0be-operator-scripts\") pod \"8fbb5e33-a994-47a2-8574-a19b8220e0be\" (UID: \"8fbb5e33-a994-47a2-8574-a19b8220e0be\") " Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.146001 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26www\" (UniqueName: \"kubernetes.io/projected/eca1bba8-bc6c-4454-afab-398104d09e1e-kube-api-access-26www\") pod \"eca1bba8-bc6c-4454-afab-398104d09e1e\" (UID: \"eca1bba8-bc6c-4454-afab-398104d09e1e\") " Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.146054 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca1bba8-bc6c-4454-afab-398104d09e1e-operator-scripts\") pod \"eca1bba8-bc6c-4454-afab-398104d09e1e\" (UID: \"eca1bba8-bc6c-4454-afab-398104d09e1e\") " Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.146753 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8fbb5e33-a994-47a2-8574-a19b8220e0be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fbb5e33-a994-47a2-8574-a19b8220e0be" (UID: "8fbb5e33-a994-47a2-8574-a19b8220e0be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.146917 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca1bba8-bc6c-4454-afab-398104d09e1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eca1bba8-bc6c-4454-afab-398104d09e1e" (UID: "eca1bba8-bc6c-4454-afab-398104d09e1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.158701 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca1bba8-bc6c-4454-afab-398104d09e1e-kube-api-access-26www" (OuterVolumeSpecName: "kube-api-access-26www") pod "eca1bba8-bc6c-4454-afab-398104d09e1e" (UID: "eca1bba8-bc6c-4454-afab-398104d09e1e"). InnerVolumeSpecName "kube-api-access-26www". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.168519 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbb5e33-a994-47a2-8574-a19b8220e0be-kube-api-access-zpc5n" (OuterVolumeSpecName: "kube-api-access-zpc5n") pod "8fbb5e33-a994-47a2-8574-a19b8220e0be" (UID: "8fbb5e33-a994-47a2-8574-a19b8220e0be"). InnerVolumeSpecName "kube-api-access-zpc5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.248261 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eca1bba8-bc6c-4454-afab-398104d09e1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.248292 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpc5n\" (UniqueName: \"kubernetes.io/projected/8fbb5e33-a994-47a2-8574-a19b8220e0be-kube-api-access-zpc5n\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.248307 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fbb5e33-a994-47a2-8574-a19b8220e0be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.248320 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26www\" (UniqueName: \"kubernetes.io/projected/eca1bba8-bc6c-4454-afab-398104d09e1e-kube-api-access-26www\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.578505 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f82-account-create-update-x85jj" event={"ID":"8fbb5e33-a994-47a2-8574-a19b8220e0be","Type":"ContainerDied","Data":"f698a3dcc65c2796d20becfccc4ed843e94a38b2813c9fed87227e5b92af805c"} Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.578552 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f698a3dcc65c2796d20becfccc4ed843e94a38b2813c9fed87227e5b92af805c" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.578632 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f82-account-create-update-x85jj" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.585864 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nh6zs" event={"ID":"eca1bba8-bc6c-4454-afab-398104d09e1e","Type":"ContainerDied","Data":"ff7c0441008abe6b1723ec3a200d2d8d7b4d059914fbf4411fef959862e60963"} Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.585934 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff7c0441008abe6b1723ec3a200d2d8d7b4d059914fbf4411fef959862e60963" Mar 20 08:52:25 crc kubenswrapper[4971]: I0320 08:52:25.585883 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nh6zs" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.924904 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tt6jv"] Mar 20 08:52:26 crc kubenswrapper[4971]: E0320 08:52:26.925708 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca1bba8-bc6c-4454-afab-398104d09e1e" containerName="mariadb-database-create" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.925729 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca1bba8-bc6c-4454-afab-398104d09e1e" containerName="mariadb-database-create" Mar 20 08:52:26 crc kubenswrapper[4971]: E0320 08:52:26.925747 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbb5e33-a994-47a2-8574-a19b8220e0be" containerName="mariadb-account-create-update" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.925756 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbb5e33-a994-47a2-8574-a19b8220e0be" containerName="mariadb-account-create-update" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.926010 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca1bba8-bc6c-4454-afab-398104d09e1e" containerName="mariadb-database-create" Mar 20 
08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.926030 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbb5e33-a994-47a2-8574-a19b8220e0be" containerName="mariadb-account-create-update" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.926810 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.931812 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.936515 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tt6jv"] Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.937694 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.938152 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jw8wb" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.999037 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl52x\" (UniqueName: \"kubernetes.io/projected/993ffc7e-64d4-455b-84ba-134ffefa83ea-kube-api-access-kl52x\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.999128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/993ffc7e-64d4-455b-84ba-134ffefa83ea-etc-machine-id\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.999170 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-combined-ca-bundle\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.999300 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-scripts\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:26 crc kubenswrapper[4971]: I0320 08:52:26.999375 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-config-data\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:26.999486 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-db-sync-config-data\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.055682 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.102642 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-db-sync-config-data\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc 
kubenswrapper[4971]: I0320 08:52:27.102858 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl52x\" (UniqueName: \"kubernetes.io/projected/993ffc7e-64d4-455b-84ba-134ffefa83ea-kube-api-access-kl52x\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.102896 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/993ffc7e-64d4-455b-84ba-134ffefa83ea-etc-machine-id\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.102920 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-combined-ca-bundle\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.103023 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-scripts\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.103075 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-config-data\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.104484 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/993ffc7e-64d4-455b-84ba-134ffefa83ea-etc-machine-id\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.109013 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-scripts\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.109203 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-config-data\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.111171 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-db-sync-config-data\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.117362 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xncqk" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.121268 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl52x\" (UniqueName: \"kubernetes.io/projected/993ffc7e-64d4-455b-84ba-134ffefa83ea-kube-api-access-kl52x\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.122015 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-combined-ca-bundle\") pod \"cinder-db-sync-tt6jv\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.232501 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.232540 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.291175 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.300084 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.640325 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:27 crc kubenswrapper[4971]: I0320 08:52:27.794212 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tt6jv"] Mar 20 08:52:28 crc kubenswrapper[4971]: I0320 08:52:28.608686 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tt6jv" event={"ID":"993ffc7e-64d4-455b-84ba-134ffefa83ea","Type":"ContainerStarted","Data":"03e6ac24111ae24d40baaa9dc49fb21ea78fdff439d0d77451e5503023b7a9de"} Mar 20 08:52:29 crc kubenswrapper[4971]: I0320 08:52:29.106620 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xncqk"] Mar 20 08:52:29 crc kubenswrapper[4971]: I0320 08:52:29.484323 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dfhq9"] Mar 20 08:52:29 crc 
kubenswrapper[4971]: I0320 08:52:29.484813 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dfhq9" podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="registry-server" containerID="cri-o://bd4be7968f59c8940ce1d937903dcd6e3f911573a4a719b287e095fa48b8ad24" gracePeriod=2 Mar 20 08:52:29 crc kubenswrapper[4971]: I0320 08:52:29.625846 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:29 crc kubenswrapper[4971]: I0320 08:52:29.626345 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:29 crc kubenswrapper[4971]: I0320 08:52:29.632445 4971 generic.go:334] "Generic (PLEG): container finished" podID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerID="bd4be7968f59c8940ce1d937903dcd6e3f911573a4a719b287e095fa48b8ad24" exitCode=0 Mar 20 08:52:29 crc kubenswrapper[4971]: I0320 08:52:29.632538 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhq9" event={"ID":"65e9cb3c-4705-4e93-bb16-a0a56de56d62","Type":"ContainerDied","Data":"bd4be7968f59c8940ce1d937903dcd6e3f911573a4a719b287e095fa48b8ad24"} Mar 20 08:52:29 crc kubenswrapper[4971]: I0320 08:52:29.694688 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:29 crc kubenswrapper[4971]: I0320 08:52:29.989817 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dfhq9" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.060803 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp526\" (UniqueName: \"kubernetes.io/projected/65e9cb3c-4705-4e93-bb16-a0a56de56d62-kube-api-access-qp526\") pod \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.060995 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-utilities\") pod \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.061015 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-catalog-content\") pod \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\" (UID: \"65e9cb3c-4705-4e93-bb16-a0a56de56d62\") " Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.061659 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-utilities" (OuterVolumeSpecName: "utilities") pod "65e9cb3c-4705-4e93-bb16-a0a56de56d62" (UID: "65e9cb3c-4705-4e93-bb16-a0a56de56d62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.068004 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e9cb3c-4705-4e93-bb16-a0a56de56d62-kube-api-access-qp526" (OuterVolumeSpecName: "kube-api-access-qp526") pod "65e9cb3c-4705-4e93-bb16-a0a56de56d62" (UID: "65e9cb3c-4705-4e93-bb16-a0a56de56d62"). InnerVolumeSpecName "kube-api-access-qp526". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.122690 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65e9cb3c-4705-4e93-bb16-a0a56de56d62" (UID: "65e9cb3c-4705-4e93-bb16-a0a56de56d62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.165285 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.165322 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e9cb3c-4705-4e93-bb16-a0a56de56d62-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.165333 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp526\" (UniqueName: \"kubernetes.io/projected/65e9cb3c-4705-4e93-bb16-a0a56de56d62-kube-api-access-qp526\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.491137 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2x2f"] Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.492820 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q2x2f" podUID="2e06868f-58f4-4643-805d-ba92c179a628" containerName="registry-server" containerID="cri-o://460cccca5f90deb6869850639e36eb52cd63ed61cc173b827502b4d182a0d4d3" gracePeriod=2 Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.646689 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dfhq9" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.646759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfhq9" event={"ID":"65e9cb3c-4705-4e93-bb16-a0a56de56d62","Type":"ContainerDied","Data":"061ce4a386e20f1a27e975a4f7b80168e523c21aa55ac56c3c7ada0ec88c2990"} Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.646819 4971 scope.go:117] "RemoveContainer" containerID="bd4be7968f59c8940ce1d937903dcd6e3f911573a4a719b287e095fa48b8ad24" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.651118 4971 generic.go:334] "Generic (PLEG): container finished" podID="2e06868f-58f4-4643-805d-ba92c179a628" containerID="460cccca5f90deb6869850639e36eb52cd63ed61cc173b827502b4d182a0d4d3" exitCode=0 Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.651168 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2x2f" event={"ID":"2e06868f-58f4-4643-805d-ba92c179a628","Type":"ContainerDied","Data":"460cccca5f90deb6869850639e36eb52cd63ed61cc173b827502b4d182a0d4d3"} Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.679676 4971 scope.go:117] "RemoveContainer" containerID="eb02e224a9f4d5d661c0d5aab136b93c270f5fbeab8e84c6858f11a8ab53d91e" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.701086 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dfhq9"] Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.712233 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dfhq9"] Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.717340 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.719986 4971 scope.go:117] "RemoveContainer" 
containerID="02d9ef4b17ed88eeb779e2c2030d7a8a51c9adc5b84475a66d4370958d757fda" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.748086 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" path="/var/lib/kubelet/pods/65e9cb3c-4705-4e93-bb16-a0a56de56d62/volumes" Mar 20 08:52:30 crc kubenswrapper[4971]: I0320 08:52:30.985928 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.090709 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-utilities\") pod \"2e06868f-58f4-4643-805d-ba92c179a628\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.090804 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfcdn\" (UniqueName: \"kubernetes.io/projected/2e06868f-58f4-4643-805d-ba92c179a628-kube-api-access-zfcdn\") pod \"2e06868f-58f4-4643-805d-ba92c179a628\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.090862 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-catalog-content\") pod \"2e06868f-58f4-4643-805d-ba92c179a628\" (UID: \"2e06868f-58f4-4643-805d-ba92c179a628\") " Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.092474 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-utilities" (OuterVolumeSpecName: "utilities") pod "2e06868f-58f4-4643-805d-ba92c179a628" (UID: "2e06868f-58f4-4643-805d-ba92c179a628"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.097774 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e06868f-58f4-4643-805d-ba92c179a628-kube-api-access-zfcdn" (OuterVolumeSpecName: "kube-api-access-zfcdn") pod "2e06868f-58f4-4643-805d-ba92c179a628" (UID: "2e06868f-58f4-4643-805d-ba92c179a628"). InnerVolumeSpecName "kube-api-access-zfcdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.119674 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e06868f-58f4-4643-805d-ba92c179a628" (UID: "2e06868f-58f4-4643-805d-ba92c179a628"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.193324 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfcdn\" (UniqueName: \"kubernetes.io/projected/2e06868f-58f4-4643-805d-ba92c179a628-kube-api-access-zfcdn\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.193353 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.193362 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e06868f-58f4-4643-805d-ba92c179a628-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.665856 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2x2f" 
event={"ID":"2e06868f-58f4-4643-805d-ba92c179a628","Type":"ContainerDied","Data":"cde6c3a5f41d50d28fd93544f02eeac26a713a8f9557df947f4a3c32a504c7b2"} Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.665901 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2x2f" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.665953 4971 scope.go:117] "RemoveContainer" containerID="460cccca5f90deb6869850639e36eb52cd63ed61cc173b827502b4d182a0d4d3" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.702577 4971 scope.go:117] "RemoveContainer" containerID="95189c2ed1eb1873f6ce3fc4b54cdc1eff6191dd396dc68ac2b2dba57c5674eb" Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.714598 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2x2f"] Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.720538 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2x2f"] Mar 20 08:52:31 crc kubenswrapper[4971]: I0320 08:52:31.762532 4971 scope.go:117] "RemoveContainer" containerID="5e35c76b8104972b4f89e94d9485586d4b30a807720c178a70667750ed3683e8" Mar 20 08:52:32 crc kubenswrapper[4971]: I0320 08:52:32.745218 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e06868f-58f4-4643-805d-ba92c179a628" path="/var/lib/kubelet/pods/2e06868f-58f4-4643-805d-ba92c179a628/volumes" Mar 20 08:52:32 crc kubenswrapper[4971]: I0320 08:52:32.888374 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58zhp"] Mar 20 08:52:32 crc kubenswrapper[4971]: I0320 08:52:32.888688 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-58zhp" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerName="registry-server" 
containerID="cri-o://b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0" gracePeriod=2 Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.370572 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.435191 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-catalog-content\") pod \"9703312b-200c-49c1-8b17-7fbde4f59c14\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.435340 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q76sx\" (UniqueName: \"kubernetes.io/projected/9703312b-200c-49c1-8b17-7fbde4f59c14-kube-api-access-q76sx\") pod \"9703312b-200c-49c1-8b17-7fbde4f59c14\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.435469 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-utilities\") pod \"9703312b-200c-49c1-8b17-7fbde4f59c14\" (UID: \"9703312b-200c-49c1-8b17-7fbde4f59c14\") " Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.436636 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-utilities" (OuterVolumeSpecName: "utilities") pod "9703312b-200c-49c1-8b17-7fbde4f59c14" (UID: "9703312b-200c-49c1-8b17-7fbde4f59c14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.453843 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9703312b-200c-49c1-8b17-7fbde4f59c14-kube-api-access-q76sx" (OuterVolumeSpecName: "kube-api-access-q76sx") pod "9703312b-200c-49c1-8b17-7fbde4f59c14" (UID: "9703312b-200c-49c1-8b17-7fbde4f59c14"). InnerVolumeSpecName "kube-api-access-q76sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.539325 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q76sx\" (UniqueName: \"kubernetes.io/projected/9703312b-200c-49c1-8b17-7fbde4f59c14-kube-api-access-q76sx\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.539524 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.546936 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9703312b-200c-49c1-8b17-7fbde4f59c14" (UID: "9703312b-200c-49c1-8b17-7fbde4f59c14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.641530 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9703312b-200c-49c1-8b17-7fbde4f59c14-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.699225 4971 generic.go:334] "Generic (PLEG): container finished" podID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerID="b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0" exitCode=0 Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.699267 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58zhp" event={"ID":"9703312b-200c-49c1-8b17-7fbde4f59c14","Type":"ContainerDied","Data":"b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0"} Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.699297 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58zhp" event={"ID":"9703312b-200c-49c1-8b17-7fbde4f59c14","Type":"ContainerDied","Data":"d7601f9545ab66fe120c34c787f531ed611a7619acb4986b1f3cdb381c3a83ba"} Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.699317 4971 scope.go:117] "RemoveContainer" containerID="b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.699325 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58zhp" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.774915 4971 scope.go:117] "RemoveContainer" containerID="3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.789578 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58zhp"] Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.811788 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-58zhp"] Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.861470 4971 scope.go:117] "RemoveContainer" containerID="e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.885215 4971 scope.go:117] "RemoveContainer" containerID="b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0" Mar 20 08:52:33 crc kubenswrapper[4971]: E0320 08:52:33.885731 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0\": container with ID starting with b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0 not found: ID does not exist" containerID="b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.885780 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0"} err="failed to get container status \"b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0\": rpc error: code = NotFound desc = could not find container \"b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0\": container with ID starting with b6d6108c6c2039bf48fc505b5d98e9f45c9aaf11318de2dcf3e7d75512aa2cc0 not 
found: ID does not exist" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.885815 4971 scope.go:117] "RemoveContainer" containerID="3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87" Mar 20 08:52:33 crc kubenswrapper[4971]: E0320 08:52:33.886339 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87\": container with ID starting with 3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87 not found: ID does not exist" containerID="3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.886386 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87"} err="failed to get container status \"3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87\": rpc error: code = NotFound desc = could not find container \"3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87\": container with ID starting with 3f5a000eeda9b38c6812a068204d5814de56d5092b65efd377fcd991e2306c87 not found: ID does not exist" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.886407 4971 scope.go:117] "RemoveContainer" containerID="e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea" Mar 20 08:52:33 crc kubenswrapper[4971]: E0320 08:52:33.886630 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea\": container with ID starting with e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea not found: ID does not exist" containerID="e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea" Mar 20 08:52:33 crc kubenswrapper[4971]: I0320 08:52:33.886650 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea"} err="failed to get container status \"e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea\": rpc error: code = NotFound desc = could not find container \"e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea\": container with ID starting with e643b4d467d7cc247bb535eaa51ee230c28a681e83fde5cd9c3465bcf68bd5ea not found: ID does not exist" Mar 20 08:52:34 crc kubenswrapper[4971]: I0320 08:52:34.748634 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" path="/var/lib/kubelet/pods/9703312b-200c-49c1-8b17-7fbde4f59c14/volumes" Mar 20 08:52:50 crc kubenswrapper[4971]: E0320 08:52:50.996288 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 08:52:50 crc kubenswrapper[4971]: E0320 08:52:50.996838 4971 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 08:52:50 crc kubenswrapper[4971]: E0320 08:52:50.996974 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:39fc4cb70f516d8e9b48225bc0a253ef,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kl52x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-tt6jv_openstack(993ffc7e-64d4-455b-84ba-134ffefa83ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:52:50 crc kubenswrapper[4971]: E0320 08:52:50.998648 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-tt6jv" podUID="993ffc7e-64d4-455b-84ba-134ffefa83ea" Mar 20 08:52:51 crc kubenswrapper[4971]: E0320 08:52:51.890043 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:39fc4cb70f516d8e9b48225bc0a253ef\\\"\"" pod="openstack/cinder-db-sync-tt6jv" podUID="993ffc7e-64d4-455b-84ba-134ffefa83ea" Mar 20 08:53:08 crc kubenswrapper[4971]: I0320 08:53:08.075948 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tt6jv" event={"ID":"993ffc7e-64d4-455b-84ba-134ffefa83ea","Type":"ContainerStarted","Data":"cb659b4d9bb5d6b0240f06da910addfcc1ce38b3748625de83ef55c78cfffca4"} Mar 20 08:53:11 crc kubenswrapper[4971]: I0320 08:53:11.122427 4971 generic.go:334] "Generic (PLEG): container finished" podID="993ffc7e-64d4-455b-84ba-134ffefa83ea" containerID="cb659b4d9bb5d6b0240f06da910addfcc1ce38b3748625de83ef55c78cfffca4" exitCode=0 Mar 20 08:53:11 crc kubenswrapper[4971]: I0320 08:53:11.122839 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tt6jv" 
event={"ID":"993ffc7e-64d4-455b-84ba-134ffefa83ea","Type":"ContainerDied","Data":"cb659b4d9bb5d6b0240f06da910addfcc1ce38b3748625de83ef55c78cfffca4"} Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.426586 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.551785 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/993ffc7e-64d4-455b-84ba-134ffefa83ea-etc-machine-id\") pod \"993ffc7e-64d4-455b-84ba-134ffefa83ea\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.551968 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-combined-ca-bundle\") pod \"993ffc7e-64d4-455b-84ba-134ffefa83ea\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.551909 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/993ffc7e-64d4-455b-84ba-134ffefa83ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "993ffc7e-64d4-455b-84ba-134ffefa83ea" (UID: "993ffc7e-64d4-455b-84ba-134ffefa83ea"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.552814 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-db-sync-config-data\") pod \"993ffc7e-64d4-455b-84ba-134ffefa83ea\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.552957 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-scripts\") pod \"993ffc7e-64d4-455b-84ba-134ffefa83ea\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.553097 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-config-data\") pod \"993ffc7e-64d4-455b-84ba-134ffefa83ea\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.553231 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl52x\" (UniqueName: \"kubernetes.io/projected/993ffc7e-64d4-455b-84ba-134ffefa83ea-kube-api-access-kl52x\") pod \"993ffc7e-64d4-455b-84ba-134ffefa83ea\" (UID: \"993ffc7e-64d4-455b-84ba-134ffefa83ea\") " Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.554148 4971 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/993ffc7e-64d4-455b-84ba-134ffefa83ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.558927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-scripts" (OuterVolumeSpecName: "scripts") pod 
"993ffc7e-64d4-455b-84ba-134ffefa83ea" (UID: "993ffc7e-64d4-455b-84ba-134ffefa83ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.560691 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993ffc7e-64d4-455b-84ba-134ffefa83ea-kube-api-access-kl52x" (OuterVolumeSpecName: "kube-api-access-kl52x") pod "993ffc7e-64d4-455b-84ba-134ffefa83ea" (UID: "993ffc7e-64d4-455b-84ba-134ffefa83ea"). InnerVolumeSpecName "kube-api-access-kl52x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.560923 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "993ffc7e-64d4-455b-84ba-134ffefa83ea" (UID: "993ffc7e-64d4-455b-84ba-134ffefa83ea"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.599726 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-config-data" (OuterVolumeSpecName: "config-data") pod "993ffc7e-64d4-455b-84ba-134ffefa83ea" (UID: "993ffc7e-64d4-455b-84ba-134ffefa83ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.603932 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "993ffc7e-64d4-455b-84ba-134ffefa83ea" (UID: "993ffc7e-64d4-455b-84ba-134ffefa83ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.656435 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl52x\" (UniqueName: \"kubernetes.io/projected/993ffc7e-64d4-455b-84ba-134ffefa83ea-kube-api-access-kl52x\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.656761 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.656771 4971 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.656781 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:12 crc kubenswrapper[4971]: I0320 08:53:12.656791 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993ffc7e-64d4-455b-84ba-134ffefa83ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.154066 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tt6jv" event={"ID":"993ffc7e-64d4-455b-84ba-134ffefa83ea","Type":"ContainerDied","Data":"03e6ac24111ae24d40baaa9dc49fb21ea78fdff439d0d77451e5503023b7a9de"} Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.154106 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e6ac24111ae24d40baaa9dc49fb21ea78fdff439d0d77451e5503023b7a9de" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.154148 4971 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tt6jv" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556008 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-697dc7fb99-tb6mt"] Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556475 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="extract-content" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556492 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="extract-content" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556510 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e06868f-58f4-4643-805d-ba92c179a628" containerName="extract-content" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556517 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e06868f-58f4-4643-805d-ba92c179a628" containerName="extract-content" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556532 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e06868f-58f4-4643-805d-ba92c179a628" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556538 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e06868f-58f4-4643-805d-ba92c179a628" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556549 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556555 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556567 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="extract-utilities" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556573 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="extract-utilities" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556589 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerName="extract-content" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556619 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerName="extract-content" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556632 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerName="extract-utilities" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556639 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerName="extract-utilities" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556651 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993ffc7e-64d4-455b-84ba-134ffefa83ea" containerName="cinder-db-sync" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556657 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="993ffc7e-64d4-455b-84ba-134ffefa83ea" containerName="cinder-db-sync" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556666 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e06868f-58f4-4643-805d-ba92c179a628" containerName="extract-utilities" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556672 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e06868f-58f4-4643-805d-ba92c179a628" containerName="extract-utilities" Mar 20 08:53:13 crc kubenswrapper[4971]: E0320 08:53:13.556680 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556686 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556865 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="993ffc7e-64d4-455b-84ba-134ffefa83ea" containerName="cinder-db-sync" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556884 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e9cb3c-4705-4e93-bb16-a0a56de56d62" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556902 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e06868f-58f4-4643-805d-ba92c179a628" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.556915 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9703312b-200c-49c1-8b17-7fbde4f59c14" containerName="registry-server" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.557872 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.574428 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-nb\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.574499 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-config\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.574537 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-dns-svc\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.574595 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-sb\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.574694 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spfjs\" (UniqueName: \"kubernetes.io/projected/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-kube-api-access-spfjs\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" 
(UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.579581 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697dc7fb99-tb6mt"] Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.602994 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.604453 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.609340 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.609586 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jw8wb" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.609769 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.615241 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.643180 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676373 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-config\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676429 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/34b1bb1b-3d48-4177-80d9-62596eacee6b-logs\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676454 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-dns-svc\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676498 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34b1bb1b-3d48-4177-80d9-62596eacee6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676520 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676544 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-sb\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676600 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spfjs\" (UniqueName: \"kubernetes.io/projected/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-kube-api-access-spfjs\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-nb\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676707 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-scripts\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676729 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znmqw\" (UniqueName: \"kubernetes.io/projected/34b1bb1b-3d48-4177-80d9-62596eacee6b-kube-api-access-znmqw\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.676751 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 
20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.677721 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-config\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.678487 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-dns-svc\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.678715 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-sb\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.678811 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-nb\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.697532 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spfjs\" (UniqueName: \"kubernetes.io/projected/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-kube-api-access-spfjs\") pod \"dnsmasq-dns-697dc7fb99-tb6mt\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.778681 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34b1bb1b-3d48-4177-80d9-62596eacee6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.778736 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.778772 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.778812 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34b1bb1b-3d48-4177-80d9-62596eacee6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.778842 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-scripts\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.778931 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znmqw\" (UniqueName: \"kubernetes.io/projected/34b1bb1b-3d48-4177-80d9-62596eacee6b-kube-api-access-znmqw\") pod \"cinder-api-0\" (UID: 
\"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.778979 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.779111 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34b1bb1b-3d48-4177-80d9-62596eacee6b-logs\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.779703 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34b1bb1b-3d48-4177-80d9-62596eacee6b-logs\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.783841 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-scripts\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.784681 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.795765 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.802278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.803040 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znmqw\" (UniqueName: \"kubernetes.io/projected/34b1bb1b-3d48-4177-80d9-62596eacee6b-kube-api-access-znmqw\") pod \"cinder-api-0\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " pod="openstack/cinder-api-0" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.876767 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:13 crc kubenswrapper[4971]: I0320 08:53:13.933456 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:53:14 crc kubenswrapper[4971]: I0320 08:53:14.337699 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697dc7fb99-tb6mt"] Mar 20 08:53:14 crc kubenswrapper[4971]: I0320 08:53:14.461380 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:15 crc kubenswrapper[4971]: I0320 08:53:15.212993 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"34b1bb1b-3d48-4177-80d9-62596eacee6b","Type":"ContainerStarted","Data":"96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7"} Mar 20 08:53:15 crc kubenswrapper[4971]: I0320 08:53:15.213496 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"34b1bb1b-3d48-4177-80d9-62596eacee6b","Type":"ContainerStarted","Data":"f3b8b6f49fb999293252bae176ed7d8a728f824c2e18f292ce8816286ab5fe69"} Mar 20 08:53:15 crc kubenswrapper[4971]: I0320 08:53:15.225414 4971 generic.go:334] "Generic (PLEG): container finished" podID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" containerID="d4b921166b14c31ce2e366c331fb3059f4c0fa9065a79b7e204a3dc598f2013e" exitCode=0 Mar 20 08:53:15 crc kubenswrapper[4971]: I0320 08:53:15.225452 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" event={"ID":"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9","Type":"ContainerDied","Data":"d4b921166b14c31ce2e366c331fb3059f4c0fa9065a79b7e204a3dc598f2013e"} Mar 20 08:53:15 crc kubenswrapper[4971]: I0320 08:53:15.225477 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" event={"ID":"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9","Type":"ContainerStarted","Data":"70cd7918c0fd977f522da7deffab9a007f5a81554cee657ca3fd6599c74201a0"} Mar 20 08:53:16 crc kubenswrapper[4971]: I0320 08:53:16.238131 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" event={"ID":"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9","Type":"ContainerStarted","Data":"9bfaff52fcacbfea721e6561894459b21799086f9c6fdd6a4b86f509f6a735dd"} Mar 20 08:53:16 crc kubenswrapper[4971]: I0320 08:53:16.239662 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:16 crc kubenswrapper[4971]: I0320 08:53:16.242687 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"34b1bb1b-3d48-4177-80d9-62596eacee6b","Type":"ContainerStarted","Data":"b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd"} Mar 20 08:53:16 crc kubenswrapper[4971]: I0320 08:53:16.242794 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 08:53:16 crc kubenswrapper[4971]: I0320 08:53:16.265691 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" podStartSLOduration=3.265670042 podStartE2EDuration="3.265670042s" podCreationTimestamp="2026-03-20 08:53:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:53:16.259132441 +0000 UTC m=+7418.239006589" watchObservedRunningTime="2026-03-20 08:53:16.265670042 +0000 UTC m=+7418.245544180" Mar 20 08:53:16 crc kubenswrapper[4971]: I0320 08:53:16.301383 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.301363656 podStartE2EDuration="3.301363656s" podCreationTimestamp="2026-03-20 08:53:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:53:16.277352008 +0000 UTC m=+7418.257226146" watchObservedRunningTime="2026-03-20 08:53:16.301363656 +0000 UTC m=+7418.281237794" Mar 20 08:53:23 crc kubenswrapper[4971]: 
I0320 08:53:23.878727 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 08:53:23 crc kubenswrapper[4971]: I0320 08:53:23.961902 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9d566f97-dtk5g"] Mar 20 08:53:23 crc kubenswrapper[4971]: I0320 08:53:23.962178 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" podUID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerName="dnsmasq-dns" containerID="cri-o://b76a430ee503de1488cbd512a2d5fd22e4fa28a3bf88d405161d6a91caf45db2" gracePeriod=10 Mar 20 08:53:24 crc kubenswrapper[4971]: I0320 08:53:24.042104 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" podUID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.135:5353: connect: connection refused" Mar 20 08:53:24 crc kubenswrapper[4971]: I0320 08:53:24.338006 4971 generic.go:334] "Generic (PLEG): container finished" podID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerID="b76a430ee503de1488cbd512a2d5fd22e4fa28a3bf88d405161d6a91caf45db2" exitCode=0 Mar 20 08:53:24 crc kubenswrapper[4971]: I0320 08:53:24.338316 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" event={"ID":"f112ece8-c233-4f7d-87bb-4634c0bc3582","Type":"ContainerDied","Data":"b76a430ee503de1488cbd512a2d5fd22e4fa28a3bf88d405161d6a91caf45db2"} Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.005989 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.109325 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-sb\") pod \"f112ece8-c233-4f7d-87bb-4634c0bc3582\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.109444 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-dns-svc\") pod \"f112ece8-c233-4f7d-87bb-4634c0bc3582\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.109476 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-config\") pod \"f112ece8-c233-4f7d-87bb-4634c0bc3582\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.109571 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-nb\") pod \"f112ece8-c233-4f7d-87bb-4634c0bc3582\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.109623 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8d6z\" (UniqueName: \"kubernetes.io/projected/f112ece8-c233-4f7d-87bb-4634c0bc3582-kube-api-access-m8d6z\") pod \"f112ece8-c233-4f7d-87bb-4634c0bc3582\" (UID: \"f112ece8-c233-4f7d-87bb-4634c0bc3582\") " Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.122808 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f112ece8-c233-4f7d-87bb-4634c0bc3582-kube-api-access-m8d6z" (OuterVolumeSpecName: "kube-api-access-m8d6z") pod "f112ece8-c233-4f7d-87bb-4634c0bc3582" (UID: "f112ece8-c233-4f7d-87bb-4634c0bc3582"). InnerVolumeSpecName "kube-api-access-m8d6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.149695 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f112ece8-c233-4f7d-87bb-4634c0bc3582" (UID: "f112ece8-c233-4f7d-87bb-4634c0bc3582"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.162869 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f112ece8-c233-4f7d-87bb-4634c0bc3582" (UID: "f112ece8-c233-4f7d-87bb-4634c0bc3582"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.179051 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f112ece8-c233-4f7d-87bb-4634c0bc3582" (UID: "f112ece8-c233-4f7d-87bb-4634c0bc3582"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.183892 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-config" (OuterVolumeSpecName: "config") pod "f112ece8-c233-4f7d-87bb-4634c0bc3582" (UID: "f112ece8-c233-4f7d-87bb-4634c0bc3582"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.220829 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.220867 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8d6z\" (UniqueName: \"kubernetes.io/projected/f112ece8-c233-4f7d-87bb-4634c0bc3582-kube-api-access-m8d6z\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.220881 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.220896 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.220908 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f112ece8-c233-4f7d-87bb-4634c0bc3582-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.348533 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" event={"ID":"f112ece8-c233-4f7d-87bb-4634c0bc3582","Type":"ContainerDied","Data":"929954f2356f3e073ecf58ac034bfb3fa8f20081f5efdf108c4fae3de5d12734"} Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.348593 4971 scope.go:117] "RemoveContainer" containerID="b76a430ee503de1488cbd512a2d5fd22e4fa28a3bf88d405161d6a91caf45db2" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.348664 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d9d566f97-dtk5g" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.370055 4971 scope.go:117] "RemoveContainer" containerID="ca2565998fcf3679734c3ac012587b4b187a5c511b9af022d5a0693095c51c31" Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.390503 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9d566f97-dtk5g"] Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.412472 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d9d566f97-dtk5g"] Mar 20 08:53:25 crc kubenswrapper[4971]: I0320 08:53:25.911902 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.022912 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.023393 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1448e032-bee2-4ebb-b6bc-9b62539fea32" containerName="nova-scheduler-scheduler" containerID="cri-o://1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51" gracePeriod=30 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.045667 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.045949 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerName="nova-metadata-log" containerID="cri-o://18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472" gracePeriod=30 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.046094 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" 
containerName="nova-metadata-metadata" containerID="cri-o://248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef" gracePeriod=30 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.058949 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.059211 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" containerName="nova-cell0-conductor-conductor" containerID="cri-o://826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62" gracePeriod=30 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.071916 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.072259 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-api" containerID="cri-o://d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c" gracePeriod=30 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.072581 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-log" containerID="cri-o://002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562" gracePeriod=30 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.082161 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.082357 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4581cde5-cdb3-4215-b51a-17504bb3fc30" containerName="nova-cell1-novncproxy-novncproxy" 
containerID="cri-o://4f7c12edf0bcfcac83fe9e0f80c281b7f43c8af30f095eb9ca01ba4a7b85bdc1" gracePeriod=30 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.134926 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.135146 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="23472bad-7f81-42e7-9878-b2cbc945508a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39" gracePeriod=30 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.361755 4971 generic.go:334] "Generic (PLEG): container finished" podID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerID="18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472" exitCode=143 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.361847 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42569610-bca3-4909-ad7c-0ad057f7cfe7","Type":"ContainerDied","Data":"18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472"} Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.365111 4971 generic.go:334] "Generic (PLEG): container finished" podID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerID="002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562" exitCode=143 Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.365154 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b8ee1bc-f59c-49b5-8679-3411e0d64db3","Type":"ContainerDied","Data":"002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562"} Mar 20 08:53:26 crc kubenswrapper[4971]: I0320 08:53:26.742834 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f112ece8-c233-4f7d-87bb-4634c0bc3582" path="/var/lib/kubelet/pods/f112ece8-c233-4f7d-87bb-4634c0bc3582/volumes" Mar 20 
08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.403677 4971 generic.go:334] "Generic (PLEG): container finished" podID="4581cde5-cdb3-4215-b51a-17504bb3fc30" containerID="4f7c12edf0bcfcac83fe9e0f80c281b7f43c8af30f095eb9ca01ba4a7b85bdc1" exitCode=0 Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.403910 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4581cde5-cdb3-4215-b51a-17504bb3fc30","Type":"ContainerDied","Data":"4f7c12edf0bcfcac83fe9e0f80c281b7f43c8af30f095eb9ca01ba4a7b85bdc1"} Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.608140 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.773935 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-combined-ca-bundle\") pod \"4581cde5-cdb3-4215-b51a-17504bb3fc30\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.774346 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpb59\" (UniqueName: \"kubernetes.io/projected/4581cde5-cdb3-4215-b51a-17504bb3fc30-kube-api-access-bpb59\") pod \"4581cde5-cdb3-4215-b51a-17504bb3fc30\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.774382 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-config-data\") pod \"4581cde5-cdb3-4215-b51a-17504bb3fc30\" (UID: \"4581cde5-cdb3-4215-b51a-17504bb3fc30\") " Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.793209 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4581cde5-cdb3-4215-b51a-17504bb3fc30-kube-api-access-bpb59" (OuterVolumeSpecName: "kube-api-access-bpb59") pod "4581cde5-cdb3-4215-b51a-17504bb3fc30" (UID: "4581cde5-cdb3-4215-b51a-17504bb3fc30"). InnerVolumeSpecName "kube-api-access-bpb59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.801304 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-config-data" (OuterVolumeSpecName: "config-data") pod "4581cde5-cdb3-4215-b51a-17504bb3fc30" (UID: "4581cde5-cdb3-4215-b51a-17504bb3fc30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.805197 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4581cde5-cdb3-4215-b51a-17504bb3fc30" (UID: "4581cde5-cdb3-4215-b51a-17504bb3fc30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.877295 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.877335 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpb59\" (UniqueName: \"kubernetes.io/projected/4581cde5-cdb3-4215-b51a-17504bb3fc30-kube-api-access-bpb59\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:27 crc kubenswrapper[4971]: I0320 08:53:27.877348 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581cde5-cdb3-4215-b51a-17504bb3fc30-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.082043 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.181494 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-config-data\") pod \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.182085 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w2nl\" (UniqueName: \"kubernetes.io/projected/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-kube-api-access-2w2nl\") pod \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.182388 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-combined-ca-bundle\") pod \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\" (UID: \"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4\") " Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.185439 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-kube-api-access-2w2nl" (OuterVolumeSpecName: "kube-api-access-2w2nl") pod "c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" (UID: "c1e23d5e-175f-4f85-a438-b6fc8ffd65e4"). InnerVolumeSpecName "kube-api-access-2w2nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.215305 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" (UID: "c1e23d5e-175f-4f85-a438-b6fc8ffd65e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.219270 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-config-data" (OuterVolumeSpecName: "config-data") pod "c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" (UID: "c1e23d5e-175f-4f85-a438-b6fc8ffd65e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.284686 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w2nl\" (UniqueName: \"kubernetes.io/projected/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-kube-api-access-2w2nl\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.284718 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.284728 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.413749 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.413711 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4581cde5-cdb3-4215-b51a-17504bb3fc30","Type":"ContainerDied","Data":"0e9e08fa35b1607cc954bb721c86c11951a8e92c8642ed3ec23f97f0bc4c4657"} Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.413951 4971 scope.go:117] "RemoveContainer" containerID="4f7c12edf0bcfcac83fe9e0f80c281b7f43c8af30f095eb9ca01ba4a7b85bdc1" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.415624 4971 generic.go:334] "Generic (PLEG): container finished" podID="c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" containerID="826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62" exitCode=0 Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.415669 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4","Type":"ContainerDied","Data":"826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62"} Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.415686 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.415699 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c1e23d5e-175f-4f85-a438-b6fc8ffd65e4","Type":"ContainerDied","Data":"8c727871ef74d61b8656ec699bd0c74869f50a4439464e3eec44ef4b20affa91"} Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.450026 4971 scope.go:117] "RemoveContainer" containerID="826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.461239 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.476516 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.495493 4971 scope.go:117] "RemoveContainer" containerID="826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62" Mar 20 08:53:28 crc kubenswrapper[4971]: E0320 08:53:28.506435 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62\": container with ID starting with 826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62 not found: ID does not exist" containerID="826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.506491 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62"} err="failed to get container status \"826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62\": rpc error: code = NotFound desc = could not find container \"826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62\": container with ID starting with 826c478f4fe50ed03dc4ea36ad1e5487b97c0d28bfe7a13b5c6eab4bc78f0b62 not found: ID does not exist" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.518720 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.528484 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.563657 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:53:28 crc kubenswrapper[4971]: E0320 08:53:28.564012 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerName="dnsmasq-dns" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.564028 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerName="dnsmasq-dns" Mar 20 08:53:28 crc kubenswrapper[4971]: E0320 08:53:28.564043 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4581cde5-cdb3-4215-b51a-17504bb3fc30" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.564050 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4581cde5-cdb3-4215-b51a-17504bb3fc30" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:53:28 crc kubenswrapper[4971]: E0320 08:53:28.564063 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" containerName="nova-cell0-conductor-conductor" Mar 20 08:53:28 crc 
kubenswrapper[4971]: I0320 08:53:28.564069 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" containerName="nova-cell0-conductor-conductor" Mar 20 08:53:28 crc kubenswrapper[4971]: E0320 08:53:28.564092 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerName="init" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.564098 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerName="init" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.564271 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f112ece8-c233-4f7d-87bb-4634c0bc3582" containerName="dnsmasq-dns" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.564293 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" containerName="nova-cell0-conductor-conductor" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.564306 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4581cde5-cdb3-4215-b51a-17504bb3fc30" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.564858 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.567102 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.574262 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.584709 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.585876 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.587695 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.593841 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.692953 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.693030 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4573bfc-d861-400d-bd97-475733d58617-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.693058 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4573bfc-d861-400d-bd97-475733d58617-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.693279 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n4wb\" (UniqueName: \"kubernetes.io/projected/d4573bfc-d861-400d-bd97-475733d58617-kube-api-access-2n4wb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 
08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.693444 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.693554 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrf8\" (UniqueName: \"kubernetes.io/projected/77ccca0d-9032-4c04-9074-3711401b473c-kube-api-access-ltrf8\") pod \"nova-cell0-conductor-0\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.744522 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4581cde5-cdb3-4215-b51a-17504bb3fc30" path="/var/lib/kubelet/pods/4581cde5-cdb3-4215-b51a-17504bb3fc30/volumes" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.745064 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e23d5e-175f-4f85-a438-b6fc8ffd65e4" path="/var/lib/kubelet/pods/c1e23d5e-175f-4f85-a438-b6fc8ffd65e4/volumes" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.795749 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.795837 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltrf8\" (UniqueName: \"kubernetes.io/projected/77ccca0d-9032-4c04-9074-3711401b473c-kube-api-access-ltrf8\") pod \"nova-cell0-conductor-0\" (UID: 
\"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.795907 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.795936 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4573bfc-d861-400d-bd97-475733d58617-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.795960 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4573bfc-d861-400d-bd97-475733d58617-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.796028 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n4wb\" (UniqueName: \"kubernetes.io/projected/d4573bfc-d861-400d-bd97-475733d58617-kube-api-access-2n4wb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.802011 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4573bfc-d861-400d-bd97-475733d58617-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc 
kubenswrapper[4971]: I0320 08:53:28.815502 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4573bfc-d861-400d-bd97-475733d58617-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.817022 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.817827 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n4wb\" (UniqueName: \"kubernetes.io/projected/d4573bfc-d861-400d-bd97-475733d58617-kube-api-access-2n4wb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4573bfc-d861-400d-bd97-475733d58617\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.826885 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.830787 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltrf8\" (UniqueName: \"kubernetes.io/projected/77ccca0d-9032-4c04-9074-3711401b473c-kube-api-access-ltrf8\") pod \"nova-cell0-conductor-0\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.885868 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:28 crc kubenswrapper[4971]: I0320 08:53:28.902138 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:29 crc kubenswrapper[4971]: E0320 08:53:29.143193 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:53:29 crc kubenswrapper[4971]: E0320 08:53:29.144378 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:53:29 crc kubenswrapper[4971]: E0320 08:53:29.145646 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:53:29 crc kubenswrapper[4971]: E0320 08:53:29.145675 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="23472bad-7f81-42e7-9878-b2cbc945508a" containerName="nova-cell1-conductor-conductor" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.354323 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:53:29 crc 
kubenswrapper[4971]: W0320 08:53:29.360050 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4573bfc_d861_400d_bd97_475733d58617.slice/crio-3607a5b6dd5deca1b964c592a39fa610671e818371936c9d376d6baa1d3c4177 WatchSource:0}: Error finding container 3607a5b6dd5deca1b964c592a39fa610671e818371936c9d376d6baa1d3c4177: Status 404 returned error can't find the container with id 3607a5b6dd5deca1b964c592a39fa610671e818371936c9d376d6baa1d3c4177 Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.393555 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.438765 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d4573bfc-d861-400d-bd97-475733d58617","Type":"ContainerStarted","Data":"3607a5b6dd5deca1b964c592a39fa610671e818371936c9d376d6baa1d3c4177"} Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.447844 4971 generic.go:334] "Generic (PLEG): container finished" podID="1448e032-bee2-4ebb-b6bc-9b62539fea32" containerID="1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51" exitCode=0 Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.447883 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1448e032-bee2-4ebb-b6bc-9b62539fea32","Type":"ContainerDied","Data":"1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51"} Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.447909 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1448e032-bee2-4ebb-b6bc-9b62539fea32","Type":"ContainerDied","Data":"2cb6ec783813ce6a0dc9aa7d47c53ee959ccd88021583a73df9c07cb9b8bbcfd"} Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.447926 4971 scope.go:117] "RemoveContainer" 
containerID="1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.448086 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.482201 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.505976 4971 scope.go:117] "RemoveContainer" containerID="1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51" Mar 20 08:53:29 crc kubenswrapper[4971]: E0320 08:53:29.506377 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51\": container with ID starting with 1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51 not found: ID does not exist" containerID="1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.506407 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51"} err="failed to get container status \"1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51\": rpc error: code = NotFound desc = could not find container \"1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51\": container with ID starting with 1bb783d5a44fa970c0622ae5ed68bff208b5e5ef186250d261674fdd233a7d51 not found: ID does not exist" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.516498 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-config-data\") pod \"1448e032-bee2-4ebb-b6bc-9b62539fea32\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " 
Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.516640 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj92n\" (UniqueName: \"kubernetes.io/projected/1448e032-bee2-4ebb-b6bc-9b62539fea32-kube-api-access-pj92n\") pod \"1448e032-bee2-4ebb-b6bc-9b62539fea32\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.516811 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-combined-ca-bundle\") pod \"1448e032-bee2-4ebb-b6bc-9b62539fea32\" (UID: \"1448e032-bee2-4ebb-b6bc-9b62539fea32\") " Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.529405 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1448e032-bee2-4ebb-b6bc-9b62539fea32-kube-api-access-pj92n" (OuterVolumeSpecName: "kube-api-access-pj92n") pod "1448e032-bee2-4ebb-b6bc-9b62539fea32" (UID: "1448e032-bee2-4ebb-b6bc-9b62539fea32"). InnerVolumeSpecName "kube-api-access-pj92n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.550916 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-config-data" (OuterVolumeSpecName: "config-data") pod "1448e032-bee2-4ebb-b6bc-9b62539fea32" (UID: "1448e032-bee2-4ebb-b6bc-9b62539fea32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.565526 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1448e032-bee2-4ebb-b6bc-9b62539fea32" (UID: "1448e032-bee2-4ebb-b6bc-9b62539fea32"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.619337 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.619577 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj92n\" (UniqueName: \"kubernetes.io/projected/1448e032-bee2-4ebb-b6bc-9b62539fea32-kube-api-access-pj92n\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.619586 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1448e032-bee2-4ebb-b6bc-9b62539fea32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.890144 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.894004 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.927854 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.950785 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:53:29 crc kubenswrapper[4971]: E0320 08:53:29.951223 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerName="nova-metadata-log" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.951236 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerName="nova-metadata-log" Mar 20 08:53:29 crc kubenswrapper[4971]: E0320 08:53:29.951250 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1448e032-bee2-4ebb-b6bc-9b62539fea32" containerName="nova-scheduler-scheduler" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.951256 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1448e032-bee2-4ebb-b6bc-9b62539fea32" containerName="nova-scheduler-scheduler" Mar 20 08:53:29 crc kubenswrapper[4971]: E0320 08:53:29.951274 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerName="nova-metadata-metadata" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.951280 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerName="nova-metadata-metadata" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.951429 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1448e032-bee2-4ebb-b6bc-9b62539fea32" containerName="nova-scheduler-scheduler" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.951447 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" 
containerName="nova-metadata-log" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.951459 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerName="nova-metadata-metadata" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.952047 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.954889 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:53:29 crc kubenswrapper[4971]: I0320 08:53:29.955051 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.029795 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-combined-ca-bundle\") pod \"42569610-bca3-4909-ad7c-0ad057f7cfe7\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.030166 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-config-data\") pod \"42569610-bca3-4909-ad7c-0ad057f7cfe7\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.030259 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b74jw\" (UniqueName: \"kubernetes.io/projected/42569610-bca3-4909-ad7c-0ad057f7cfe7-kube-api-access-b74jw\") pod \"42569610-bca3-4909-ad7c-0ad057f7cfe7\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.030316 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/42569610-bca3-4909-ad7c-0ad057f7cfe7-logs\") pod \"42569610-bca3-4909-ad7c-0ad057f7cfe7\" (UID: \"42569610-bca3-4909-ad7c-0ad057f7cfe7\") " Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.031217 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42569610-bca3-4909-ad7c-0ad057f7cfe7-logs" (OuterVolumeSpecName: "logs") pod "42569610-bca3-4909-ad7c-0ad057f7cfe7" (UID: "42569610-bca3-4909-ad7c-0ad057f7cfe7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.042468 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42569610-bca3-4909-ad7c-0ad057f7cfe7-kube-api-access-b74jw" (OuterVolumeSpecName: "kube-api-access-b74jw") pod "42569610-bca3-4909-ad7c-0ad057f7cfe7" (UID: "42569610-bca3-4909-ad7c-0ad057f7cfe7"). InnerVolumeSpecName "kube-api-access-b74jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.057849 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.078391 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42569610-bca3-4909-ad7c-0ad057f7cfe7" (UID: "42569610-bca3-4909-ad7c-0ad057f7cfe7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.102713 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-config-data" (OuterVolumeSpecName: "config-data") pod "42569610-bca3-4909-ad7c-0ad057f7cfe7" (UID: "42569610-bca3-4909-ad7c-0ad057f7cfe7"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.133045 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.133259 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ztnj\" (UniqueName: \"kubernetes.io/projected/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-kube-api-access-8ztnj\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.133351 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-config-data\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.133438 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.133458 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42569610-bca3-4909-ad7c-0ad057f7cfe7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.133468 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b74jw\" (UniqueName: 
\"kubernetes.io/projected/42569610-bca3-4909-ad7c-0ad057f7cfe7-kube-api-access-b74jw\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.133478 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42569610-bca3-4909-ad7c-0ad057f7cfe7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.234615 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-logs\") pod \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.234686 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5l6l\" (UniqueName: \"kubernetes.io/projected/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-kube-api-access-z5l6l\") pod \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.234855 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-combined-ca-bundle\") pod \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.234890 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-config-data\") pod \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\" (UID: \"8b8ee1bc-f59c-49b5-8679-3411e0d64db3\") " Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.235024 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-logs" 
(OuterVolumeSpecName: "logs") pod "8b8ee1bc-f59c-49b5-8679-3411e0d64db3" (UID: "8b8ee1bc-f59c-49b5-8679-3411e0d64db3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.235119 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.235200 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ztnj\" (UniqueName: \"kubernetes.io/projected/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-kube-api-access-8ztnj\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.235236 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-config-data\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.235294 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.239075 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.243398 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-config-data\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.257400 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ztnj\" (UniqueName: \"kubernetes.io/projected/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-kube-api-access-8ztnj\") pod \"nova-scheduler-0\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.259909 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-kube-api-access-z5l6l" (OuterVolumeSpecName: "kube-api-access-z5l6l") pod "8b8ee1bc-f59c-49b5-8679-3411e0d64db3" (UID: "8b8ee1bc-f59c-49b5-8679-3411e0d64db3"). InnerVolumeSpecName "kube-api-access-z5l6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.267841 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b8ee1bc-f59c-49b5-8679-3411e0d64db3" (UID: "8b8ee1bc-f59c-49b5-8679-3411e0d64db3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.273675 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-config-data" (OuterVolumeSpecName: "config-data") pod "8b8ee1bc-f59c-49b5-8679-3411e0d64db3" (UID: "8b8ee1bc-f59c-49b5-8679-3411e0d64db3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.284083 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.336997 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5l6l\" (UniqueName: \"kubernetes.io/projected/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-kube-api-access-z5l6l\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.337025 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.337038 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8ee1bc-f59c-49b5-8679-3411e0d64db3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.459228 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d4573bfc-d861-400d-bd97-475733d58617","Type":"ContainerStarted","Data":"2adfd6ef32ea4e1e619aec17fc22a86ed1331f367080b089224084672f9e89d0"} Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.469837 4971 generic.go:334] "Generic (PLEG): container finished" podID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerID="d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c" exitCode=0 Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.469928 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.469972 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b8ee1bc-f59c-49b5-8679-3411e0d64db3","Type":"ContainerDied","Data":"d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c"} Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.470004 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b8ee1bc-f59c-49b5-8679-3411e0d64db3","Type":"ContainerDied","Data":"a9d07ff0db1ee1f9121dd29989916e97980c543bea6fc97d9582258fc996aa1b"} Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.470021 4971 scope.go:117] "RemoveContainer" containerID="d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.476734 4971 generic.go:334] "Generic (PLEG): container finished" podID="42569610-bca3-4909-ad7c-0ad057f7cfe7" containerID="248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef" exitCode=0 Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.476810 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42569610-bca3-4909-ad7c-0ad057f7cfe7","Type":"ContainerDied","Data":"248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef"} Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.476839 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"42569610-bca3-4909-ad7c-0ad057f7cfe7","Type":"ContainerDied","Data":"61bd637926349633be742b9ee719ea7a13792d9fb8c467ddd91cce705de359b7"} Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.477215 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.495882 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"77ccca0d-9032-4c04-9074-3711401b473c","Type":"ContainerStarted","Data":"c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652"} Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.495927 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"77ccca0d-9032-4c04-9074-3711401b473c","Type":"ContainerStarted","Data":"783512d5f25ffdd9c564022e0035cd1fab58f972cc8982fae7745ad3a00d65e9"} Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.495940 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.496192 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.496172591 podStartE2EDuration="2.496172591s" podCreationTimestamp="2026-03-20 08:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:53:30.48660038 +0000 UTC m=+7432.466474518" watchObservedRunningTime="2026-03-20 08:53:30.496172591 +0000 UTC m=+7432.476046729" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.508524 4971 scope.go:117] "RemoveContainer" containerID="002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.533203 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.533180589 podStartE2EDuration="2.533180589s" podCreationTimestamp="2026-03-20 08:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 08:53:30.522410968 +0000 UTC m=+7432.502285106" watchObservedRunningTime="2026-03-20 08:53:30.533180589 +0000 UTC m=+7432.513054737" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.575656 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.605697 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.613744 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.628642 4971 scope.go:117] "RemoveContainer" containerID="d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c" Mar 20 08:53:30 crc kubenswrapper[4971]: E0320 08:53:30.631050 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c\": container with ID starting with d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c not found: ID does not exist" containerID="d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.631152 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c"} err="failed to get container status \"d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c\": rpc error: code = NotFound desc = could not find container \"d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c\": container with ID starting with d44b83d1e9043bf165e9007af7fe2a791004b924391a249c5c2abdfa5435ca3c not found: ID does not exist" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.631226 4971 scope.go:117] "RemoveContainer" 
containerID="002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562" Mar 20 08:53:30 crc kubenswrapper[4971]: E0320 08:53:30.636843 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562\": container with ID starting with 002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562 not found: ID does not exist" containerID="002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.636879 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562"} err="failed to get container status \"002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562\": rpc error: code = NotFound desc = could not find container \"002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562\": container with ID starting with 002b18a4cbf209717c99cf48e1266bf53e0e4fa084810a4684e789b2d1032562 not found: ID does not exist" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.636906 4971 scope.go:117] "RemoveContainer" containerID="248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.640205 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.659544 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: E0320 08:53:30.660000 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-log" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.660019 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" 
containerName="nova-api-log" Mar 20 08:53:30 crc kubenswrapper[4971]: E0320 08:53:30.660045 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-api" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.660052 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-api" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.660201 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-api" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.660218 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" containerName="nova-api-log" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.661141 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.663126 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.670727 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.693844 4971 scope.go:117] "RemoveContainer" containerID="18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.699531 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.707497 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.713326 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.722804 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.730135 4971 scope.go:117] "RemoveContainer" containerID="248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef" Mar 20 08:53:30 crc kubenswrapper[4971]: E0320 08:53:30.732381 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef\": container with ID starting with 248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef not found: ID does not exist" containerID="248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.732450 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef"} err="failed to get container status \"248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef\": rpc error: code = NotFound desc = could not find container \"248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef\": container with ID starting with 248b2194ee601bc6137794b73597702de74ad6edf54acb4b3be4732e29f966ef not found: ID does not exist" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.732479 4971 scope.go:117] "RemoveContainer" containerID="18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472" Mar 20 08:53:30 crc kubenswrapper[4971]: E0320 08:53:30.733761 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472\": container with ID starting with 18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472 not found: ID does not exist" containerID="18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.733812 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472"} err="failed to get container status \"18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472\": rpc error: code = NotFound desc = could not find container \"18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472\": container with ID starting with 18abf822562d7202715250e9b4313bf919b96054565cfe87e96f7333dac34472 not found: ID does not exist" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.742371 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e0393d-4241-455b-a857-3f4e8536576b-logs\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.742462 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbxwz\" (UniqueName: \"kubernetes.io/projected/f4e0393d-4241-455b-a857-3f4e8536576b-kube-api-access-lbxwz\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.742520 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 
crc kubenswrapper[4971]: I0320 08:53:30.742536 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-config-data\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.742706 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1448e032-bee2-4ebb-b6bc-9b62539fea32" path="/var/lib/kubelet/pods/1448e032-bee2-4ebb-b6bc-9b62539fea32/volumes" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.746908 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42569610-bca3-4909-ad7c-0ad057f7cfe7" path="/var/lib/kubelet/pods/42569610-bca3-4909-ad7c-0ad057f7cfe7/volumes" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.747531 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b8ee1bc-f59c-49b5-8679-3411e0d64db3" path="/var/lib/kubelet/pods/8b8ee1bc-f59c-49b5-8679-3411e0d64db3/volumes" Mar 20 08:53:30 crc kubenswrapper[4971]: W0320 08:53:30.784391 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29b84ad8_d074_4c35_a0dd_a0b491ac3e45.slice/crio-38c6b76be7923f31036e49e7fdd4ecf2bd38ab11f323745804bb47da0cd8800c WatchSource:0}: Error finding container 38c6b76be7923f31036e49e7fdd4ecf2bd38ab11f323745804bb47da0cd8800c: Status 404 returned error can't find the container with id 38c6b76be7923f31036e49e7fdd4ecf2bd38ab11f323745804bb47da0cd8800c Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.789627 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.857617 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-config-data\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.857678 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-logs\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.857701 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbxwz\" (UniqueName: \"kubernetes.io/projected/f4e0393d-4241-455b-a857-3f4e8536576b-kube-api-access-lbxwz\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.857755 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.857772 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-config-data\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.857788 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8q7\" (UniqueName: \"kubernetes.io/projected/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-kube-api-access-8b8q7\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " 
pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.857829 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e0393d-4241-455b-a857-3f4e8536576b-logs\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.857862 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.858511 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e0393d-4241-455b-a857-3f4e8536576b-logs\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.864805 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-config-data\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.864897 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.873879 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbxwz\" (UniqueName: 
\"kubernetes.io/projected/f4e0393d-4241-455b-a857-3f4e8536576b-kube-api-access-lbxwz\") pod \"nova-api-0\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " pod="openstack/nova-api-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.960765 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-config-data\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.961058 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-logs\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.961126 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8q7\" (UniqueName: \"kubernetes.io/projected/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-kube-api-access-8b8q7\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.961189 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.962022 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-logs\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc 
kubenswrapper[4971]: I0320 08:53:30.977909 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.978044 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-config-data\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:30 crc kubenswrapper[4971]: I0320 08:53:30.981933 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8q7\" (UniqueName: \"kubernetes.io/projected/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-kube-api-access-8b8q7\") pod \"nova-metadata-0\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " pod="openstack/nova-metadata-0" Mar 20 08:53:31 crc kubenswrapper[4971]: I0320 08:53:31.008009 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:53:31 crc kubenswrapper[4971]: I0320 08:53:31.053813 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:53:31 crc kubenswrapper[4971]: I0320 08:53:31.519926 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29b84ad8-d074-4c35-a0dd-a0b491ac3e45","Type":"ContainerStarted","Data":"7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239"} Mar 20 08:53:31 crc kubenswrapper[4971]: I0320 08:53:31.520461 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29b84ad8-d074-4c35-a0dd-a0b491ac3e45","Type":"ContainerStarted","Data":"38c6b76be7923f31036e49e7fdd4ecf2bd38ab11f323745804bb47da0cd8800c"} Mar 20 08:53:31 crc kubenswrapper[4971]: I0320 08:53:31.536553 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:53:31 crc kubenswrapper[4971]: W0320 08:53:31.538937 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4e0393d_4241_455b_a857_3f4e8536576b.slice/crio-342a103ce3d6aba9d44d3af04671f35bed74ea7e36d9c75e07bc8831e3f50a45 WatchSource:0}: Error finding container 342a103ce3d6aba9d44d3af04671f35bed74ea7e36d9c75e07bc8831e3f50a45: Status 404 returned error can't find the container with id 342a103ce3d6aba9d44d3af04671f35bed74ea7e36d9c75e07bc8831e3f50a45 Mar 20 08:53:31 crc kubenswrapper[4971]: I0320 08:53:31.543024 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.543003309 podStartE2EDuration="2.543003309s" podCreationTimestamp="2026-03-20 08:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:53:31.539235501 +0000 UTC m=+7433.519109639" watchObservedRunningTime="2026-03-20 08:53:31.543003309 +0000 UTC m=+7433.522877467" Mar 20 08:53:31 crc kubenswrapper[4971]: I0320 08:53:31.599693 4971 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:53:31 crc kubenswrapper[4971]: W0320 08:53:31.614783 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecdcaee4_8a8d_4f0f_94de_fe2082e55ead.slice/crio-f9add17e4401ee21eb0cdc536e7ebf45c08ccfca2850ddac53740419a8a589fa WatchSource:0}: Error finding container f9add17e4401ee21eb0cdc536e7ebf45c08ccfca2850ddac53740419a8a589fa: Status 404 returned error can't find the container with id f9add17e4401ee21eb0cdc536e7ebf45c08ccfca2850ddac53740419a8a589fa Mar 20 08:53:32 crc kubenswrapper[4971]: I0320 08:53:32.529966 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead","Type":"ContainerStarted","Data":"a754c0876c0bb879fece283bb1842c594d8bf635ee792749cff6fcae64503153"} Mar 20 08:53:32 crc kubenswrapper[4971]: I0320 08:53:32.530506 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead","Type":"ContainerStarted","Data":"f03c934f8e3cab9668e02a35a1ff1b598209096c238dc5e2b9041ebfa5a60f51"} Mar 20 08:53:32 crc kubenswrapper[4971]: I0320 08:53:32.530516 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead","Type":"ContainerStarted","Data":"f9add17e4401ee21eb0cdc536e7ebf45c08ccfca2850ddac53740419a8a589fa"} Mar 20 08:53:32 crc kubenswrapper[4971]: I0320 08:53:32.533185 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4e0393d-4241-455b-a857-3f4e8536576b","Type":"ContainerStarted","Data":"069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca"} Mar 20 08:53:32 crc kubenswrapper[4971]: I0320 08:53:32.533211 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f4e0393d-4241-455b-a857-3f4e8536576b","Type":"ContainerStarted","Data":"2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c"} Mar 20 08:53:32 crc kubenswrapper[4971]: I0320 08:53:32.533222 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4e0393d-4241-455b-a857-3f4e8536576b","Type":"ContainerStarted","Data":"342a103ce3d6aba9d44d3af04671f35bed74ea7e36d9c75e07bc8831e3f50a45"} Mar 20 08:53:32 crc kubenswrapper[4971]: I0320 08:53:32.582619 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.582578137 podStartE2EDuration="2.582578137s" podCreationTimestamp="2026-03-20 08:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:53:32.55976943 +0000 UTC m=+7434.539643568" watchObservedRunningTime="2026-03-20 08:53:32.582578137 +0000 UTC m=+7434.562452275" Mar 20 08:53:32 crc kubenswrapper[4971]: I0320 08:53:32.588142 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.588128272 podStartE2EDuration="2.588128272s" podCreationTimestamp="2026-03-20 08:53:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:53:32.582435593 +0000 UTC m=+7434.562309731" watchObservedRunningTime="2026-03-20 08:53:32.588128272 +0000 UTC m=+7434.568002410" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.513440 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.543717 4971 generic.go:334] "Generic (PLEG): container finished" podID="23472bad-7f81-42e7-9878-b2cbc945508a" containerID="63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39" exitCode=0 Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.544029 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23472bad-7f81-42e7-9878-b2cbc945508a","Type":"ContainerDied","Data":"63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39"} Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.544076 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23472bad-7f81-42e7-9878-b2cbc945508a","Type":"ContainerDied","Data":"691802c63e6a6fb32b88ee4a533c29858d2c9c515ed64af9d7b607e1a0e2d695"} Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.544095 4971 scope.go:117] "RemoveContainer" containerID="63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.544907 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.590858 4971 scope.go:117] "RemoveContainer" containerID="63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39" Mar 20 08:53:33 crc kubenswrapper[4971]: E0320 08:53:33.592337 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39\": container with ID starting with 63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39 not found: ID does not exist" containerID="63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.592379 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39"} err="failed to get container status \"63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39\": rpc error: code = NotFound desc = could not find container \"63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39\": container with ID starting with 63d452ef4e3ac2efb555ef163ff51d3ca8589f65d0913c80c5580273babf5f39 not found: ID does not exist" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.611957 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-config-data\") pod \"23472bad-7f81-42e7-9878-b2cbc945508a\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.612107 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67c5f\" (UniqueName: \"kubernetes.io/projected/23472bad-7f81-42e7-9878-b2cbc945508a-kube-api-access-67c5f\") pod \"23472bad-7f81-42e7-9878-b2cbc945508a\" (UID: 
\"23472bad-7f81-42e7-9878-b2cbc945508a\") " Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.612133 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-combined-ca-bundle\") pod \"23472bad-7f81-42e7-9878-b2cbc945508a\" (UID: \"23472bad-7f81-42e7-9878-b2cbc945508a\") " Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.630161 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23472bad-7f81-42e7-9878-b2cbc945508a-kube-api-access-67c5f" (OuterVolumeSpecName: "kube-api-access-67c5f") pod "23472bad-7f81-42e7-9878-b2cbc945508a" (UID: "23472bad-7f81-42e7-9878-b2cbc945508a"). InnerVolumeSpecName "kube-api-access-67c5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.637562 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-config-data" (OuterVolumeSpecName: "config-data") pod "23472bad-7f81-42e7-9878-b2cbc945508a" (UID: "23472bad-7f81-42e7-9878-b2cbc945508a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.657088 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23472bad-7f81-42e7-9878-b2cbc945508a" (UID: "23472bad-7f81-42e7-9878-b2cbc945508a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.713757 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67c5f\" (UniqueName: \"kubernetes.io/projected/23472bad-7f81-42e7-9878-b2cbc945508a-kube-api-access-67c5f\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.713786 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.713795 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23472bad-7f81-42e7-9878-b2cbc945508a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.886114 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.906093 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.916094 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.952002 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:53:33 crc kubenswrapper[4971]: E0320 08:53:33.952559 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23472bad-7f81-42e7-9878-b2cbc945508a" containerName="nova-cell1-conductor-conductor" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.952586 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="23472bad-7f81-42e7-9878-b2cbc945508a" containerName="nova-cell1-conductor-conductor" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.952842 
4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="23472bad-7f81-42e7-9878-b2cbc945508a" containerName="nova-cell1-conductor-conductor" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.953598 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.956048 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 08:53:33 crc kubenswrapper[4971]: I0320 08:53:33.963704 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.122114 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.122238 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.122458 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67vvl\" (UniqueName: \"kubernetes.io/projected/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-kube-api-access-67vvl\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.224964 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.225093 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.225170 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67vvl\" (UniqueName: \"kubernetes.io/projected/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-kube-api-access-67vvl\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.230736 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.231334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.247945 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67vvl\" (UniqueName: 
\"kubernetes.io/projected/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-kube-api-access-67vvl\") pod \"nova-cell1-conductor-0\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.270051 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:34 crc kubenswrapper[4971]: W0320 08:53:34.745214 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d18a1e5_a77f_4c5f_a2f3_be899e4c702d.slice/crio-fcaa96fec80dd83aa17cf6fc13ebed7bdbca6d37cd81b45cacadbd631166ca68 WatchSource:0}: Error finding container fcaa96fec80dd83aa17cf6fc13ebed7bdbca6d37cd81b45cacadbd631166ca68: Status 404 returned error can't find the container with id fcaa96fec80dd83aa17cf6fc13ebed7bdbca6d37cd81b45cacadbd631166ca68 Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.753861 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23472bad-7f81-42e7-9878-b2cbc945508a" path="/var/lib/kubelet/pods/23472bad-7f81-42e7-9878-b2cbc945508a/volumes" Mar 20 08:53:34 crc kubenswrapper[4971]: I0320 08:53:34.754752 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:53:35 crc kubenswrapper[4971]: I0320 08:53:35.284622 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 08:53:35 crc kubenswrapper[4971]: I0320 08:53:35.564045 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d","Type":"ContainerStarted","Data":"cc0310d6f80ae78b6a488ab8385070f8bac2cd555c6aa72a5afb7fb64cbc727b"} Mar 20 08:53:35 crc kubenswrapper[4971]: I0320 08:53:35.564114 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d","Type":"ContainerStarted","Data":"fcaa96fec80dd83aa17cf6fc13ebed7bdbca6d37cd81b45cacadbd631166ca68"} Mar 20 08:53:35 crc kubenswrapper[4971]: I0320 08:53:35.564163 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:35 crc kubenswrapper[4971]: I0320 08:53:35.605326 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.60530356 podStartE2EDuration="2.60530356s" podCreationTimestamp="2026-03-20 08:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:53:35.592685259 +0000 UTC m=+7437.572559457" watchObservedRunningTime="2026-03-20 08:53:35.60530356 +0000 UTC m=+7437.585177718" Mar 20 08:53:38 crc kubenswrapper[4971]: I0320 08:53:38.887731 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:38 crc kubenswrapper[4971]: I0320 08:53:38.908465 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:38 crc kubenswrapper[4971]: I0320 08:53:38.939939 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 08:53:39 crc kubenswrapper[4971]: I0320 08:53:39.305663 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 08:53:39 crc kubenswrapper[4971]: I0320 08:53:39.625353 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:53:40 crc kubenswrapper[4971]: I0320 08:53:40.285897 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 08:53:40 crc kubenswrapper[4971]: I0320 
08:53:40.322545 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 08:53:40 crc kubenswrapper[4971]: I0320 08:53:40.710174 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 08:53:41 crc kubenswrapper[4971]: I0320 08:53:41.009286 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:53:41 crc kubenswrapper[4971]: I0320 08:53:41.009772 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:53:41 crc kubenswrapper[4971]: I0320 08:53:41.055180 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 08:53:41 crc kubenswrapper[4971]: I0320 08:53:41.055252 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 08:53:42 crc kubenswrapper[4971]: I0320 08:53:42.097899 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.147:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:53:42 crc kubenswrapper[4971]: I0320 08:53:42.403158 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.146:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:53:42 crc kubenswrapper[4971]: I0320 08:53:42.486140 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.146:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:53:42 crc kubenswrapper[4971]: I0320 08:53:42.486951 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.147:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.541202 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.545454 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.552743 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.558319 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.593346 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.593496 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.593682 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qnjr\" (UniqueName: \"kubernetes.io/projected/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-kube-api-access-5qnjr\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.593739 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-scripts\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.593782 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.593916 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.696182 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.696292 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qnjr\" 
(UniqueName: \"kubernetes.io/projected/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-kube-api-access-5qnjr\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.696332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-scripts\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.696361 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.696388 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.696416 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.697009 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " 
pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.703009 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-scripts\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.703355 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.704061 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.705954 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.720189 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qnjr\" (UniqueName: \"kubernetes.io/projected/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-kube-api-access-5qnjr\") pod \"cinder-scheduler-0\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " pod="openstack/cinder-scheduler-0" Mar 20 08:53:48 crc kubenswrapper[4971]: I0320 08:53:48.869752 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:53:49 crc kubenswrapper[4971]: I0320 08:53:49.008942 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:53:49 crc kubenswrapper[4971]: I0320 08:53:49.010024 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:53:49 crc kubenswrapper[4971]: I0320 08:53:49.055585 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:53:49 crc kubenswrapper[4971]: I0320 08:53:49.055698 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:53:49 crc kubenswrapper[4971]: I0320 08:53:49.368931 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:53:49 crc kubenswrapper[4971]: W0320 08:53:49.372365 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf731eb88_b58b_4134_a9ec_6bd0d3be30cd.slice/crio-2b824cc886055453afddd9136bbd7d631c326f60b5208749e2abfa3a03d9a3e3 WatchSource:0}: Error finding container 2b824cc886055453afddd9136bbd7d631c326f60b5208749e2abfa3a03d9a3e3: Status 404 returned error can't find the container with id 2b824cc886055453afddd9136bbd7d631c326f60b5208749e2abfa3a03d9a3e3 Mar 20 08:53:49 crc kubenswrapper[4971]: I0320 08:53:49.736391 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f731eb88-b58b-4134-a9ec-6bd0d3be30cd","Type":"ContainerStarted","Data":"2b824cc886055453afddd9136bbd7d631c326f60b5208749e2abfa3a03d9a3e3"} Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.167188 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.167911 4971 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerName="cinder-api-log" containerID="cri-o://96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7" gracePeriod=30 Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.168010 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerName="cinder-api" containerID="cri-o://b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd" gracePeriod=30 Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.656015 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.657829 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.659922 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.667339 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673626 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673681 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " 
pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673703 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-sys\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673725 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673744 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/82e8fffd-a0f2-4591-9a54-ffc5227fac62-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673782 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673802 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 
crc kubenswrapper[4971]: I0320 08:53:50.673831 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673866 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljtb\" (UniqueName: \"kubernetes.io/projected/82e8fffd-a0f2-4591-9a54-ffc5227fac62-kube-api-access-tljtb\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673887 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673900 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673920 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc 
kubenswrapper[4971]: I0320 08:53:50.673936 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673972 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.673988 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-run\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.674007 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-dev\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.751831 4971 generic.go:334] "Generic (PLEG): container finished" podID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerID="96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7" exitCode=143 Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.751906 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"34b1bb1b-3d48-4177-80d9-62596eacee6b","Type":"ContainerDied","Data":"96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7"} Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.753742 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f731eb88-b58b-4134-a9ec-6bd0d3be30cd","Type":"ContainerStarted","Data":"7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5"} Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.774591 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.774698 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.774731 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-sys\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.774758 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 
08:53:50.774778 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/82e8fffd-a0f2-4591-9a54-ffc5227fac62-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.774829 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-sys\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.774905 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.774935 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775320 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775386 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775429 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775518 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tljtb\" (UniqueName: \"kubernetes.io/projected/82e8fffd-a0f2-4591-9a54-ffc5227fac62-kube-api-access-tljtb\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775565 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775571 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775658 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-lib-modules\") pod \"cinder-volume-volume1-0\" 
(UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775696 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775718 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775792 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775848 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775864 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: 
I0320 08:53:50.775902 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-run\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775935 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775952 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-dev\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775982 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-dev\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.775994 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.776131 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82e8fffd-a0f2-4591-9a54-ffc5227fac62-run\") pod 
\"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.783718 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.784562 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.784954 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/82e8fffd-a0f2-4591-9a54-ffc5227fac62-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.788046 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.793777 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e8fffd-a0f2-4591-9a54-ffc5227fac62-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.797008 
4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljtb\" (UniqueName: \"kubernetes.io/projected/82e8fffd-a0f2-4591-9a54-ffc5227fac62-kube-api-access-tljtb\") pod \"cinder-volume-volume1-0\" (UID: \"82e8fffd-a0f2-4591-9a54-ffc5227fac62\") " pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:50 crc kubenswrapper[4971]: I0320 08:53:50.989766 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.025028 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.028773 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.058280 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.061351 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.061464 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.069070 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.449870 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.451905 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.454824 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.470975 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.583163 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.596817 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-run\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.596871 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.596893 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.597049 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/747155f7-75a9-43bc-9726-f4fa4f618ee6-ceph\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " 
pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.597093 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.597647 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-scripts\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.597688 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-dev\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.597719 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.597761 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-sys\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.597923 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.597967 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.598010 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-lib-modules\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.598038 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-config-data\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.598069 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.598104 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zvxr\" 
(UniqueName: \"kubernetes.io/projected/747155f7-75a9-43bc-9726-f4fa4f618ee6-kube-api-access-7zvxr\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.598140 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699499 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-scripts\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699559 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-dev\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699579 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699625 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-sys\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: 
I0320 08:53:51.699710 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699691 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-dev\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699774 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699731 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699786 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-sys\") pod \"cinder-backup-0\" (UID: 
\"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699744 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699891 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-lib-modules\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699934 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-config-data\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699957 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.699975 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-lib-modules\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700016 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700019 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zvxr\" (UniqueName: \"kubernetes.io/projected/747155f7-75a9-43bc-9726-f4fa4f618ee6-kube-api-access-7zvxr\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700066 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700105 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-run\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700166 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700178 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " 
pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700231 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700244 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-run\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700304 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/747155f7-75a9-43bc-9726-f4fa4f618ee6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700326 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/747155f7-75a9-43bc-9726-f4fa4f618ee6-ceph\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.700352 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.707338 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-scripts\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.708745 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.709196 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/747155f7-75a9-43bc-9726-f4fa4f618ee6-ceph\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.709327 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.713514 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747155f7-75a9-43bc-9726-f4fa4f618ee6-config-data\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.741676 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zvxr\" (UniqueName: \"kubernetes.io/projected/747155f7-75a9-43bc-9726-f4fa4f618ee6-kube-api-access-7zvxr\") pod \"cinder-backup-0\" (UID: \"747155f7-75a9-43bc-9726-f4fa4f618ee6\") " pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 
08:53:51.775639 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"82e8fffd-a0f2-4591-9a54-ffc5227fac62","Type":"ContainerStarted","Data":"88861abb090caeb0294a55a46b95b7ea7da78878b1992338a0a4ab2a4b9fa8fb"} Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.781371 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f731eb88-b58b-4134-a9ec-6bd0d3be30cd","Type":"ContainerStarted","Data":"97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e"} Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.790914 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.877807 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.878156 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 08:53:51 crc kubenswrapper[4971]: I0320 08:53:51.898213 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.641055118 podStartE2EDuration="3.898193478s" podCreationTimestamp="2026-03-20 08:53:48 +0000 UTC" firstStartedPulling="2026-03-20 08:53:49.375989996 +0000 UTC m=+7451.355864134" lastFinishedPulling="2026-03-20 08:53:49.633128326 +0000 UTC m=+7451.613002494" observedRunningTime="2026-03-20 08:53:51.878312138 +0000 UTC m=+7453.858186276" watchObservedRunningTime="2026-03-20 08:53:51.898193478 +0000 UTC m=+7453.878067616" Mar 20 08:53:52 crc kubenswrapper[4971]: I0320 08:53:52.498574 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 20 08:53:52 crc kubenswrapper[4971]: I0320 08:53:52.791053 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"82e8fffd-a0f2-4591-9a54-ffc5227fac62","Type":"ContainerStarted","Data":"8d79a04742c6dd8f4d983e545023957ac817bff86e7f535d7317919d8b924cfe"} Mar 20 08:53:52 crc kubenswrapper[4971]: I0320 08:53:52.791684 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"82e8fffd-a0f2-4591-9a54-ffc5227fac62","Type":"ContainerStarted","Data":"5d4dc18f5a636ca9b6579b677f32267069a98b69c9c56e5afac785663e3944d9"} Mar 20 08:53:52 crc kubenswrapper[4971]: I0320 08:53:52.793318 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"747155f7-75a9-43bc-9726-f4fa4f618ee6","Type":"ContainerStarted","Data":"9fc4a905164066ea5b99e3de09a581b534f079dfd50a09cd0a61c11887ee7a2b"} Mar 20 08:53:52 crc kubenswrapper[4971]: I0320 08:53:52.819599 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.260893742 podStartE2EDuration="2.819580004s" podCreationTimestamp="2026-03-20 08:53:50 +0000 UTC" firstStartedPulling="2026-03-20 08:53:51.599129781 +0000 UTC m=+7453.579003919" lastFinishedPulling="2026-03-20 08:53:52.157816043 +0000 UTC m=+7454.137690181" observedRunningTime="2026-03-20 08:53:52.816115953 +0000 UTC m=+7454.795990121" watchObservedRunningTime="2026-03-20 08:53:52.819580004 +0000 UTC m=+7454.799454142" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.776842 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.803945 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"747155f7-75a9-43bc-9726-f4fa4f618ee6","Type":"ContainerStarted","Data":"03d1193b029a9a0ad366135c4c5bacb6465b1f062296451300966170a3b5d172"} Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.804010 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"747155f7-75a9-43bc-9726-f4fa4f618ee6","Type":"ContainerStarted","Data":"a4b7e341b94c603855cfe242b892c1fa5327a25394a7496011faa9ac796f70fa"} Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.813012 4971 generic.go:334] "Generic (PLEG): container finished" podID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerID="b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd" exitCode=0 Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.813532 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"34b1bb1b-3d48-4177-80d9-62596eacee6b","Type":"ContainerDied","Data":"b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd"} Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.813653 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"34b1bb1b-3d48-4177-80d9-62596eacee6b","Type":"ContainerDied","Data":"f3b8b6f49fb999293252bae176ed7d8a728f824c2e18f292ce8816286ab5fe69"} Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.813685 4971 scope.go:117] "RemoveContainer" containerID="b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.815168 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.858647 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.5761398939999998 podStartE2EDuration="2.858631468s" podCreationTimestamp="2026-03-20 08:53:51 +0000 UTC" firstStartedPulling="2026-03-20 08:53:52.475150239 +0000 UTC m=+7454.455024377" lastFinishedPulling="2026-03-20 08:53:52.757641803 +0000 UTC m=+7454.737515951" observedRunningTime="2026-03-20 08:53:53.843748418 +0000 UTC m=+7455.823622566" watchObservedRunningTime="2026-03-20 08:53:53.858631468 +0000 UTC m=+7455.838505606" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.864090 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-combined-ca-bundle\") pod \"34b1bb1b-3d48-4177-80d9-62596eacee6b\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.864246 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34b1bb1b-3d48-4177-80d9-62596eacee6b-etc-machine-id\") pod \"34b1bb1b-3d48-4177-80d9-62596eacee6b\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.864337 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-scripts\") pod \"34b1bb1b-3d48-4177-80d9-62596eacee6b\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.864428 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34b1bb1b-3d48-4177-80d9-62596eacee6b-logs\") pod 
\"34b1bb1b-3d48-4177-80d9-62596eacee6b\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.864701 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data-custom\") pod \"34b1bb1b-3d48-4177-80d9-62596eacee6b\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.864788 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znmqw\" (UniqueName: \"kubernetes.io/projected/34b1bb1b-3d48-4177-80d9-62596eacee6b-kube-api-access-znmqw\") pod \"34b1bb1b-3d48-4177-80d9-62596eacee6b\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.864805 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data\") pod \"34b1bb1b-3d48-4177-80d9-62596eacee6b\" (UID: \"34b1bb1b-3d48-4177-80d9-62596eacee6b\") " Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.866832 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34b1bb1b-3d48-4177-80d9-62596eacee6b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "34b1bb1b-3d48-4177-80d9-62596eacee6b" (UID: "34b1bb1b-3d48-4177-80d9-62596eacee6b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.875546 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b1bb1b-3d48-4177-80d9-62596eacee6b-logs" (OuterVolumeSpecName: "logs") pod "34b1bb1b-3d48-4177-80d9-62596eacee6b" (UID: "34b1bb1b-3d48-4177-80d9-62596eacee6b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.883044 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.893854 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-scripts" (OuterVolumeSpecName: "scripts") pod "34b1bb1b-3d48-4177-80d9-62596eacee6b" (UID: "34b1bb1b-3d48-4177-80d9-62596eacee6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.900765 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b1bb1b-3d48-4177-80d9-62596eacee6b-kube-api-access-znmqw" (OuterVolumeSpecName: "kube-api-access-znmqw") pod "34b1bb1b-3d48-4177-80d9-62596eacee6b" (UID: "34b1bb1b-3d48-4177-80d9-62596eacee6b"). InnerVolumeSpecName "kube-api-access-znmqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.908053 4971 scope.go:117] "RemoveContainer" containerID="96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.916049 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "34b1bb1b-3d48-4177-80d9-62596eacee6b" (UID: "34b1bb1b-3d48-4177-80d9-62596eacee6b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.969805 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34b1bb1b-3d48-4177-80d9-62596eacee6b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.969829 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.969839 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znmqw\" (UniqueName: \"kubernetes.io/projected/34b1bb1b-3d48-4177-80d9-62596eacee6b-kube-api-access-znmqw\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.969849 4971 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34b1bb1b-3d48-4177-80d9-62596eacee6b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.969857 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:53 crc kubenswrapper[4971]: I0320 08:53:53.996821 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34b1bb1b-3d48-4177-80d9-62596eacee6b" (UID: "34b1bb1b-3d48-4177-80d9-62596eacee6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.042754 4971 scope.go:117] "RemoveContainer" containerID="b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd" Mar 20 08:53:54 crc kubenswrapper[4971]: E0320 08:53:54.047723 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd\": container with ID starting with b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd not found: ID does not exist" containerID="b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.047758 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd"} err="failed to get container status \"b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd\": rpc error: code = NotFound desc = could not find container \"b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd\": container with ID starting with b9b301e52c53c6f8e5ddfdc11458c7656298eb5ebe8338b8ce2cf4b72d58e0cd not found: ID does not exist" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.047778 4971 scope.go:117] "RemoveContainer" containerID="96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7" Mar 20 08:53:54 crc kubenswrapper[4971]: E0320 08:53:54.052695 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7\": container with ID starting with 96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7 not found: ID does not exist" containerID="96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.052723 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7"} err="failed to get container status \"96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7\": rpc error: code = NotFound desc = could not find container \"96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7\": container with ID starting with 96c3b627129ecc249303d163c2067cb98d2ade6f55ab368897c1cbcff7aa35d7 not found: ID does not exist" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.057730 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data" (OuterVolumeSpecName: "config-data") pod "34b1bb1b-3d48-4177-80d9-62596eacee6b" (UID: "34b1bb1b-3d48-4177-80d9-62596eacee6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.071287 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.071315 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b1bb1b-3d48-4177-80d9-62596eacee6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.194670 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.202803 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.208648 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:54 crc kubenswrapper[4971]: E0320 08:53:54.209049 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerName="cinder-api-log" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.209067 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerName="cinder-api-log" Mar 20 08:53:54 crc kubenswrapper[4971]: E0320 08:53:54.209087 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerName="cinder-api" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.209095 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerName="cinder-api" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.209264 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerName="cinder-api" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.209293 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" containerName="cinder-api-log" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.210231 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.215058 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.245575 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.277199 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.277252 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.277288 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-config-data\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.277351 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-config-data-custom\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.277373 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbj8t\" (UniqueName: \"kubernetes.io/projected/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-kube-api-access-jbj8t\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.277403 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-logs\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.277447 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-scripts\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.378809 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.378855 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-config-data\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.378907 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.378917 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-config-data-custom\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.378956 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbj8t\" (UniqueName: \"kubernetes.io/projected/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-kube-api-access-jbj8t\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.378988 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-logs\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.379036 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-scripts\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.379061 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.379973 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-logs\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.384568 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-config-data-custom\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.387124 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-scripts\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.387456 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-config-data\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.387594 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.397197 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbj8t\" (UniqueName: \"kubernetes.io/projected/fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc-kube-api-access-jbj8t\") pod \"cinder-api-0\" (UID: \"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc\") " pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.572336 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:53:54 crc kubenswrapper[4971]: I0320 08:53:54.752157 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b1bb1b-3d48-4177-80d9-62596eacee6b" path="/var/lib/kubelet/pods/34b1bb1b-3d48-4177-80d9-62596eacee6b/volumes" Mar 20 08:53:55 crc kubenswrapper[4971]: I0320 08:53:55.071229 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:53:55 crc kubenswrapper[4971]: W0320 08:53:55.088850 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfafe0d4a_a07e_4d27_a7d3_c951a2ef82bc.slice/crio-1ac9a2a0d65544f80ae77b02991311928451df3bce336753d9a13633aad22900 WatchSource:0}: Error finding container 1ac9a2a0d65544f80ae77b02991311928451df3bce336753d9a13633aad22900: Status 404 returned error can't find the container with id 1ac9a2a0d65544f80ae77b02991311928451df3bce336753d9a13633aad22900 Mar 20 08:53:55 crc kubenswrapper[4971]: I0320 08:53:55.846796 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc","Type":"ContainerStarted","Data":"f110881371fad5f6f9e4bec296d2c643ef804ddff2147b6fff7a63984200da87"} Mar 20 08:53:55 crc kubenswrapper[4971]: I0320 08:53:55.847338 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc","Type":"ContainerStarted","Data":"1ac9a2a0d65544f80ae77b02991311928451df3bce336753d9a13633aad22900"} Mar 20 08:53:55 crc kubenswrapper[4971]: I0320 08:53:55.990036 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 20 08:53:56 crc kubenswrapper[4971]: I0320 08:53:56.792422 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 20 
08:53:56 crc kubenswrapper[4971]: I0320 08:53:56.857488 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc","Type":"ContainerStarted","Data":"749f5e147af7d4060d9be4c7cc85f3f42e13e990104cd388e3e0e509fe77e49c"} Mar 20 08:53:56 crc kubenswrapper[4971]: I0320 08:53:56.858072 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 08:53:56 crc kubenswrapper[4971]: I0320 08:53:56.881931 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.881910925 podStartE2EDuration="2.881910925s" podCreationTimestamp="2026-03-20 08:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:53:56.876647407 +0000 UTC m=+7458.856521575" watchObservedRunningTime="2026-03-20 08:53:56.881910925 +0000 UTC m=+7458.861785083" Mar 20 08:53:59 crc kubenswrapper[4971]: I0320 08:53:59.062217 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 08:53:59 crc kubenswrapper[4971]: I0320 08:53:59.141404 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:53:59 crc kubenswrapper[4971]: I0320 08:53:59.902843 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerName="cinder-scheduler" containerID="cri-o://7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5" gracePeriod=30 Mar 20 08:53:59 crc kubenswrapper[4971]: I0320 08:53:59.902908 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerName="probe" 
containerID="cri-o://97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e" gracePeriod=30 Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.152225 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566614-vkmv9"] Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.155052 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566614-vkmv9" Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.158952 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.159539 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.159947 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.168979 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566614-vkmv9"] Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.215912 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vswh7\" (UniqueName: \"kubernetes.io/projected/29aba505-fe6d-41ed-a14b-76ced70e80f8-kube-api-access-vswh7\") pod \"auto-csr-approver-29566614-vkmv9\" (UID: \"29aba505-fe6d-41ed-a14b-76ced70e80f8\") " pod="openshift-infra/auto-csr-approver-29566614-vkmv9" Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.317393 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vswh7\" (UniqueName: \"kubernetes.io/projected/29aba505-fe6d-41ed-a14b-76ced70e80f8-kube-api-access-vswh7\") pod \"auto-csr-approver-29566614-vkmv9\" (UID: \"29aba505-fe6d-41ed-a14b-76ced70e80f8\") " 
pod="openshift-infra/auto-csr-approver-29566614-vkmv9" Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.345047 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vswh7\" (UniqueName: \"kubernetes.io/projected/29aba505-fe6d-41ed-a14b-76ced70e80f8-kube-api-access-vswh7\") pod \"auto-csr-approver-29566614-vkmv9\" (UID: \"29aba505-fe6d-41ed-a14b-76ced70e80f8\") " pod="openshift-infra/auto-csr-approver-29566614-vkmv9" Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.484147 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566614-vkmv9" Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.913570 4971 generic.go:334] "Generic (PLEG): container finished" podID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerID="97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e" exitCode=0 Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.913698 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f731eb88-b58b-4134-a9ec-6bd0d3be30cd","Type":"ContainerDied","Data":"97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e"} Mar 20 08:54:00 crc kubenswrapper[4971]: I0320 08:54:00.961746 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566614-vkmv9"] Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.220705 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.519254 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.544095 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-combined-ca-bundle\") pod \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.544150 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data\") pod \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.544192 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data-custom\") pod \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.544273 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-etc-machine-id\") pod \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.544293 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qnjr\" (UniqueName: \"kubernetes.io/projected/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-kube-api-access-5qnjr\") pod \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.544328 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-scripts\") pod \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\" (UID: \"f731eb88-b58b-4134-a9ec-6bd0d3be30cd\") " Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.544393 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f731eb88-b58b-4134-a9ec-6bd0d3be30cd" (UID: "f731eb88-b58b-4134-a9ec-6bd0d3be30cd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.545464 4971 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.554484 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-scripts" (OuterVolumeSpecName: "scripts") pod "f731eb88-b58b-4134-a9ec-6bd0d3be30cd" (UID: "f731eb88-b58b-4134-a9ec-6bd0d3be30cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.554510 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f731eb88-b58b-4134-a9ec-6bd0d3be30cd" (UID: "f731eb88-b58b-4134-a9ec-6bd0d3be30cd"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.554621 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-kube-api-access-5qnjr" (OuterVolumeSpecName: "kube-api-access-5qnjr") pod "f731eb88-b58b-4134-a9ec-6bd0d3be30cd" (UID: "f731eb88-b58b-4134-a9ec-6bd0d3be30cd"). InnerVolumeSpecName "kube-api-access-5qnjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.602673 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f731eb88-b58b-4134-a9ec-6bd0d3be30cd" (UID: "f731eb88-b58b-4134-a9ec-6bd0d3be30cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.646355 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qnjr\" (UniqueName: \"kubernetes.io/projected/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-kube-api-access-5qnjr\") on node \"crc\" DevicePath \"\"" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.646381 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.646390 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.646399 4971 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.668101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data" (OuterVolumeSpecName: "config-data") pod "f731eb88-b58b-4134-a9ec-6bd0d3be30cd" (UID: "f731eb88-b58b-4134-a9ec-6bd0d3be30cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.748053 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f731eb88-b58b-4134-a9ec-6bd0d3be30cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.929641 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566614-vkmv9" event={"ID":"29aba505-fe6d-41ed-a14b-76ced70e80f8","Type":"ContainerStarted","Data":"87e34b7e1bec934d8076d1ed0a08a19e3e16810e893e067bded8b292a10213b7"} Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.933354 4971 generic.go:334] "Generic (PLEG): container finished" podID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerID="7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5" exitCode=0 Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.933385 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f731eb88-b58b-4134-a9ec-6bd0d3be30cd","Type":"ContainerDied","Data":"7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5"} Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.933404 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f731eb88-b58b-4134-a9ec-6bd0d3be30cd","Type":"ContainerDied","Data":"2b824cc886055453afddd9136bbd7d631c326f60b5208749e2abfa3a03d9a3e3"} Mar 20 
08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.933419 4971 scope.go:117] "RemoveContainer" containerID="97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.933539 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.983963 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.985450 4971 scope.go:117] "RemoveContainer" containerID="7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5" Mar 20 08:54:01 crc kubenswrapper[4971]: I0320 08:54:01.994705 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.007348 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:54:02 crc kubenswrapper[4971]: E0320 08:54:02.007864 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerName="cinder-scheduler" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.007888 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerName="cinder-scheduler" Mar 20 08:54:02 crc kubenswrapper[4971]: E0320 08:54:02.007912 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerName="probe" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.007920 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerName="probe" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.008132 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerName="cinder-scheduler" Mar 20 08:54:02 crc 
kubenswrapper[4971]: I0320 08:54:02.008152 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" containerName="probe" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.015374 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.020297 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.043182 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.043307 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.106807 4971 scope.go:117] "RemoveContainer" containerID="97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e" Mar 20 08:54:02 crc kubenswrapper[4971]: E0320 08:54:02.118789 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e\": container with ID starting with 97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e not found: ID does not exist" containerID="97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.118872 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e"} err="failed to get container status \"97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e\": rpc error: code = NotFound desc = could not find container \"97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e\": container with ID starting with 
97ad1b820aa5b0a504d7c408ca5eef25a7e66a24093df657fc1927ab2630540e not found: ID does not exist" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.118900 4971 scope.go:117] "RemoveContainer" containerID="7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5" Mar 20 08:54:02 crc kubenswrapper[4971]: E0320 08:54:02.131781 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5\": container with ID starting with 7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5 not found: ID does not exist" containerID="7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.131846 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5"} err="failed to get container status \"7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5\": rpc error: code = NotFound desc = could not find container \"7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5\": container with ID starting with 7cb02e3ebc468d7c0e2d25bad9290bf3599e217074384ccdc05cbd26e177aab5 not found: ID does not exist" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.180738 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.180828 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9499\" (UniqueName: 
\"kubernetes.io/projected/40678394-304b-4d4d-8d8a-4f399a896a2a-kube-api-access-m9499\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.180902 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.180926 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40678394-304b-4d4d-8d8a-4f399a896a2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.180965 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.181014 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.283728 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.283863 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40678394-304b-4d4d-8d8a-4f399a896a2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.283912 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.283969 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.284045 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.284088 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9499\" (UniqueName: \"kubernetes.io/projected/40678394-304b-4d4d-8d8a-4f399a896a2a-kube-api-access-m9499\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc 
kubenswrapper[4971]: I0320 08:54:02.285205 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40678394-304b-4d4d-8d8a-4f399a896a2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.288417 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.289371 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.293675 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.294224 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40678394-304b-4d4d-8d8a-4f399a896a2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.306979 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9499\" (UniqueName: 
\"kubernetes.io/projected/40678394-304b-4d4d-8d8a-4f399a896a2a-kube-api-access-m9499\") pod \"cinder-scheduler-0\" (UID: \"40678394-304b-4d4d-8d8a-4f399a896a2a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.349566 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.744172 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f731eb88-b58b-4134-a9ec-6bd0d3be30cd" path="/var/lib/kubelet/pods/f731eb88-b58b-4134-a9ec-6bd0d3be30cd/volumes" Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.810005 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.952875 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"40678394-304b-4d4d-8d8a-4f399a896a2a","Type":"ContainerStarted","Data":"655248b87f92cb7d0d8bf1a1787332fda5cb59a87399c230574e37a71f2e9c94"} Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.955890 4971 generic.go:334] "Generic (PLEG): container finished" podID="29aba505-fe6d-41ed-a14b-76ced70e80f8" containerID="22ced0c9a1dad73f05d4a4b4683533579ec93a52ca7526e1d544d63a469a39ea" exitCode=0 Mar 20 08:54:02 crc kubenswrapper[4971]: I0320 08:54:02.955986 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566614-vkmv9" event={"ID":"29aba505-fe6d-41ed-a14b-76ced70e80f8","Type":"ContainerDied","Data":"22ced0c9a1dad73f05d4a4b4683533579ec93a52ca7526e1d544d63a469a39ea"} Mar 20 08:54:03 crc kubenswrapper[4971]: I0320 08:54:03.971763 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"40678394-304b-4d4d-8d8a-4f399a896a2a","Type":"ContainerStarted","Data":"70974bd3350ed00155adf48790c238979eb64893ccbe1e4e1d2cc742a6bf8afc"} Mar 20 08:54:03 crc 
kubenswrapper[4971]: I0320 08:54:03.972092 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"40678394-304b-4d4d-8d8a-4f399a896a2a","Type":"ContainerStarted","Data":"1b6d7f67da053138841852797b7b3d5628ed7af758e46422cf7995fb65032dba"} Mar 20 08:54:03 crc kubenswrapper[4971]: I0320 08:54:03.997770 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.997752535 podStartE2EDuration="2.997752535s" podCreationTimestamp="2026-03-20 08:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:54:03.997009325 +0000 UTC m=+7465.976883473" watchObservedRunningTime="2026-03-20 08:54:03.997752535 +0000 UTC m=+7465.977626673" Mar 20 08:54:04 crc kubenswrapper[4971]: I0320 08:54:04.358597 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566614-vkmv9" Mar 20 08:54:04 crc kubenswrapper[4971]: I0320 08:54:04.528082 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vswh7\" (UniqueName: \"kubernetes.io/projected/29aba505-fe6d-41ed-a14b-76ced70e80f8-kube-api-access-vswh7\") pod \"29aba505-fe6d-41ed-a14b-76ced70e80f8\" (UID: \"29aba505-fe6d-41ed-a14b-76ced70e80f8\") " Mar 20 08:54:04 crc kubenswrapper[4971]: I0320 08:54:04.541980 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29aba505-fe6d-41ed-a14b-76ced70e80f8-kube-api-access-vswh7" (OuterVolumeSpecName: "kube-api-access-vswh7") pod "29aba505-fe6d-41ed-a14b-76ced70e80f8" (UID: "29aba505-fe6d-41ed-a14b-76ced70e80f8"). InnerVolumeSpecName "kube-api-access-vswh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:54:04 crc kubenswrapper[4971]: I0320 08:54:04.630084 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vswh7\" (UniqueName: \"kubernetes.io/projected/29aba505-fe6d-41ed-a14b-76ced70e80f8-kube-api-access-vswh7\") on node \"crc\" DevicePath \"\"" Mar 20 08:54:04 crc kubenswrapper[4971]: I0320 08:54:04.985936 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566614-vkmv9" Mar 20 08:54:04 crc kubenswrapper[4971]: I0320 08:54:04.985950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566614-vkmv9" event={"ID":"29aba505-fe6d-41ed-a14b-76ced70e80f8","Type":"ContainerDied","Data":"87e34b7e1bec934d8076d1ed0a08a19e3e16810e893e067bded8b292a10213b7"} Mar 20 08:54:04 crc kubenswrapper[4971]: I0320 08:54:04.986944 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87e34b7e1bec934d8076d1ed0a08a19e3e16810e893e067bded8b292a10213b7" Mar 20 08:54:05 crc kubenswrapper[4971]: I0320 08:54:05.434984 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566608-6tg57"] Mar 20 08:54:05 crc kubenswrapper[4971]: I0320 08:54:05.443953 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566608-6tg57"] Mar 20 08:54:06 crc kubenswrapper[4971]: I0320 08:54:06.334912 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 08:54:06 crc kubenswrapper[4971]: I0320 08:54:06.745164 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a8f89b1-e91b-4e29-bb03-25bd13d72ff8" path="/var/lib/kubelet/pods/4a8f89b1-e91b-4e29-bb03-25bd13d72ff8/volumes" Mar 20 08:54:07 crc kubenswrapper[4971]: I0320 08:54:07.350453 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-scheduler-0" Mar 20 08:54:09 crc kubenswrapper[4971]: I0320 08:54:09.596983 4971 scope.go:117] "RemoveContainer" containerID="c9db52a07f98b0ceae1d45171bb74dc1c130ed1ba9bfe33947b5b965e47d6fd2" Mar 20 08:54:12 crc kubenswrapper[4971]: I0320 08:54:12.592310 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 08:54:20 crc kubenswrapper[4971]: I0320 08:54:20.162551 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:54:20 crc kubenswrapper[4971]: I0320 08:54:20.163093 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:54:50 crc kubenswrapper[4971]: I0320 08:54:50.163014 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:54:50 crc kubenswrapper[4971]: I0320 08:54:50.163782 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:55:05 crc kubenswrapper[4971]: I0320 08:55:05.042854 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-67e9-account-create-update-pdh5p"] Mar 20 08:55:05 crc kubenswrapper[4971]: I0320 08:55:05.053268 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bx4dr"] Mar 20 08:55:05 crc kubenswrapper[4971]: I0320 08:55:05.061199 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bx4dr"] Mar 20 08:55:05 crc kubenswrapper[4971]: I0320 08:55:05.068817 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-67e9-account-create-update-pdh5p"] Mar 20 08:55:06 crc kubenswrapper[4971]: I0320 08:55:06.755936 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a2c9c34-5d1e-4f21-8c05-860986eba258" path="/var/lib/kubelet/pods/2a2c9c34-5d1e-4f21-8c05-860986eba258/volumes" Mar 20 08:55:06 crc kubenswrapper[4971]: I0320 08:55:06.757440 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e1c936-56dd-4723-8473-f8231f4e8197" path="/var/lib/kubelet/pods/c1e1c936-56dd-4723-8473-f8231f4e8197/volumes" Mar 20 08:55:09 crc kubenswrapper[4971]: I0320 08:55:09.909911 4971 scope.go:117] "RemoveContainer" containerID="dc313ff17c9a4367cd3556d879562db7489e7c5c358c71204de32720086e5b25" Mar 20 08:55:09 crc kubenswrapper[4971]: I0320 08:55:09.951069 4971 scope.go:117] "RemoveContainer" containerID="ef03885fd1d4938fa069ac2a963ba0fccac3cc24217902a1e698016ae0993951" Mar 20 08:55:17 crc kubenswrapper[4971]: I0320 08:55:17.065494 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-v6d2f"] Mar 20 08:55:17 crc kubenswrapper[4971]: I0320 08:55:17.083750 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-v6d2f"] Mar 20 08:55:18 crc kubenswrapper[4971]: I0320 08:55:18.751749 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9e8117-f465-491d-8e98-0f9d1509bb8e" path="/var/lib/kubelet/pods/7d9e8117-f465-491d-8e98-0f9d1509bb8e/volumes" Mar 20 08:55:20 
crc kubenswrapper[4971]: I0320 08:55:20.162963 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:55:20 crc kubenswrapper[4971]: I0320 08:55:20.163463 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:55:20 crc kubenswrapper[4971]: I0320 08:55:20.163545 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:55:20 crc kubenswrapper[4971]: I0320 08:55:20.164758 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3a14bb739a0b2e954975f7532ea81560227f071f1b801e07dff179ccac02bd7"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:55:20 crc kubenswrapper[4971]: I0320 08:55:20.164886 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://d3a14bb739a0b2e954975f7532ea81560227f071f1b801e07dff179ccac02bd7" gracePeriod=600 Mar 20 08:55:20 crc kubenswrapper[4971]: I0320 08:55:20.822914 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" 
containerID="d3a14bb739a0b2e954975f7532ea81560227f071f1b801e07dff179ccac02bd7" exitCode=0 Mar 20 08:55:20 crc kubenswrapper[4971]: I0320 08:55:20.822975 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"d3a14bb739a0b2e954975f7532ea81560227f071f1b801e07dff179ccac02bd7"} Mar 20 08:55:20 crc kubenswrapper[4971]: I0320 08:55:20.823294 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395"} Mar 20 08:55:20 crc kubenswrapper[4971]: I0320 08:55:20.823316 4971 scope.go:117] "RemoveContainer" containerID="af8dd483a43b5c96df916fd811fc94a881785f568b623794921e968459fc26b4" Mar 20 08:55:31 crc kubenswrapper[4971]: I0320 08:55:31.047717 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7xt4j"] Mar 20 08:55:31 crc kubenswrapper[4971]: I0320 08:55:31.059373 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7xt4j"] Mar 20 08:55:32 crc kubenswrapper[4971]: I0320 08:55:32.743352 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1322bd61-d156-4142-9a0c-023e76e87cd0" path="/var/lib/kubelet/pods/1322bd61-d156-4142-9a0c-023e76e87cd0/volumes" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.844210 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57c77c7dc5-pcf25"] Mar 20 08:55:44 crc kubenswrapper[4971]: E0320 08:55:44.848444 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29aba505-fe6d-41ed-a14b-76ced70e80f8" containerName="oc" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.848478 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="29aba505-fe6d-41ed-a14b-76ced70e80f8" containerName="oc" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.848717 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="29aba505-fe6d-41ed-a14b-76ced70e80f8" containerName="oc" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.850240 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.858057 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.858320 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.864893 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57c77c7dc5-pcf25"] Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.865422 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-z2s5f" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.868218 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.937545 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.938016 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerName="glance-log" containerID="cri-o://13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b" gracePeriod=30 Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.938425 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerName="glance-httpd" containerID="cri-o://4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709" gracePeriod=30 Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.967927 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-789f468b55-wcnxp"] Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.969902 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-config-data\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.969955 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a7e1d91-b96d-443b-a5c2-30535a7761cd-horizon-secret-key\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.970092 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtdmk\" (UniqueName: \"kubernetes.io/projected/1a7e1d91-b96d-443b-a5c2-30535a7761cd-kube-api-access-vtdmk\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.970161 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-scripts\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.970233 
4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a7e1d91-b96d-443b-a5c2-30535a7761cd-logs\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.971433 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:44 crc kubenswrapper[4971]: I0320 08:55:44.986599 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-789f468b55-wcnxp"] Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.007699 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.008041 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="df4aae77-3962-438d-a455-e6f017a477c5" containerName="glance-log" containerID="cri-o://a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185" gracePeriod=30 Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.008553 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="df4aae77-3962-438d-a455-e6f017a477c5" containerName="glance-httpd" containerID="cri-o://4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1" gracePeriod=30 Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077001 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-scripts\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077413 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtdmk\" (UniqueName: \"kubernetes.io/projected/1a7e1d91-b96d-443b-a5c2-30535a7761cd-kube-api-access-vtdmk\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077445 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-429m6\" (UniqueName: \"kubernetes.io/projected/639cda12-bee9-4d40-8613-24156ac5b3ec-kube-api-access-429m6\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077512 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-scripts\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077588 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/639cda12-bee9-4d40-8613-24156ac5b3ec-logs\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077634 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a7e1d91-b96d-443b-a5c2-30535a7761cd-logs\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077723 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/639cda12-bee9-4d40-8613-24156ac5b3ec-horizon-secret-key\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077877 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-config-data\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.077944 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-config-data\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.078060 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a7e1d91-b96d-443b-a5c2-30535a7761cd-horizon-secret-key\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.078261 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-scripts\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.079501 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1a7e1d91-b96d-443b-a5c2-30535a7761cd-logs\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.080678 4971 generic.go:334] "Generic (PLEG): container finished" podID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerID="13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b" exitCode=143 Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.080716 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24488a4f-a46e-4ca8-86d4-5b23dd0baea2","Type":"ContainerDied","Data":"13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b"} Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.081915 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-config-data\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.087299 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a7e1d91-b96d-443b-a5c2-30535a7761cd-horizon-secret-key\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.103265 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtdmk\" (UniqueName: \"kubernetes.io/projected/1a7e1d91-b96d-443b-a5c2-30535a7761cd-kube-api-access-vtdmk\") pod \"horizon-57c77c7dc5-pcf25\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.179724 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/639cda12-bee9-4d40-8613-24156ac5b3ec-horizon-secret-key\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.179789 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-config-data\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.179861 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-scripts\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.179944 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-429m6\" (UniqueName: \"kubernetes.io/projected/639cda12-bee9-4d40-8613-24156ac5b3ec-kube-api-access-429m6\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.180040 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/639cda12-bee9-4d40-8613-24156ac5b3ec-logs\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.180689 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-scripts\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.180696 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/639cda12-bee9-4d40-8613-24156ac5b3ec-logs\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.181096 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-config-data\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.182320 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/639cda12-bee9-4d40-8613-24156ac5b3ec-horizon-secret-key\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.186777 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.198066 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-429m6\" (UniqueName: \"kubernetes.io/projected/639cda12-bee9-4d40-8613-24156ac5b3ec-kube-api-access-429m6\") pod \"horizon-789f468b55-wcnxp\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.295461 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.554475 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57c77c7dc5-pcf25"] Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.590691 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-859774b9d9-bgg24"] Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.592697 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.600295 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-859774b9d9-bgg24"] Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.687928 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-horizon-secret-key\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.687987 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-scripts\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.688028 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-config-data\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.688095 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j4ql\" (UniqueName: \"kubernetes.io/projected/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-kube-api-access-4j4ql\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.688116 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-logs\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.698411 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57c77c7dc5-pcf25"] Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.786470 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-789f468b55-wcnxp"] Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.789922 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j4ql\" (UniqueName: \"kubernetes.io/projected/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-kube-api-access-4j4ql\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.789971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-logs\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.790069 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-horizon-secret-key\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.790102 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-scripts\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.790133 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-config-data\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.791172 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-logs\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.791375 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-scripts\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.791887 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-config-data\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " 
pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.799272 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-horizon-secret-key\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.810384 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j4ql\" (UniqueName: \"kubernetes.io/projected/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-kube-api-access-4j4ql\") pod \"horizon-859774b9d9-bgg24\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:45 crc kubenswrapper[4971]: I0320 08:55:45.915791 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:46 crc kubenswrapper[4971]: I0320 08:55:46.095913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789f468b55-wcnxp" event={"ID":"639cda12-bee9-4d40-8613-24156ac5b3ec","Type":"ContainerStarted","Data":"4421f8719515d2d80844530bbf1e70fdacf5c45c8f3c647eb92a4d549983f8df"} Mar 20 08:55:46 crc kubenswrapper[4971]: I0320 08:55:46.099266 4971 generic.go:334] "Generic (PLEG): container finished" podID="df4aae77-3962-438d-a455-e6f017a477c5" containerID="a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185" exitCode=143 Mar 20 08:55:46 crc kubenswrapper[4971]: I0320 08:55:46.099382 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df4aae77-3962-438d-a455-e6f017a477c5","Type":"ContainerDied","Data":"a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185"} Mar 20 08:55:46 crc kubenswrapper[4971]: I0320 08:55:46.101434 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-57c77c7dc5-pcf25" event={"ID":"1a7e1d91-b96d-443b-a5c2-30535a7761cd","Type":"ContainerStarted","Data":"6bd73cac9769578d34c5f62dabcc0354e58f2a854b2a82e5954577676633b4aa"} Mar 20 08:55:46 crc kubenswrapper[4971]: I0320 08:55:46.386061 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-859774b9d9-bgg24"] Mar 20 08:55:46 crc kubenswrapper[4971]: I0320 08:55:46.411728 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:55:47 crc kubenswrapper[4971]: I0320 08:55:47.113301 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859774b9d9-bgg24" event={"ID":"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7","Type":"ContainerStarted","Data":"3248c7877087fae9e1227abc847a6425343c152724f9d8b007bbc2191418fea0"} Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.649513 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.715043 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.752966 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-config-data\") pod \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.753089 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-combined-ca-bundle\") pod \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.753166 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-scripts\") pod \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.753215 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-logs\") pod \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.753271 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9m5v\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-kube-api-access-q9m5v\") pod \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.753297 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-httpd-run\") pod \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.753327 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-ceph\") pod \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\" (UID: \"24488a4f-a46e-4ca8-86d4-5b23dd0baea2\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.758353 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "24488a4f-a46e-4ca8-86d4-5b23dd0baea2" (UID: "24488a4f-a46e-4ca8-86d4-5b23dd0baea2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.758712 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-logs" (OuterVolumeSpecName: "logs") pod "24488a4f-a46e-4ca8-86d4-5b23dd0baea2" (UID: "24488a4f-a46e-4ca8-86d4-5b23dd0baea2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.759013 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-ceph" (OuterVolumeSpecName: "ceph") pod "24488a4f-a46e-4ca8-86d4-5b23dd0baea2" (UID: "24488a4f-a46e-4ca8-86d4-5b23dd0baea2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.759291 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-scripts" (OuterVolumeSpecName: "scripts") pod "24488a4f-a46e-4ca8-86d4-5b23dd0baea2" (UID: "24488a4f-a46e-4ca8-86d4-5b23dd0baea2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.760028 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-kube-api-access-q9m5v" (OuterVolumeSpecName: "kube-api-access-q9m5v") pod "24488a4f-a46e-4ca8-86d4-5b23dd0baea2" (UID: "24488a4f-a46e-4ca8-86d4-5b23dd0baea2"). InnerVolumeSpecName "kube-api-access-q9m5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.805239 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24488a4f-a46e-4ca8-86d4-5b23dd0baea2" (UID: "24488a4f-a46e-4ca8-86d4-5b23dd0baea2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.832178 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-config-data" (OuterVolumeSpecName: "config-data") pod "24488a4f-a46e-4ca8-86d4-5b23dd0baea2" (UID: "24488a4f-a46e-4ca8-86d4-5b23dd0baea2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.855217 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-logs\") pod \"df4aae77-3962-438d-a455-e6f017a477c5\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.855383 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-scripts\") pod \"df4aae77-3962-438d-a455-e6f017a477c5\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.855436 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-config-data\") pod \"df4aae77-3962-438d-a455-e6f017a477c5\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.855530 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-httpd-run\") pod \"df4aae77-3962-438d-a455-e6f017a477c5\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.855559 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-combined-ca-bundle\") pod \"df4aae77-3962-438d-a455-e6f017a477c5\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.855748 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-ceph\") pod \"df4aae77-3962-438d-a455-e6f017a477c5\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.855767 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xhhf\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-kube-api-access-5xhhf\") pod \"df4aae77-3962-438d-a455-e6f017a477c5\" (UID: \"df4aae77-3962-438d-a455-e6f017a477c5\") " Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.856283 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.856297 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.856306 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.856314 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.856323 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9m5v\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-kube-api-access-q9m5v\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.856333 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.856357 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/24488a4f-a46e-4ca8-86d4-5b23dd0baea2-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.857568 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-logs" (OuterVolumeSpecName: "logs") pod "df4aae77-3962-438d-a455-e6f017a477c5" (UID: "df4aae77-3962-438d-a455-e6f017a477c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.858257 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df4aae77-3962-438d-a455-e6f017a477c5" (UID: "df4aae77-3962-438d-a455-e6f017a477c5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.864136 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-scripts" (OuterVolumeSpecName: "scripts") pod "df4aae77-3962-438d-a455-e6f017a477c5" (UID: "df4aae77-3962-438d-a455-e6f017a477c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.864518 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-ceph" (OuterVolumeSpecName: "ceph") pod "df4aae77-3962-438d-a455-e6f017a477c5" (UID: "df4aae77-3962-438d-a455-e6f017a477c5"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.864905 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-kube-api-access-5xhhf" (OuterVolumeSpecName: "kube-api-access-5xhhf") pod "df4aae77-3962-438d-a455-e6f017a477c5" (UID: "df4aae77-3962-438d-a455-e6f017a477c5"). InnerVolumeSpecName "kube-api-access-5xhhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.883553 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df4aae77-3962-438d-a455-e6f017a477c5" (UID: "df4aae77-3962-438d-a455-e6f017a477c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.958108 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.958144 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.958159 4971 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df4aae77-3962-438d-a455-e6f017a477c5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.958171 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.958184 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.958197 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xhhf\" (UniqueName: \"kubernetes.io/projected/df4aae77-3962-438d-a455-e6f017a477c5-kube-api-access-5xhhf\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:48 crc kubenswrapper[4971]: I0320 08:55:48.973645 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-config-data" (OuterVolumeSpecName: "config-data") pod "df4aae77-3962-438d-a455-e6f017a477c5" (UID: "df4aae77-3962-438d-a455-e6f017a477c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.059488 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4aae77-3962-438d-a455-e6f017a477c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.142231 4971 generic.go:334] "Generic (PLEG): container finished" podID="df4aae77-3962-438d-a455-e6f017a477c5" containerID="4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1" exitCode=0 Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.142308 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df4aae77-3962-438d-a455-e6f017a477c5","Type":"ContainerDied","Data":"4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1"} Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.142343 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"df4aae77-3962-438d-a455-e6f017a477c5","Type":"ContainerDied","Data":"ed8f8922904979e8de2fc583d412f2259155802961a128c952a0b46917d063dc"} Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.142363 4971 scope.go:117] "RemoveContainer" containerID="4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.142543 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.155414 4971 generic.go:334] "Generic (PLEG): container finished" podID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerID="4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709" exitCode=0 Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.155452 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24488a4f-a46e-4ca8-86d4-5b23dd0baea2","Type":"ContainerDied","Data":"4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709"} Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.155477 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24488a4f-a46e-4ca8-86d4-5b23dd0baea2","Type":"ContainerDied","Data":"01a9b85a328d239b715c61567cb8970555e667d48e1c35a1b0c270ceabfd099d"} Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.155524 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.271088 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.281876 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.304548 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.304862 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.319662 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:55:49 crc kubenswrapper[4971]: E0320 08:55:49.320082 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4aae77-3962-438d-a455-e6f017a477c5" containerName="glance-log" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.320100 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4aae77-3962-438d-a455-e6f017a477c5" containerName="glance-log" Mar 20 08:55:49 crc kubenswrapper[4971]: E0320 08:55:49.320121 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4aae77-3962-438d-a455-e6f017a477c5" containerName="glance-httpd" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.320128 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4aae77-3962-438d-a455-e6f017a477c5" containerName="glance-httpd" Mar 20 08:55:49 crc kubenswrapper[4971]: E0320 08:55:49.320144 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerName="glance-httpd" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.320151 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerName="glance-httpd" Mar 20 08:55:49 crc kubenswrapper[4971]: E0320 08:55:49.320162 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerName="glance-log" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.320168 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerName="glance-log" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.320319 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerName="glance-httpd" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.320334 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4aae77-3962-438d-a455-e6f017a477c5" containerName="glance-log" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.320348 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" containerName="glance-log" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.320359 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4aae77-3962-438d-a455-e6f017a477c5" containerName="glance-httpd" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.321663 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.321760 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.323343 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wklvs" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.323829 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.324279 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.330948 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.332510 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.337144 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.354219 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.469105 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks5kj\" (UniqueName: \"kubernetes.io/projected/051896ec-c7dc-48da-acab-6b65bf92d80e-kube-api-access-ks5kj\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.469418 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.469512 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.469657 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/051896ec-c7dc-48da-acab-6b65bf92d80e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.469770 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.469865 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-config-data\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.469999 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.470096 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7dbv\" (UniqueName: \"kubernetes.io/projected/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-kube-api-access-j7dbv\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.470193 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.470285 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.470374 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/051896ec-c7dc-48da-acab-6b65bf92d80e-logs\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.470464 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.470551 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/051896ec-c7dc-48da-acab-6b65bf92d80e-ceph\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.470796 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.572784 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.573080 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.573200 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/051896ec-c7dc-48da-acab-6b65bf92d80e-logs\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " 
pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.573325 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.573473 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/051896ec-c7dc-48da-acab-6b65bf92d80e-ceph\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.573568 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.573975 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.574504 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks5kj\" (UniqueName: \"kubernetes.io/projected/051896ec-c7dc-48da-acab-6b65bf92d80e-kube-api-access-ks5kj\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 
08:55:49.574683 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-scripts\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.574790 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.574929 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/051896ec-c7dc-48da-acab-6b65bf92d80e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.575039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.575146 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-config-data\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.574430 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.575291 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.575457 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7dbv\" (UniqueName: \"kubernetes.io/projected/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-kube-api-access-j7dbv\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.576192 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/051896ec-c7dc-48da-acab-6b65bf92d80e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.574956 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/051896ec-c7dc-48da-acab-6b65bf92d80e-logs\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.577030 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/051896ec-c7dc-48da-acab-6b65bf92d80e-ceph\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.577814 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.580368 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.580797 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.582722 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-config-data\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.585073 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.586375 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/051896ec-c7dc-48da-acab-6b65bf92d80e-scripts\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.587210 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.593634 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks5kj\" (UniqueName: \"kubernetes.io/projected/051896ec-c7dc-48da-acab-6b65bf92d80e-kube-api-access-ks5kj\") pod \"glance-default-external-api-0\" (UID: \"051896ec-c7dc-48da-acab-6b65bf92d80e\") " pod="openstack/glance-default-external-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.601928 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7dbv\" (UniqueName: \"kubernetes.io/projected/fb8fbce1-2aa7-445f-aed3-09344c7e1f30-kube-api-access-j7dbv\") pod \"glance-default-internal-api-0\" (UID: \"fb8fbce1-2aa7-445f-aed3-09344c7e1f30\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.649338 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:55:49 crc kubenswrapper[4971]: I0320 08:55:49.657009 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:55:50 crc kubenswrapper[4971]: I0320 08:55:50.743006 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24488a4f-a46e-4ca8-86d4-5b23dd0baea2" path="/var/lib/kubelet/pods/24488a4f-a46e-4ca8-86d4-5b23dd0baea2/volumes" Mar 20 08:55:50 crc kubenswrapper[4971]: I0320 08:55:50.744127 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4aae77-3962-438d-a455-e6f017a477c5" path="/var/lib/kubelet/pods/df4aae77-3962-438d-a455-e6f017a477c5/volumes" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.224268 4971 scope.go:117] "RemoveContainer" containerID="a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.303684 4971 scope.go:117] "RemoveContainer" containerID="4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1" Mar 20 08:55:54 crc kubenswrapper[4971]: E0320 08:55:54.304195 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1\": container with ID starting with 4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1 not found: ID does not exist" containerID="4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.304228 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1"} err="failed to get container status \"4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1\": rpc error: code = NotFound desc = could not find container \"4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1\": container with ID starting with 4fcdf8ed86ec8c23603232a5a54512655104a51c2cb4d5d309335f1a29c503c1 not found: ID does not exist" Mar 20 
08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.304250 4971 scope.go:117] "RemoveContainer" containerID="a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185" Mar 20 08:55:54 crc kubenswrapper[4971]: E0320 08:55:54.305311 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185\": container with ID starting with a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185 not found: ID does not exist" containerID="a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.305339 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185"} err="failed to get container status \"a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185\": rpc error: code = NotFound desc = could not find container \"a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185\": container with ID starting with a0d9b96c46022c457032f3d730e3d05303c77c5874a2edd8709197377173b185 not found: ID does not exist" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.305354 4971 scope.go:117] "RemoveContainer" containerID="4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.453677 4971 scope.go:117] "RemoveContainer" containerID="13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.546908 4971 scope.go:117] "RemoveContainer" containerID="4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709" Mar 20 08:55:54 crc kubenswrapper[4971]: E0320 08:55:54.571638 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709\": container with ID starting with 4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709 not found: ID does not exist" containerID="4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.571702 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709"} err="failed to get container status \"4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709\": rpc error: code = NotFound desc = could not find container \"4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709\": container with ID starting with 4ace891a7df0f84ad86f358c910694d8b067090dd19974eea57d33f583f52709 not found: ID does not exist" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.571728 4971 scope.go:117] "RemoveContainer" containerID="13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b" Mar 20 08:55:54 crc kubenswrapper[4971]: E0320 08:55:54.573490 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b\": container with ID starting with 13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b not found: ID does not exist" containerID="13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.573766 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b"} err="failed to get container status \"13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b\": rpc error: code = NotFound desc = could not find container \"13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b\": container with ID 
starting with 13545e7041483007f6d6bd25d1ff07ca64b88102cd73f7ec312096aa6bbd746b not found: ID does not exist" Mar 20 08:55:54 crc kubenswrapper[4971]: I0320 08:55:54.958361 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.032805 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:55:55 crc kubenswrapper[4971]: W0320 08:55:55.035091 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb8fbce1_2aa7_445f_aed3_09344c7e1f30.slice/crio-bd9edc2c3676b3751ac4d501ba8c57a69d6a0d5e3f74ee1a8f41f444fdba4601 WatchSource:0}: Error finding container bd9edc2c3676b3751ac4d501ba8c57a69d6a0d5e3f74ee1a8f41f444fdba4601: Status 404 returned error can't find the container with id bd9edc2c3676b3751ac4d501ba8c57a69d6a0d5e3f74ee1a8f41f444fdba4601 Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.220032 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c77c7dc5-pcf25" event={"ID":"1a7e1d91-b96d-443b-a5c2-30535a7761cd","Type":"ContainerStarted","Data":"174d3b9c2897301fe6721b266783bb01ac91ef00a15c8ae6bdf6627259373f44"} Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.220073 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c77c7dc5-pcf25" event={"ID":"1a7e1d91-b96d-443b-a5c2-30535a7761cd","Type":"ContainerStarted","Data":"8e5fcb25dd0c7081778bf484f585e292b1141de016e2ebef3153f2b90c4e30eb"} Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.220193 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57c77c7dc5-pcf25" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerName="horizon-log" containerID="cri-o://8e5fcb25dd0c7081778bf484f585e292b1141de016e2ebef3153f2b90c4e30eb" gracePeriod=30 Mar 20 08:55:55 crc kubenswrapper[4971]: 
I0320 08:55:55.220666 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57c77c7dc5-pcf25" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerName="horizon" containerID="cri-o://174d3b9c2897301fe6721b266783bb01ac91ef00a15c8ae6bdf6627259373f44" gracePeriod=30 Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.229727 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789f468b55-wcnxp" event={"ID":"639cda12-bee9-4d40-8613-24156ac5b3ec","Type":"ContainerStarted","Data":"46aa6971476639c5373ca601c0e27c7ece37087a0f677a02a1bd9e8918a1fe79"} Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.229772 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789f468b55-wcnxp" event={"ID":"639cda12-bee9-4d40-8613-24156ac5b3ec","Type":"ContainerStarted","Data":"473b17ac1d231d2534f5e072f8d895e5b0c2225e62cb03d25e3e0e0c0581be41"} Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.236276 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"051896ec-c7dc-48da-acab-6b65bf92d80e","Type":"ContainerStarted","Data":"ebb3a7c789985ed41e4257992d56e7edcdbc2d7ef5ab76892d648a1ac904cca7"} Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.252902 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859774b9d9-bgg24" event={"ID":"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7","Type":"ContainerStarted","Data":"cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3"} Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.252966 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859774b9d9-bgg24" event={"ID":"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7","Type":"ContainerStarted","Data":"ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71"} Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.254518 4971 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/horizon-57c77c7dc5-pcf25" podStartSLOduration=2.595578274 podStartE2EDuration="11.254493341s" podCreationTimestamp="2026-03-20 08:55:44 +0000 UTC" firstStartedPulling="2026-03-20 08:55:45.693976007 +0000 UTC m=+7567.673850145" lastFinishedPulling="2026-03-20 08:55:54.352891074 +0000 UTC m=+7576.332765212" observedRunningTime="2026-03-20 08:55:55.239923209 +0000 UTC m=+7577.219797347" watchObservedRunningTime="2026-03-20 08:55:55.254493341 +0000 UTC m=+7577.234367499" Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.264482 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8fbce1-2aa7-445f-aed3-09344c7e1f30","Type":"ContainerStarted","Data":"bd9edc2c3676b3751ac4d501ba8c57a69d6a0d5e3f74ee1a8f41f444fdba4601"} Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.266431 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-789f468b55-wcnxp" podStartSLOduration=2.727195129 podStartE2EDuration="11.266411623s" podCreationTimestamp="2026-03-20 08:55:44 +0000 UTC" firstStartedPulling="2026-03-20 08:55:45.799215552 +0000 UTC m=+7567.779089690" lastFinishedPulling="2026-03-20 08:55:54.338432026 +0000 UTC m=+7576.318306184" observedRunningTime="2026-03-20 08:55:55.264708568 +0000 UTC m=+7577.244582706" watchObservedRunningTime="2026-03-20 08:55:55.266411623 +0000 UTC m=+7577.246285751" Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.288666 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-859774b9d9-bgg24" podStartSLOduration=2.308765721 podStartE2EDuration="10.288648535s" podCreationTimestamp="2026-03-20 08:55:45 +0000 UTC" firstStartedPulling="2026-03-20 08:55:46.411419415 +0000 UTC m=+7568.391293563" lastFinishedPulling="2026-03-20 08:55:54.391302239 +0000 UTC m=+7576.371176377" observedRunningTime="2026-03-20 08:55:55.285566564 +0000 UTC m=+7577.265440712" 
watchObservedRunningTime="2026-03-20 08:55:55.288648535 +0000 UTC m=+7577.268522673" Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.296021 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.296085 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.916425 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:55 crc kubenswrapper[4971]: I0320 08:55:55.916789 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:55:56 crc kubenswrapper[4971]: I0320 08:55:56.275913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8fbce1-2aa7-445f-aed3-09344c7e1f30","Type":"ContainerStarted","Data":"5742eeb15c6b10e7a83f12ac9206f7edb7cc4624e41eafc4370b916366280e73"} Mar 20 08:55:56 crc kubenswrapper[4971]: I0320 08:55:56.277234 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8fbce1-2aa7-445f-aed3-09344c7e1f30","Type":"ContainerStarted","Data":"c621031ead95073b2e22fb8422266fa3b2975dc796cf2e36f771fcd88e9a913d"} Mar 20 08:55:56 crc kubenswrapper[4971]: I0320 08:55:56.280900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"051896ec-c7dc-48da-acab-6b65bf92d80e","Type":"ContainerStarted","Data":"049f96748590da4882afad314eb8eaf80a44d97f2ae954a88c0c49f07ab88514"} Mar 20 08:55:56 crc kubenswrapper[4971]: I0320 08:55:56.281017 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"051896ec-c7dc-48da-acab-6b65bf92d80e","Type":"ContainerStarted","Data":"92a373ef72adeedb24045ec1ac8d8b951a3388264fd9c45bd2ff7e06d8d5f2bb"} Mar 20 08:55:56 crc kubenswrapper[4971]: I0320 08:55:56.320720 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.320701966 podStartE2EDuration="7.320701966s" podCreationTimestamp="2026-03-20 08:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:55:56.31855678 +0000 UTC m=+7578.298430928" watchObservedRunningTime="2026-03-20 08:55:56.320701966 +0000 UTC m=+7578.300576104" Mar 20 08:55:56 crc kubenswrapper[4971]: I0320 08:55:56.350252 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.350233839 podStartE2EDuration="7.350233839s" podCreationTimestamp="2026-03-20 08:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:55:56.348098023 +0000 UTC m=+7578.327972171" watchObservedRunningTime="2026-03-20 08:55:56.350233839 +0000 UTC m=+7578.330107977" Mar 20 08:55:59 crc kubenswrapper[4971]: I0320 08:55:59.649804 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 08:55:59 crc kubenswrapper[4971]: I0320 08:55:59.650851 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 08:55:59 crc kubenswrapper[4971]: I0320 08:55:59.658011 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 08:55:59 crc kubenswrapper[4971]: I0320 08:55:59.658076 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Mar 20 08:55:59 crc kubenswrapper[4971]: I0320 08:55:59.684280 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 08:55:59 crc kubenswrapper[4971]: I0320 08:55:59.686716 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 08:55:59 crc kubenswrapper[4971]: I0320 08:55:59.701479 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 08:55:59 crc kubenswrapper[4971]: I0320 08:55:59.702187 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.137192 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566616-f5hbx"] Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.142741 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566616-f5hbx" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.148574 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.148660 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.149018 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.157594 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566616-f5hbx"] Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.240948 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xgbd\" (UniqueName: \"kubernetes.io/projected/7d320012-274b-452a-92d8-a4cc549704cf-kube-api-access-4xgbd\") pod \"auto-csr-approver-29566616-f5hbx\" (UID: \"7d320012-274b-452a-92d8-a4cc549704cf\") " pod="openshift-infra/auto-csr-approver-29566616-f5hbx" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.325930 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.325967 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.325977 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.325992 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.342429 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xgbd\" (UniqueName: \"kubernetes.io/projected/7d320012-274b-452a-92d8-a4cc549704cf-kube-api-access-4xgbd\") pod \"auto-csr-approver-29566616-f5hbx\" (UID: \"7d320012-274b-452a-92d8-a4cc549704cf\") " pod="openshift-infra/auto-csr-approver-29566616-f5hbx" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.363659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xgbd\" (UniqueName: \"kubernetes.io/projected/7d320012-274b-452a-92d8-a4cc549704cf-kube-api-access-4xgbd\") pod \"auto-csr-approver-29566616-f5hbx\" (UID: \"7d320012-274b-452a-92d8-a4cc549704cf\") " pod="openshift-infra/auto-csr-approver-29566616-f5hbx" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.467393 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566616-f5hbx" Mar 20 08:56:00 crc kubenswrapper[4971]: I0320 08:56:00.949134 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566616-f5hbx"] Mar 20 08:56:00 crc kubenswrapper[4971]: W0320 08:56:00.950956 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d320012_274b_452a_92d8_a4cc549704cf.slice/crio-9237dc6be737e4b50ac8901ace08195fa9893cccb6a69adcd88436b9da49c952 WatchSource:0}: Error finding container 9237dc6be737e4b50ac8901ace08195fa9893cccb6a69adcd88436b9da49c952: Status 404 returned error can't find the container with id 9237dc6be737e4b50ac8901ace08195fa9893cccb6a69adcd88436b9da49c952 Mar 20 08:56:01 crc kubenswrapper[4971]: I0320 08:56:01.333171 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566616-f5hbx" event={"ID":"7d320012-274b-452a-92d8-a4cc549704cf","Type":"ContainerStarted","Data":"9237dc6be737e4b50ac8901ace08195fa9893cccb6a69adcd88436b9da49c952"} Mar 20 
08:56:02 crc kubenswrapper[4971]: I0320 08:56:02.338329 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 08:56:02 crc kubenswrapper[4971]: I0320 08:56:02.698241 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 08:56:03 crc kubenswrapper[4971]: I0320 08:56:03.355382 4971 generic.go:334] "Generic (PLEG): container finished" podID="7d320012-274b-452a-92d8-a4cc549704cf" containerID="24e4e7ddba7159ae28607f81be76f7f328af812dac3ad28b92a91d885b95ed44" exitCode=0 Mar 20 08:56:03 crc kubenswrapper[4971]: I0320 08:56:03.355492 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566616-f5hbx" event={"ID":"7d320012-274b-452a-92d8-a4cc549704cf","Type":"ContainerDied","Data":"24e4e7ddba7159ae28607f81be76f7f328af812dac3ad28b92a91d885b95ed44"} Mar 20 08:56:03 crc kubenswrapper[4971]: I0320 08:56:03.471892 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 08:56:03 crc kubenswrapper[4971]: I0320 08:56:03.572528 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 08:56:04 crc kubenswrapper[4971]: I0320 08:56:04.700027 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566616-f5hbx" Mar 20 08:56:04 crc kubenswrapper[4971]: I0320 08:56:04.837387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xgbd\" (UniqueName: \"kubernetes.io/projected/7d320012-274b-452a-92d8-a4cc549704cf-kube-api-access-4xgbd\") pod \"7d320012-274b-452a-92d8-a4cc549704cf\" (UID: \"7d320012-274b-452a-92d8-a4cc549704cf\") " Mar 20 08:56:04 crc kubenswrapper[4971]: I0320 08:56:04.844985 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d320012-274b-452a-92d8-a4cc549704cf-kube-api-access-4xgbd" (OuterVolumeSpecName: "kube-api-access-4xgbd") pod "7d320012-274b-452a-92d8-a4cc549704cf" (UID: "7d320012-274b-452a-92d8-a4cc549704cf"). InnerVolumeSpecName "kube-api-access-4xgbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:56:04 crc kubenswrapper[4971]: I0320 08:56:04.938998 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xgbd\" (UniqueName: \"kubernetes.io/projected/7d320012-274b-452a-92d8-a4cc549704cf-kube-api-access-4xgbd\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:05 crc kubenswrapper[4971]: I0320 08:56:05.186940 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:56:05 crc kubenswrapper[4971]: I0320 08:56:05.297458 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-789f468b55-wcnxp" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.156:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.156:8080: connect: connection refused" Mar 20 08:56:05 crc kubenswrapper[4971]: I0320 08:56:05.379600 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566616-f5hbx" 
event={"ID":"7d320012-274b-452a-92d8-a4cc549704cf","Type":"ContainerDied","Data":"9237dc6be737e4b50ac8901ace08195fa9893cccb6a69adcd88436b9da49c952"} Mar 20 08:56:05 crc kubenswrapper[4971]: I0320 08:56:05.379675 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566616-f5hbx" Mar 20 08:56:05 crc kubenswrapper[4971]: I0320 08:56:05.379685 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9237dc6be737e4b50ac8901ace08195fa9893cccb6a69adcd88436b9da49c952" Mar 20 08:56:05 crc kubenswrapper[4971]: I0320 08:56:05.804017 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566610-7nbjt"] Mar 20 08:56:05 crc kubenswrapper[4971]: I0320 08:56:05.812839 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566610-7nbjt"] Mar 20 08:56:05 crc kubenswrapper[4971]: I0320 08:56:05.917875 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-859774b9d9-bgg24" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.763877 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6aacce2-46ea-4aec-a7d9-d18fe3a32dda" path="/var/lib/kubelet/pods/e6aacce2-46ea-4aec-a7d9-d18fe3a32dda/volumes" Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.826359 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9r6st"] Mar 20 08:56:06 crc kubenswrapper[4971]: E0320 08:56:06.827584 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d320012-274b-452a-92d8-a4cc549704cf" containerName="oc" Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.827730 4971 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7d320012-274b-452a-92d8-a4cc549704cf" containerName="oc" Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.828130 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d320012-274b-452a-92d8-a4cc549704cf" containerName="oc" Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.831021 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.861936 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9r6st"] Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.985667 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vptz\" (UniqueName: \"kubernetes.io/projected/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-kube-api-access-2vptz\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.985720 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-utilities\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:06 crc kubenswrapper[4971]: I0320 08:56:06.985776 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-catalog-content\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:07 crc kubenswrapper[4971]: I0320 08:56:07.087936 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2vptz\" (UniqueName: \"kubernetes.io/projected/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-kube-api-access-2vptz\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:07 crc kubenswrapper[4971]: I0320 08:56:07.088294 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-utilities\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:07 crc kubenswrapper[4971]: I0320 08:56:07.088439 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-catalog-content\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:07 crc kubenswrapper[4971]: I0320 08:56:07.089028 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-utilities\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:07 crc kubenswrapper[4971]: I0320 08:56:07.089036 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-catalog-content\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:07 crc kubenswrapper[4971]: I0320 08:56:07.116416 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vptz\" (UniqueName: 
\"kubernetes.io/projected/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-kube-api-access-2vptz\") pod \"redhat-operators-9r6st\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:07 crc kubenswrapper[4971]: I0320 08:56:07.163890 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:07 crc kubenswrapper[4971]: I0320 08:56:07.665729 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9r6st"] Mar 20 08:56:08 crc kubenswrapper[4971]: I0320 08:56:08.412457 4971 generic.go:334] "Generic (PLEG): container finished" podID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerID="c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f" exitCode=0 Mar 20 08:56:08 crc kubenswrapper[4971]: I0320 08:56:08.412529 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r6st" event={"ID":"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9","Type":"ContainerDied","Data":"c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f"} Mar 20 08:56:08 crc kubenswrapper[4971]: I0320 08:56:08.413998 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r6st" event={"ID":"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9","Type":"ContainerStarted","Data":"a2ff474cc72b0c17512673d72dbdbd4a544bf5e7db9d1513867e8dd0bd0042ae"} Mar 20 08:56:09 crc kubenswrapper[4971]: I0320 08:56:09.426033 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r6st" event={"ID":"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9","Type":"ContainerStarted","Data":"75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9"} Mar 20 08:56:10 crc kubenswrapper[4971]: I0320 08:56:10.065538 4971 scope.go:117] "RemoveContainer" containerID="d129dbf7cc444e0e8577a646a0c23af07f88b72f7ae6d465014464ea54c53024" Mar 20 08:56:10 crc 
kubenswrapper[4971]: I0320 08:56:10.147082 4971 scope.go:117] "RemoveContainer" containerID="cf162ba3fd6d8b323310fc3121d8e69631133d3d4345971206f13063cd997bf8" Mar 20 08:56:10 crc kubenswrapper[4971]: I0320 08:56:10.206489 4971 scope.go:117] "RemoveContainer" containerID="1f1ca8928af817ffa9a9524502a479b6d6015028a928d4269b7ac876f621625f" Mar 20 08:56:10 crc kubenswrapper[4971]: I0320 08:56:10.441774 4971 generic.go:334] "Generic (PLEG): container finished" podID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerID="75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9" exitCode=0 Mar 20 08:56:10 crc kubenswrapper[4971]: I0320 08:56:10.441828 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r6st" event={"ID":"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9","Type":"ContainerDied","Data":"75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9"} Mar 20 08:56:11 crc kubenswrapper[4971]: I0320 08:56:11.455369 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r6st" event={"ID":"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9","Type":"ContainerStarted","Data":"1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af"} Mar 20 08:56:11 crc kubenswrapper[4971]: I0320 08:56:11.482405 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9r6st" podStartSLOduration=2.978891024 podStartE2EDuration="5.482386737s" podCreationTimestamp="2026-03-20 08:56:06 +0000 UTC" firstStartedPulling="2026-03-20 08:56:08.414884973 +0000 UTC m=+7590.394759111" lastFinishedPulling="2026-03-20 08:56:10.918380666 +0000 UTC m=+7592.898254824" observedRunningTime="2026-03-20 08:56:11.477219492 +0000 UTC m=+7593.457093630" watchObservedRunningTime="2026-03-20 08:56:11.482386737 +0000 UTC m=+7593.462260875" Mar 20 08:56:17 crc kubenswrapper[4971]: I0320 08:56:17.165096 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:17 crc kubenswrapper[4971]: I0320 08:56:17.165634 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:17 crc kubenswrapper[4971]: I0320 08:56:17.192061 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:56:17 crc kubenswrapper[4971]: I0320 08:56:17.789395 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:56:18 crc kubenswrapper[4971]: I0320 08:56:18.207476 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9r6st" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="registry-server" probeResult="failure" output=< Mar 20 08:56:18 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 08:56:18 crc kubenswrapper[4971]: > Mar 20 08:56:18 crc kubenswrapper[4971]: I0320 08:56:18.875962 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:56:19 crc kubenswrapper[4971]: I0320 08:56:19.416987 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:56:19 crc kubenswrapper[4971]: I0320 08:56:19.480129 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-789f468b55-wcnxp"] Mar 20 08:56:19 crc kubenswrapper[4971]: I0320 08:56:19.517478 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-789f468b55-wcnxp" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon-log" containerID="cri-o://473b17ac1d231d2534f5e072f8d895e5b0c2225e62cb03d25e3e0e0c0581be41" gracePeriod=30 Mar 20 08:56:19 crc kubenswrapper[4971]: I0320 08:56:19.517833 4971 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/horizon-789f468b55-wcnxp" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon" containerID="cri-o://46aa6971476639c5373ca601c0e27c7ece37087a0f677a02a1bd9e8918a1fe79" gracePeriod=30 Mar 20 08:56:23 crc kubenswrapper[4971]: I0320 08:56:23.550208 4971 generic.go:334] "Generic (PLEG): container finished" podID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerID="46aa6971476639c5373ca601c0e27c7ece37087a0f677a02a1bd9e8918a1fe79" exitCode=0 Mar 20 08:56:23 crc kubenswrapper[4971]: I0320 08:56:23.550717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789f468b55-wcnxp" event={"ID":"639cda12-bee9-4d40-8613-24156ac5b3ec","Type":"ContainerDied","Data":"46aa6971476639c5373ca601c0e27c7ece37087a0f677a02a1bd9e8918a1fe79"} Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.296454 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-789f468b55-wcnxp" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.156:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.156:8080: connect: connection refused" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.569332 4971 generic.go:334] "Generic (PLEG): container finished" podID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerID="174d3b9c2897301fe6721b266783bb01ac91ef00a15c8ae6bdf6627259373f44" exitCode=137 Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.569364 4971 generic.go:334] "Generic (PLEG): container finished" podID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerID="8e5fcb25dd0c7081778bf484f585e292b1141de016e2ebef3153f2b90c4e30eb" exitCode=137 Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.569383 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c77c7dc5-pcf25" 
event={"ID":"1a7e1d91-b96d-443b-a5c2-30535a7761cd","Type":"ContainerDied","Data":"174d3b9c2897301fe6721b266783bb01ac91ef00a15c8ae6bdf6627259373f44"} Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.569408 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c77c7dc5-pcf25" event={"ID":"1a7e1d91-b96d-443b-a5c2-30535a7761cd","Type":"ContainerDied","Data":"8e5fcb25dd0c7081778bf484f585e292b1141de016e2ebef3153f2b90c4e30eb"} Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.710422 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.856776 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-scripts\") pod \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.856841 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a7e1d91-b96d-443b-a5c2-30535a7761cd-logs\") pod \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.856923 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtdmk\" (UniqueName: \"kubernetes.io/projected/1a7e1d91-b96d-443b-a5c2-30535a7761cd-kube-api-access-vtdmk\") pod \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.856979 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-config-data\") pod \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\" (UID: 
\"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.857079 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a7e1d91-b96d-443b-a5c2-30535a7761cd-horizon-secret-key\") pod \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\" (UID: \"1a7e1d91-b96d-443b-a5c2-30535a7761cd\") " Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.857592 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a7e1d91-b96d-443b-a5c2-30535a7761cd-logs" (OuterVolumeSpecName: "logs") pod "1a7e1d91-b96d-443b-a5c2-30535a7761cd" (UID: "1a7e1d91-b96d-443b-a5c2-30535a7761cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.862585 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a7e1d91-b96d-443b-a5c2-30535a7761cd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1a7e1d91-b96d-443b-a5c2-30535a7761cd" (UID: "1a7e1d91-b96d-443b-a5c2-30535a7761cd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.864569 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a7e1d91-b96d-443b-a5c2-30535a7761cd-kube-api-access-vtdmk" (OuterVolumeSpecName: "kube-api-access-vtdmk") pod "1a7e1d91-b96d-443b-a5c2-30535a7761cd" (UID: "1a7e1d91-b96d-443b-a5c2-30535a7761cd"). InnerVolumeSpecName "kube-api-access-vtdmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.879853 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-scripts" (OuterVolumeSpecName: "scripts") pod "1a7e1d91-b96d-443b-a5c2-30535a7761cd" (UID: "1a7e1d91-b96d-443b-a5c2-30535a7761cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.891450 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-config-data" (OuterVolumeSpecName: "config-data") pod "1a7e1d91-b96d-443b-a5c2-30535a7761cd" (UID: "1a7e1d91-b96d-443b-a5c2-30535a7761cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.959569 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtdmk\" (UniqueName: \"kubernetes.io/projected/1a7e1d91-b96d-443b-a5c2-30535a7761cd-kube-api-access-vtdmk\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.959650 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.959670 4971 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a7e1d91-b96d-443b-a5c2-30535a7761cd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:25 crc kubenswrapper[4971]: I0320 08:56:25.959684 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a7e1d91-b96d-443b-a5c2-30535a7761cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:25 
crc kubenswrapper[4971]: I0320 08:56:25.959694 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a7e1d91-b96d-443b-a5c2-30535a7761cd-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:26 crc kubenswrapper[4971]: I0320 08:56:26.578702 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c77c7dc5-pcf25" event={"ID":"1a7e1d91-b96d-443b-a5c2-30535a7761cd","Type":"ContainerDied","Data":"6bd73cac9769578d34c5f62dabcc0354e58f2a854b2a82e5954577676633b4aa"} Mar 20 08:56:26 crc kubenswrapper[4971]: I0320 08:56:26.578762 4971 scope.go:117] "RemoveContainer" containerID="174d3b9c2897301fe6721b266783bb01ac91ef00a15c8ae6bdf6627259373f44" Mar 20 08:56:26 crc kubenswrapper[4971]: I0320 08:56:26.578825 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57c77c7dc5-pcf25" Mar 20 08:56:26 crc kubenswrapper[4971]: I0320 08:56:26.630464 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57c77c7dc5-pcf25"] Mar 20 08:56:26 crc kubenswrapper[4971]: I0320 08:56:26.640000 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57c77c7dc5-pcf25"] Mar 20 08:56:26 crc kubenswrapper[4971]: I0320 08:56:26.751800 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" path="/var/lib/kubelet/pods/1a7e1d91-b96d-443b-a5c2-30535a7761cd/volumes" Mar 20 08:56:26 crc kubenswrapper[4971]: I0320 08:56:26.772107 4971 scope.go:117] "RemoveContainer" containerID="8e5fcb25dd0c7081778bf484f585e292b1141de016e2ebef3153f2b90c4e30eb" Mar 20 08:56:27 crc kubenswrapper[4971]: I0320 08:56:27.221999 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:27 crc kubenswrapper[4971]: I0320 08:56:27.294986 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:27 crc kubenswrapper[4971]: I0320 08:56:27.474268 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9r6st"] Mar 20 08:56:28 crc kubenswrapper[4971]: I0320 08:56:28.601061 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9r6st" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="registry-server" containerID="cri-o://1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af" gracePeriod=2 Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.161974 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.329181 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-utilities\") pod \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.329297 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vptz\" (UniqueName: \"kubernetes.io/projected/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-kube-api-access-2vptz\") pod \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.329333 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-catalog-content\") pod \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\" (UID: \"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9\") " Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.329933 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-utilities" (OuterVolumeSpecName: "utilities") pod "eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" (UID: "eabce1a7-f5a0-47ec-a9ba-72f11cde02d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.341841 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-kube-api-access-2vptz" (OuterVolumeSpecName: "kube-api-access-2vptz") pod "eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" (UID: "eabce1a7-f5a0-47ec-a9ba-72f11cde02d9"). InnerVolumeSpecName "kube-api-access-2vptz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.431113 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.431166 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vptz\" (UniqueName: \"kubernetes.io/projected/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-kube-api-access-2vptz\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.467927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" (UID: "eabce1a7-f5a0-47ec-a9ba-72f11cde02d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.532697 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.613710 4971 generic.go:334] "Generic (PLEG): container finished" podID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerID="1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af" exitCode=0 Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.613786 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9r6st" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.613798 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r6st" event={"ID":"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9","Type":"ContainerDied","Data":"1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af"} Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.613848 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r6st" event={"ID":"eabce1a7-f5a0-47ec-a9ba-72f11cde02d9","Type":"ContainerDied","Data":"a2ff474cc72b0c17512673d72dbdbd4a544bf5e7db9d1513867e8dd0bd0042ae"} Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.613878 4971 scope.go:117] "RemoveContainer" containerID="1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.644377 4971 scope.go:117] "RemoveContainer" containerID="75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.672644 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9r6st"] Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 
08:56:29.682957 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9r6st"] Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.700189 4971 scope.go:117] "RemoveContainer" containerID="c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.726173 4971 scope.go:117] "RemoveContainer" containerID="1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af" Mar 20 08:56:29 crc kubenswrapper[4971]: E0320 08:56:29.726690 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af\": container with ID starting with 1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af not found: ID does not exist" containerID="1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.726725 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af"} err="failed to get container status \"1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af\": rpc error: code = NotFound desc = could not find container \"1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af\": container with ID starting with 1a5fc08de9e96c5e60d4770854cc7d6fe4100e06723c6a50bcb8b606e958a3af not found: ID does not exist" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.726750 4971 scope.go:117] "RemoveContainer" containerID="75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9" Mar 20 08:56:29 crc kubenswrapper[4971]: E0320 08:56:29.727192 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9\": container with ID 
starting with 75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9 not found: ID does not exist" containerID="75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.727300 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9"} err="failed to get container status \"75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9\": rpc error: code = NotFound desc = could not find container \"75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9\": container with ID starting with 75ad29b1e982ec73e75760d238ea7952f623357701cf40febcf042303f1b5cf9 not found: ID does not exist" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.727345 4971 scope.go:117] "RemoveContainer" containerID="c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f" Mar 20 08:56:29 crc kubenswrapper[4971]: E0320 08:56:29.727851 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f\": container with ID starting with c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f not found: ID does not exist" containerID="c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f" Mar 20 08:56:29 crc kubenswrapper[4971]: I0320 08:56:29.727929 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f"} err="failed to get container status \"c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f\": rpc error: code = NotFound desc = could not find container \"c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f\": container with ID starting with c481a1a060d7215cf2f4fbc83be673d6380c529abd3c90d549158a7c38ef571f not found: 
ID does not exist" Mar 20 08:56:30 crc kubenswrapper[4971]: I0320 08:56:30.747355 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" path="/var/lib/kubelet/pods/eabce1a7-f5a0-47ec-a9ba-72f11cde02d9/volumes" Mar 20 08:56:35 crc kubenswrapper[4971]: I0320 08:56:35.296519 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-789f468b55-wcnxp" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.156:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.156:8080: connect: connection refused" Mar 20 08:56:45 crc kubenswrapper[4971]: I0320 08:56:45.297087 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-789f468b55-wcnxp" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.156:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.156:8080: connect: connection refused" Mar 20 08:56:45 crc kubenswrapper[4971]: I0320 08:56:45.297548 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:56:49 crc kubenswrapper[4971]: I0320 08:56:49.884266 4971 generic.go:334] "Generic (PLEG): container finished" podID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerID="473b17ac1d231d2534f5e072f8d895e5b0c2225e62cb03d25e3e0e0c0581be41" exitCode=137 Mar 20 08:56:49 crc kubenswrapper[4971]: I0320 08:56:49.884467 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789f468b55-wcnxp" event={"ID":"639cda12-bee9-4d40-8613-24156ac5b3ec","Type":"ContainerDied","Data":"473b17ac1d231d2534f5e072f8d895e5b0c2225e62cb03d25e3e0e0c0581be41"} Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.005601 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.139534 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-scripts\") pod \"639cda12-bee9-4d40-8613-24156ac5b3ec\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.139646 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-429m6\" (UniqueName: \"kubernetes.io/projected/639cda12-bee9-4d40-8613-24156ac5b3ec-kube-api-access-429m6\") pod \"639cda12-bee9-4d40-8613-24156ac5b3ec\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.139688 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/639cda12-bee9-4d40-8613-24156ac5b3ec-horizon-secret-key\") pod \"639cda12-bee9-4d40-8613-24156ac5b3ec\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.139739 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-config-data\") pod \"639cda12-bee9-4d40-8613-24156ac5b3ec\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.139780 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/639cda12-bee9-4d40-8613-24156ac5b3ec-logs\") pod \"639cda12-bee9-4d40-8613-24156ac5b3ec\" (UID: \"639cda12-bee9-4d40-8613-24156ac5b3ec\") " Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.140511 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/639cda12-bee9-4d40-8613-24156ac5b3ec-logs" (OuterVolumeSpecName: "logs") pod "639cda12-bee9-4d40-8613-24156ac5b3ec" (UID: "639cda12-bee9-4d40-8613-24156ac5b3ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.146846 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/639cda12-bee9-4d40-8613-24156ac5b3ec-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "639cda12-bee9-4d40-8613-24156ac5b3ec" (UID: "639cda12-bee9-4d40-8613-24156ac5b3ec"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.151935 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/639cda12-bee9-4d40-8613-24156ac5b3ec-kube-api-access-429m6" (OuterVolumeSpecName: "kube-api-access-429m6") pod "639cda12-bee9-4d40-8613-24156ac5b3ec" (UID: "639cda12-bee9-4d40-8613-24156ac5b3ec"). InnerVolumeSpecName "kube-api-access-429m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.171680 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-scripts" (OuterVolumeSpecName: "scripts") pod "639cda12-bee9-4d40-8613-24156ac5b3ec" (UID: "639cda12-bee9-4d40-8613-24156ac5b3ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.176539 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-config-data" (OuterVolumeSpecName: "config-data") pod "639cda12-bee9-4d40-8613-24156ac5b3ec" (UID: "639cda12-bee9-4d40-8613-24156ac5b3ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.241565 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.241599 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-429m6\" (UniqueName: \"kubernetes.io/projected/639cda12-bee9-4d40-8613-24156ac5b3ec-kube-api-access-429m6\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.241624 4971 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/639cda12-bee9-4d40-8613-24156ac5b3ec-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.241633 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/639cda12-bee9-4d40-8613-24156ac5b3ec-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.241643 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/639cda12-bee9-4d40-8613-24156ac5b3ec-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.900475 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789f468b55-wcnxp" event={"ID":"639cda12-bee9-4d40-8613-24156ac5b3ec","Type":"ContainerDied","Data":"4421f8719515d2d80844530bbf1e70fdacf5c45c8f3c647eb92a4d549983f8df"} Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.900576 4971 scope.go:117] "RemoveContainer" containerID="46aa6971476639c5373ca601c0e27c7ece37087a0f677a02a1bd9e8918a1fe79" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.900742 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-789f468b55-wcnxp" Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.949869 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-789f468b55-wcnxp"] Mar 20 08:56:50 crc kubenswrapper[4971]: I0320 08:56:50.960251 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-789f468b55-wcnxp"] Mar 20 08:56:51 crc kubenswrapper[4971]: I0320 08:56:51.174183 4971 scope.go:117] "RemoveContainer" containerID="473b17ac1d231d2534f5e072f8d895e5b0c2225e62cb03d25e3e0e0c0581be41" Mar 20 08:56:52 crc kubenswrapper[4971]: I0320 08:56:52.755681 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" path="/var/lib/kubelet/pods/639cda12-bee9-4d40-8613-24156ac5b3ec/volumes" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.176513 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c6cddf495-pndtg"] Mar 20 08:57:02 crc kubenswrapper[4971]: E0320 08:57:02.177799 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="extract-utilities" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.177817 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="extract-utilities" Mar 20 08:57:02 crc kubenswrapper[4971]: E0320 08:57:02.177835 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="extract-content" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.177843 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="extract-content" Mar 20 08:57:02 crc kubenswrapper[4971]: E0320 08:57:02.177860 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerName="horizon" Mar 20 08:57:02 crc 
kubenswrapper[4971]: I0320 08:57:02.177868 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerName="horizon" Mar 20 08:57:02 crc kubenswrapper[4971]: E0320 08:57:02.177890 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.177898 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon" Mar 20 08:57:02 crc kubenswrapper[4971]: E0320 08:57:02.177921 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="registry-server" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.177928 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="registry-server" Mar 20 08:57:02 crc kubenswrapper[4971]: E0320 08:57:02.177943 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon-log" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.177950 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon-log" Mar 20 08:57:02 crc kubenswrapper[4971]: E0320 08:57:02.177975 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerName="horizon-log" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.177983 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerName="horizon-log" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.178216 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon-log" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.178239 4971 
memory_manager.go:354] "RemoveStaleState removing state" podUID="639cda12-bee9-4d40-8613-24156ac5b3ec" containerName="horizon" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.178251 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerName="horizon" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.178263 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="eabce1a7-f5a0-47ec-a9ba-72f11cde02d9" containerName="registry-server" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.178275 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a7e1d91-b96d-443b-a5c2-30535a7761cd" containerName="horizon-log" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.179509 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.194018 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c6cddf495-pndtg"] Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.333161 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rznc2\" (UniqueName: \"kubernetes.io/projected/d5207d51-a124-4d40-bfef-2dcc09b9fade-kube-api-access-rznc2\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.333232 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d5207d51-a124-4d40-bfef-2dcc09b9fade-horizon-secret-key\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.333270 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5207d51-a124-4d40-bfef-2dcc09b9fade-scripts\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.333731 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5207d51-a124-4d40-bfef-2dcc09b9fade-logs\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.333811 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5207d51-a124-4d40-bfef-2dcc09b9fade-config-data\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.435448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5207d51-a124-4d40-bfef-2dcc09b9fade-logs\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.435508 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5207d51-a124-4d40-bfef-2dcc09b9fade-config-data\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.435630 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rznc2\" (UniqueName: 
\"kubernetes.io/projected/d5207d51-a124-4d40-bfef-2dcc09b9fade-kube-api-access-rznc2\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.435677 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d5207d51-a124-4d40-bfef-2dcc09b9fade-horizon-secret-key\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.435710 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5207d51-a124-4d40-bfef-2dcc09b9fade-scripts\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.436027 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5207d51-a124-4d40-bfef-2dcc09b9fade-logs\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.436528 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5207d51-a124-4d40-bfef-2dcc09b9fade-scripts\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.437083 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5207d51-a124-4d40-bfef-2dcc09b9fade-config-data\") pod \"horizon-7c6cddf495-pndtg\" (UID: 
\"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.444566 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d5207d51-a124-4d40-bfef-2dcc09b9fade-horizon-secret-key\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.462405 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rznc2\" (UniqueName: \"kubernetes.io/projected/d5207d51-a124-4d40-bfef-2dcc09b9fade-kube-api-access-rznc2\") pod \"horizon-7c6cddf495-pndtg\" (UID: \"d5207d51-a124-4d40-bfef-2dcc09b9fade\") " pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.506412 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:02 crc kubenswrapper[4971]: I0320 08:57:02.982651 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c6cddf495-pndtg"] Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.031555 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c6cddf495-pndtg" event={"ID":"d5207d51-a124-4d40-bfef-2dcc09b9fade","Type":"ContainerStarted","Data":"131025d9bbec442f5da8bfb52b36b86a81eedc391ff160ee38865e5a3e98b04f"} Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.387681 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-wwtwh"] Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.388912 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.408996 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-wwtwh"] Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.501986 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-9a0e-account-create-update-sm9hz"] Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.503694 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.513206 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.520876 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9a0e-account-create-update-sm9hz"] Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.555657 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079d372c-65a0-4c34-b239-e49413cac654-operator-scripts\") pod \"heat-db-create-wwtwh\" (UID: \"079d372c-65a0-4c34-b239-e49413cac654\") " pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.555784 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgt24\" (UniqueName: \"kubernetes.io/projected/079d372c-65a0-4c34-b239-e49413cac654-kube-api-access-qgt24\") pod \"heat-db-create-wwtwh\" (UID: \"079d372c-65a0-4c34-b239-e49413cac654\") " pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.657455 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgt24\" (UniqueName: \"kubernetes.io/projected/079d372c-65a0-4c34-b239-e49413cac654-kube-api-access-qgt24\") pod 
\"heat-db-create-wwtwh\" (UID: \"079d372c-65a0-4c34-b239-e49413cac654\") " pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.657549 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079d372c-65a0-4c34-b239-e49413cac654-operator-scripts\") pod \"heat-db-create-wwtwh\" (UID: \"079d372c-65a0-4c34-b239-e49413cac654\") " pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.657597 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zdbf\" (UniqueName: \"kubernetes.io/projected/6402da04-6fc4-4546-b7ac-5af524c11e9a-kube-api-access-6zdbf\") pod \"heat-9a0e-account-create-update-sm9hz\" (UID: \"6402da04-6fc4-4546-b7ac-5af524c11e9a\") " pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.657669 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6402da04-6fc4-4546-b7ac-5af524c11e9a-operator-scripts\") pod \"heat-9a0e-account-create-update-sm9hz\" (UID: \"6402da04-6fc4-4546-b7ac-5af524c11e9a\") " pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.658319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079d372c-65a0-4c34-b239-e49413cac654-operator-scripts\") pod \"heat-db-create-wwtwh\" (UID: \"079d372c-65a0-4c34-b239-e49413cac654\") " pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.675086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgt24\" (UniqueName: \"kubernetes.io/projected/079d372c-65a0-4c34-b239-e49413cac654-kube-api-access-qgt24\") pod 
\"heat-db-create-wwtwh\" (UID: \"079d372c-65a0-4c34-b239-e49413cac654\") " pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.707169 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.762692 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6402da04-6fc4-4546-b7ac-5af524c11e9a-operator-scripts\") pod \"heat-9a0e-account-create-update-sm9hz\" (UID: \"6402da04-6fc4-4546-b7ac-5af524c11e9a\") " pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.762848 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zdbf\" (UniqueName: \"kubernetes.io/projected/6402da04-6fc4-4546-b7ac-5af524c11e9a-kube-api-access-6zdbf\") pod \"heat-9a0e-account-create-update-sm9hz\" (UID: \"6402da04-6fc4-4546-b7ac-5af524c11e9a\") " pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.763448 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6402da04-6fc4-4546-b7ac-5af524c11e9a-operator-scripts\") pod \"heat-9a0e-account-create-update-sm9hz\" (UID: \"6402da04-6fc4-4546-b7ac-5af524c11e9a\") " pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.779252 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zdbf\" (UniqueName: \"kubernetes.io/projected/6402da04-6fc4-4546-b7ac-5af524c11e9a-kube-api-access-6zdbf\") pod \"heat-9a0e-account-create-update-sm9hz\" (UID: \"6402da04-6fc4-4546-b7ac-5af524c11e9a\") " pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:03 crc kubenswrapper[4971]: I0320 08:57:03.821078 
4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:04 crc kubenswrapper[4971]: I0320 08:57:04.045588 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c6cddf495-pndtg" event={"ID":"d5207d51-a124-4d40-bfef-2dcc09b9fade","Type":"ContainerStarted","Data":"2dff8a155b159ea841a58b582af54820ca53d1dde0caed3186db20fe18ba2f91"} Mar 20 08:57:04 crc kubenswrapper[4971]: I0320 08:57:04.045948 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c6cddf495-pndtg" event={"ID":"d5207d51-a124-4d40-bfef-2dcc09b9fade","Type":"ContainerStarted","Data":"4c877baaef33ee691f7cafc724d2e207e847234be51348781377131be26bb318"} Mar 20 08:57:04 crc kubenswrapper[4971]: I0320 08:57:04.079157 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c6cddf495-pndtg" podStartSLOduration=2.079137379 podStartE2EDuration="2.079137379s" podCreationTimestamp="2026-03-20 08:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:57:04.07117086 +0000 UTC m=+7646.051044998" watchObservedRunningTime="2026-03-20 08:57:04.079137379 +0000 UTC m=+7646.059011507" Mar 20 08:57:04 crc kubenswrapper[4971]: W0320 08:57:04.216176 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod079d372c_65a0_4c34_b239_e49413cac654.slice/crio-b8630eced1addd296fd6678ee955ee9ddc609ea46b73b81428ed4a3059d0352c WatchSource:0}: Error finding container b8630eced1addd296fd6678ee955ee9ddc609ea46b73b81428ed4a3059d0352c: Status 404 returned error can't find the container with id b8630eced1addd296fd6678ee955ee9ddc609ea46b73b81428ed4a3059d0352c Mar 20 08:57:04 crc kubenswrapper[4971]: I0320 08:57:04.216215 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-db-create-wwtwh"] Mar 20 08:57:04 crc kubenswrapper[4971]: I0320 08:57:04.342524 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9a0e-account-create-update-sm9hz"] Mar 20 08:57:05 crc kubenswrapper[4971]: I0320 08:57:05.054584 4971 generic.go:334] "Generic (PLEG): container finished" podID="079d372c-65a0-4c34-b239-e49413cac654" containerID="f77e1b52f123ea9084bed679abde9bc2db5e642cbe533ea56cc25d73c52f410a" exitCode=0 Mar 20 08:57:05 crc kubenswrapper[4971]: I0320 08:57:05.055105 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wwtwh" event={"ID":"079d372c-65a0-4c34-b239-e49413cac654","Type":"ContainerDied","Data":"f77e1b52f123ea9084bed679abde9bc2db5e642cbe533ea56cc25d73c52f410a"} Mar 20 08:57:05 crc kubenswrapper[4971]: I0320 08:57:05.055133 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wwtwh" event={"ID":"079d372c-65a0-4c34-b239-e49413cac654","Type":"ContainerStarted","Data":"b8630eced1addd296fd6678ee955ee9ddc609ea46b73b81428ed4a3059d0352c"} Mar 20 08:57:05 crc kubenswrapper[4971]: I0320 08:57:05.058476 4971 generic.go:334] "Generic (PLEG): container finished" podID="6402da04-6fc4-4546-b7ac-5af524c11e9a" containerID="8ec3e621d3d4e1df49ce16aa037fa1a96f48d7b5e10caf31f75af4a8bfc58fdc" exitCode=0 Mar 20 08:57:05 crc kubenswrapper[4971]: I0320 08:57:05.058626 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9a0e-account-create-update-sm9hz" event={"ID":"6402da04-6fc4-4546-b7ac-5af524c11e9a","Type":"ContainerDied","Data":"8ec3e621d3d4e1df49ce16aa037fa1a96f48d7b5e10caf31f75af4a8bfc58fdc"} Mar 20 08:57:05 crc kubenswrapper[4971]: I0320 08:57:05.058674 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9a0e-account-create-update-sm9hz" event={"ID":"6402da04-6fc4-4546-b7ac-5af524c11e9a","Type":"ContainerStarted","Data":"0f7ec9b23be53cf5716340a7d14211d2ca01207f84b777e0592eac8f485319c4"} Mar 20 08:57:06 
crc kubenswrapper[4971]: I0320 08:57:06.570575 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.585717 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.729436 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zdbf\" (UniqueName: \"kubernetes.io/projected/6402da04-6fc4-4546-b7ac-5af524c11e9a-kube-api-access-6zdbf\") pod \"6402da04-6fc4-4546-b7ac-5af524c11e9a\" (UID: \"6402da04-6fc4-4546-b7ac-5af524c11e9a\") " Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.729507 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079d372c-65a0-4c34-b239-e49413cac654-operator-scripts\") pod \"079d372c-65a0-4c34-b239-e49413cac654\" (UID: \"079d372c-65a0-4c34-b239-e49413cac654\") " Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.729767 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgt24\" (UniqueName: \"kubernetes.io/projected/079d372c-65a0-4c34-b239-e49413cac654-kube-api-access-qgt24\") pod \"079d372c-65a0-4c34-b239-e49413cac654\" (UID: \"079d372c-65a0-4c34-b239-e49413cac654\") " Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.729866 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6402da04-6fc4-4546-b7ac-5af524c11e9a-operator-scripts\") pod \"6402da04-6fc4-4546-b7ac-5af524c11e9a\" (UID: \"6402da04-6fc4-4546-b7ac-5af524c11e9a\") " Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.731090 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/079d372c-65a0-4c34-b239-e49413cac654-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "079d372c-65a0-4c34-b239-e49413cac654" (UID: "079d372c-65a0-4c34-b239-e49413cac654"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.731359 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402da04-6fc4-4546-b7ac-5af524c11e9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6402da04-6fc4-4546-b7ac-5af524c11e9a" (UID: "6402da04-6fc4-4546-b7ac-5af524c11e9a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.735333 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402da04-6fc4-4546-b7ac-5af524c11e9a-kube-api-access-6zdbf" (OuterVolumeSpecName: "kube-api-access-6zdbf") pod "6402da04-6fc4-4546-b7ac-5af524c11e9a" (UID: "6402da04-6fc4-4546-b7ac-5af524c11e9a"). InnerVolumeSpecName "kube-api-access-6zdbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.736913 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079d372c-65a0-4c34-b239-e49413cac654-kube-api-access-qgt24" (OuterVolumeSpecName: "kube-api-access-qgt24") pod "079d372c-65a0-4c34-b239-e49413cac654" (UID: "079d372c-65a0-4c34-b239-e49413cac654"). InnerVolumeSpecName "kube-api-access-qgt24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.832922 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6402da04-6fc4-4546-b7ac-5af524c11e9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.832970 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zdbf\" (UniqueName: \"kubernetes.io/projected/6402da04-6fc4-4546-b7ac-5af524c11e9a-kube-api-access-6zdbf\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.832990 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079d372c-65a0-4c34-b239-e49413cac654-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:06 crc kubenswrapper[4971]: I0320 08:57:06.833008 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgt24\" (UniqueName: \"kubernetes.io/projected/079d372c-65a0-4c34-b239-e49413cac654-kube-api-access-qgt24\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:07 crc kubenswrapper[4971]: I0320 08:57:07.084753 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9a0e-account-create-update-sm9hz" event={"ID":"6402da04-6fc4-4546-b7ac-5af524c11e9a","Type":"ContainerDied","Data":"0f7ec9b23be53cf5716340a7d14211d2ca01207f84b777e0592eac8f485319c4"} Mar 20 08:57:07 crc kubenswrapper[4971]: I0320 08:57:07.084819 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f7ec9b23be53cf5716340a7d14211d2ca01207f84b777e0592eac8f485319c4" Mar 20 08:57:07 crc kubenswrapper[4971]: I0320 08:57:07.084851 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9a0e-account-create-update-sm9hz" Mar 20 08:57:07 crc kubenswrapper[4971]: I0320 08:57:07.086473 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wwtwh" event={"ID":"079d372c-65a0-4c34-b239-e49413cac654","Type":"ContainerDied","Data":"b8630eced1addd296fd6678ee955ee9ddc609ea46b73b81428ed4a3059d0352c"} Mar 20 08:57:07 crc kubenswrapper[4971]: I0320 08:57:07.086514 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8630eced1addd296fd6678ee955ee9ddc609ea46b73b81428ed4a3059d0352c" Mar 20 08:57:07 crc kubenswrapper[4971]: I0320 08:57:07.086587 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wwtwh" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.726450 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-k87v7"] Mar 20 08:57:08 crc kubenswrapper[4971]: E0320 08:57:08.727275 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6402da04-6fc4-4546-b7ac-5af524c11e9a" containerName="mariadb-account-create-update" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.727298 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6402da04-6fc4-4546-b7ac-5af524c11e9a" containerName="mariadb-account-create-update" Mar 20 08:57:08 crc kubenswrapper[4971]: E0320 08:57:08.727331 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079d372c-65a0-4c34-b239-e49413cac654" containerName="mariadb-database-create" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.727344 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="079d372c-65a0-4c34-b239-e49413cac654" containerName="mariadb-database-create" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.727651 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6402da04-6fc4-4546-b7ac-5af524c11e9a" containerName="mariadb-account-create-update" Mar 20 
08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.727699 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="079d372c-65a0-4c34-b239-e49413cac654" containerName="mariadb-database-create" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.728663 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.732578 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-blrwt" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.745997 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.777203 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-k87v7"] Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.781645 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-combined-ca-bundle\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.782009 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjjh7\" (UniqueName: \"kubernetes.io/projected/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-kube-api-access-pjjh7\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.782148 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-config-data\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") 
" pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.884234 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjjh7\" (UniqueName: \"kubernetes.io/projected/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-kube-api-access-pjjh7\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.884599 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-config-data\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.884780 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-combined-ca-bundle\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.891655 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-combined-ca-bundle\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.892497 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-config-data\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:08 crc kubenswrapper[4971]: I0320 08:57:08.902145 4971 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-pjjh7\" (UniqueName: \"kubernetes.io/projected/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-kube-api-access-pjjh7\") pod \"heat-db-sync-k87v7\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:09 crc kubenswrapper[4971]: I0320 08:57:09.070152 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:09 crc kubenswrapper[4971]: I0320 08:57:09.537522 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-k87v7"] Mar 20 08:57:09 crc kubenswrapper[4971]: W0320 08:57:09.551295 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2694c8fe_bdc7_48ad_b4aa_ae199a9cc90d.slice/crio-97328384a752b898877b3c4db96d25fbcf9cc3e842464154abb5981c1fd51694 WatchSource:0}: Error finding container 97328384a752b898877b3c4db96d25fbcf9cc3e842464154abb5981c1fd51694: Status 404 returned error can't find the container with id 97328384a752b898877b3c4db96d25fbcf9cc3e842464154abb5981c1fd51694 Mar 20 08:57:10 crc kubenswrapper[4971]: I0320 08:57:10.119441 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-k87v7" event={"ID":"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d","Type":"ContainerStarted","Data":"97328384a752b898877b3c4db96d25fbcf9cc3e842464154abb5981c1fd51694"} Mar 20 08:57:12 crc kubenswrapper[4971]: I0320 08:57:12.506871 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:12 crc kubenswrapper[4971]: I0320 08:57:12.507953 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:19 crc kubenswrapper[4971]: I0320 08:57:19.251415 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-k87v7" 
event={"ID":"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d","Type":"ContainerStarted","Data":"ba1507967aa4533d9a912d97883c9af0760d929a7ccc35c3a15e7dd1fb8b3ad7"} Mar 20 08:57:19 crc kubenswrapper[4971]: I0320 08:57:19.274440 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-k87v7" podStartSLOduration=2.447498347 podStartE2EDuration="11.274418029s" podCreationTimestamp="2026-03-20 08:57:08 +0000 UTC" firstStartedPulling="2026-03-20 08:57:09.554378941 +0000 UTC m=+7651.534253079" lastFinishedPulling="2026-03-20 08:57:18.381298613 +0000 UTC m=+7660.361172761" observedRunningTime="2026-03-20 08:57:19.268690259 +0000 UTC m=+7661.248564407" watchObservedRunningTime="2026-03-20 08:57:19.274418029 +0000 UTC m=+7661.254292177" Mar 20 08:57:20 crc kubenswrapper[4971]: I0320 08:57:20.162418 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:57:20 crc kubenswrapper[4971]: I0320 08:57:20.162501 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:57:21 crc kubenswrapper[4971]: I0320 08:57:21.270723 4971 generic.go:334] "Generic (PLEG): container finished" podID="2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d" containerID="ba1507967aa4533d9a912d97883c9af0760d929a7ccc35c3a15e7dd1fb8b3ad7" exitCode=0 Mar 20 08:57:21 crc kubenswrapper[4971]: I0320 08:57:21.270798 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-k87v7" 
event={"ID":"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d","Type":"ContainerDied","Data":"ba1507967aa4533d9a912d97883c9af0760d929a7ccc35c3a15e7dd1fb8b3ad7"} Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.594818 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.674209 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-combined-ca-bundle\") pod \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.674361 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-config-data\") pod \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.674398 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjjh7\" (UniqueName: \"kubernetes.io/projected/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-kube-api-access-pjjh7\") pod \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\" (UID: \"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d\") " Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.679173 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-kube-api-access-pjjh7" (OuterVolumeSpecName: "kube-api-access-pjjh7") pod "2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d" (UID: "2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d"). InnerVolumeSpecName "kube-api-access-pjjh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.705739 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d" (UID: "2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.742783 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-config-data" (OuterVolumeSpecName: "config-data") pod "2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d" (UID: "2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.777195 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjjh7\" (UniqueName: \"kubernetes.io/projected/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-kube-api-access-pjjh7\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.777233 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:22 crc kubenswrapper[4971]: I0320 08:57:22.777246 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:23 crc kubenswrapper[4971]: I0320 08:57:23.292107 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-k87v7" 
event={"ID":"2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d","Type":"ContainerDied","Data":"97328384a752b898877b3c4db96d25fbcf9cc3e842464154abb5981c1fd51694"} Mar 20 08:57:23 crc kubenswrapper[4971]: I0320 08:57:23.292160 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97328384a752b898877b3c4db96d25fbcf9cc3e842464154abb5981c1fd51694" Mar 20 08:57:23 crc kubenswrapper[4971]: I0320 08:57:23.292169 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-k87v7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.211553 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.565833 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-66bdd8fc4b-czgn7"] Mar 20 08:57:24 crc kubenswrapper[4971]: E0320 08:57:24.566214 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d" containerName="heat-db-sync" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.566229 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d" containerName="heat-db-sync" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.566456 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d" containerName="heat-db-sync" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.567070 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: W0320 08:57:24.573638 4971 reflector.go:561] object-"openstack"/"heat-engine-config-data": failed to list *v1.Secret: secrets "heat-engine-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Mar 20 08:57:24 crc kubenswrapper[4971]: E0320 08:57:24.573691 4971 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"heat-engine-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"heat-engine-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.574066 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-blrwt" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.574439 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.655593 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-66bdd8fc4b-czgn7"] Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.709071 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-combined-ca-bundle\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.709129 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data-custom\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.709148 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.709232 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qspk\" (UniqueName: \"kubernetes.io/projected/4893772c-593c-4f8b-9f44-8a786adeadb9-kube-api-access-8qspk\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.810754 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qspk\" (UniqueName: \"kubernetes.io/projected/4893772c-593c-4f8b-9f44-8a786adeadb9-kube-api-access-8qspk\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.810864 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-combined-ca-bundle\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.810896 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data-custom\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.810912 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.818896 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-combined-ca-bundle\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.829793 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qspk\" (UniqueName: \"kubernetes.io/projected/4893772c-593c-4f8b-9f44-8a786adeadb9-kube-api-access-8qspk\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.858685 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.874628 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-769d45db65-qj7ql"] Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.875865 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.878715 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.882714 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-769d45db65-qj7ql"] Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.891132 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-68cbbccf66-npd8p"] Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.897950 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-68cbbccf66-npd8p"] Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.898073 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:24 crc kubenswrapper[4971]: I0320 08:57:24.899972 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.026655 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-combined-ca-bundle\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.026694 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-combined-ca-bundle\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.026727 
4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs46v\" (UniqueName: \"kubernetes.io/projected/dcba81ea-8871-4cdd-aeae-61f58557c44c-kube-api-access-hs46v\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.026767 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-config-data-custom\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.026901 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-config-data\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.027033 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-config-data-custom\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.027128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-config-data\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc 
kubenswrapper[4971]: I0320 08:57:25.027569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkrc2\" (UniqueName: \"kubernetes.io/projected/4caadd7e-7246-481f-8093-627230427f16-kube-api-access-rkrc2\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.128875 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-config-data-custom\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.128924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-config-data\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.128958 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-config-data-custom\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.128994 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-config-data\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.129080 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkrc2\" (UniqueName: \"kubernetes.io/projected/4caadd7e-7246-481f-8093-627230427f16-kube-api-access-rkrc2\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.129103 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-combined-ca-bundle\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.129118 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-combined-ca-bundle\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.129153 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs46v\" (UniqueName: \"kubernetes.io/projected/dcba81ea-8871-4cdd-aeae-61f58557c44c-kube-api-access-hs46v\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.135727 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-config-data-custom\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.136810 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-config-data\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.140312 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-combined-ca-bundle\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.142380 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-combined-ca-bundle\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.150060 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs46v\" (UniqueName: \"kubernetes.io/projected/dcba81ea-8871-4cdd-aeae-61f58557c44c-kube-api-access-hs46v\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.150335 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkrc2\" (UniqueName: \"kubernetes.io/projected/4caadd7e-7246-481f-8093-627230427f16-kube-api-access-rkrc2\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.157028 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4caadd7e-7246-481f-8093-627230427f16-config-data\") pod \"heat-api-68cbbccf66-npd8p\" (UID: \"4caadd7e-7246-481f-8093-627230427f16\") " pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.161259 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcba81ea-8871-4cdd-aeae-61f58557c44c-config-data-custom\") pod \"heat-cfnapi-769d45db65-qj7ql\" (UID: \"dcba81ea-8871-4cdd-aeae-61f58557c44c\") " pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.238763 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.258803 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:25 crc kubenswrapper[4971]: W0320 08:57:25.776306 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcba81ea_8871_4cdd_aeae_61f58557c44c.slice/crio-c88aa47842a10b1f7743da352c4c7fb6c7e14b874b5eb449e065bf0dc2c7a3ed WatchSource:0}: Error finding container c88aa47842a10b1f7743da352c4c7fb6c7e14b874b5eb449e065bf0dc2c7a3ed: Status 404 returned error can't find the container with id c88aa47842a10b1f7743da352c4c7fb6c7e14b874b5eb449e065bf0dc2c7a3ed Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.779040 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-769d45db65-qj7ql"] Mar 20 08:57:25 crc kubenswrapper[4971]: E0320 08:57:25.811912 4971 secret.go:188] Couldn't get secret openstack/heat-engine-config-data: failed to sync secret cache: timed out waiting for the condition Mar 20 08:57:25 crc kubenswrapper[4971]: E0320 08:57:25.811996 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data-custom podName:4893772c-593c-4f8b-9f44-8a786adeadb9 nodeName:}" failed. No retries permitted until 2026-03-20 08:57:26.311976494 +0000 UTC m=+7668.291850632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data-custom") pod "heat-engine-66bdd8fc4b-czgn7" (UID: "4893772c-593c-4f8b-9f44-8a786adeadb9") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:57:25 crc kubenswrapper[4971]: I0320 08:57:25.885505 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-68cbbccf66-npd8p"] Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.105194 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.266564 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7c6cddf495-pndtg" Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.326828 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-859774b9d9-bgg24"] Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.327233 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-859774b9d9-bgg24" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon-log" containerID="cri-o://ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71" gracePeriod=30 Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.327351 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-859774b9d9-bgg24" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon" containerID="cri-o://cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3" gracePeriod=30 Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 
08:57:26.332276 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-769d45db65-qj7ql" event={"ID":"dcba81ea-8871-4cdd-aeae-61f58557c44c","Type":"ContainerStarted","Data":"c88aa47842a10b1f7743da352c4c7fb6c7e14b874b5eb449e065bf0dc2c7a3ed"} Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.337802 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-68cbbccf66-npd8p" event={"ID":"4caadd7e-7246-481f-8093-627230427f16","Type":"ContainerStarted","Data":"04cb56fb93631eaebbfadabb4da415ce5b926613881048b7048a29c00897223f"} Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.361077 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data-custom\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.379440 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4893772c-593c-4f8b-9f44-8a786adeadb9-config-data-custom\") pod \"heat-engine-66bdd8fc4b-czgn7\" (UID: \"4893772c-593c-4f8b-9f44-8a786adeadb9\") " pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.418103 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:26 crc kubenswrapper[4971]: I0320 08:57:26.901112 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-66bdd8fc4b-czgn7"] Mar 20 08:57:27 crc kubenswrapper[4971]: I0320 08:57:27.348327 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-66bdd8fc4b-czgn7" event={"ID":"4893772c-593c-4f8b-9f44-8a786adeadb9","Type":"ContainerStarted","Data":"09bbed1f6c76a0330dfe06be1fd7de7d5d158515441e02ca19fcecb16fb665c7"} Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.365519 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-66bdd8fc4b-czgn7" event={"ID":"4893772c-593c-4f8b-9f44-8a786adeadb9","Type":"ContainerStarted","Data":"5879454dd5a3b571d42f0adbba7e2d2432be9a60b026a8e405c72bf8e9325926"} Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.366229 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.369362 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-769d45db65-qj7ql" event={"ID":"dcba81ea-8871-4cdd-aeae-61f58557c44c","Type":"ContainerStarted","Data":"2e7c481f9cd8622fe91e3a0b2e5ad345baf6ec39da275d1abd49ecf9faf66910"} Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.370095 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.372095 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-68cbbccf66-npd8p" event={"ID":"4caadd7e-7246-481f-8093-627230427f16","Type":"ContainerStarted","Data":"ad8e792dcb4629f57ec686a4215d9ea4ed71171e0153f9013471f8a34b110fc1"} Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.372367 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.395062 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-66bdd8fc4b-czgn7" podStartSLOduration=4.395042479 podStartE2EDuration="4.395042479s" podCreationTimestamp="2026-03-20 08:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:57:28.382343647 +0000 UTC m=+7670.362217785" watchObservedRunningTime="2026-03-20 08:57:28.395042479 +0000 UTC m=+7670.374916617" Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.419500 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-68cbbccf66-npd8p" podStartSLOduration=2.874580994 podStartE2EDuration="4.419480819s" podCreationTimestamp="2026-03-20 08:57:24 +0000 UTC" firstStartedPulling="2026-03-20 08:57:25.880701222 +0000 UTC m=+7667.860575360" lastFinishedPulling="2026-03-20 08:57:27.425601047 +0000 UTC m=+7669.405475185" observedRunningTime="2026-03-20 08:57:28.400268976 +0000 UTC m=+7670.380143114" watchObservedRunningTime="2026-03-20 08:57:28.419480819 +0000 UTC m=+7670.399354957" Mar 20 08:57:28 crc kubenswrapper[4971]: I0320 08:57:28.424441 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-769d45db65-qj7ql" podStartSLOduration=2.784339162 podStartE2EDuration="4.424427618s" podCreationTimestamp="2026-03-20 08:57:24 +0000 UTC" firstStartedPulling="2026-03-20 08:57:25.779195416 +0000 UTC m=+7667.759069544" lastFinishedPulling="2026-03-20 08:57:27.419283862 +0000 UTC m=+7669.399158000" observedRunningTime="2026-03-20 08:57:28.411650774 +0000 UTC m=+7670.391524912" watchObservedRunningTime="2026-03-20 08:57:28.424427618 +0000 UTC m=+7670.404301756" Mar 20 08:57:30 crc kubenswrapper[4971]: I0320 08:57:30.388458 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerID="cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3" exitCode=0 Mar 20 08:57:30 crc kubenswrapper[4971]: I0320 08:57:30.388559 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859774b9d9-bgg24" event={"ID":"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7","Type":"ContainerDied","Data":"cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3"} Mar 20 08:57:35 crc kubenswrapper[4971]: I0320 08:57:35.916395 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-859774b9d9-bgg24" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 20 08:57:36 crc kubenswrapper[4971]: I0320 08:57:36.628712 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-68cbbccf66-npd8p" Mar 20 08:57:36 crc kubenswrapper[4971]: I0320 08:57:36.938022 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-769d45db65-qj7ql" Mar 20 08:57:42 crc kubenswrapper[4971]: I0320 08:57:42.895881 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="52099a51-54ac-4611-89cf-191426ae31d7" containerName="galera" probeResult="failure" output="command timed out" Mar 20 08:57:45 crc kubenswrapper[4971]: I0320 08:57:45.918396 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-859774b9d9-bgg24" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 20 08:57:46 crc kubenswrapper[4971]: I0320 08:57:46.456336 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/heat-engine-66bdd8fc4b-czgn7" Mar 20 08:57:49 crc kubenswrapper[4971]: I0320 08:57:49.041675 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b44d-account-create-update-pnwdx"] Mar 20 08:57:49 crc kubenswrapper[4971]: I0320 08:57:49.052065 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-c2w9d"] Mar 20 08:57:49 crc kubenswrapper[4971]: I0320 08:57:49.060998 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b44d-account-create-update-pnwdx"] Mar 20 08:57:49 crc kubenswrapper[4971]: I0320 08:57:49.068937 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-c2w9d"] Mar 20 08:57:50 crc kubenswrapper[4971]: I0320 08:57:50.162755 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:57:50 crc kubenswrapper[4971]: I0320 08:57:50.163183 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:57:50 crc kubenswrapper[4971]: I0320 08:57:50.741303 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21a92691-09f1-4841-86b2-c72f6153c44c" path="/var/lib/kubelet/pods/21a92691-09f1-4841-86b2-c72f6153c44c/volumes" Mar 20 08:57:50 crc kubenswrapper[4971]: I0320 08:57:50.742188 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f9b6de-f4ff-4a74-a86b-dffa2e03bd58" path="/var/lib/kubelet/pods/94f9b6de-f4ff-4a74-a86b-dffa2e03bd58/volumes" Mar 20 08:57:55 crc 
kubenswrapper[4971]: I0320 08:57:55.354829 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x"] Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.357200 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.359185 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.367157 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x"] Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.453993 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc6mf\" (UniqueName: \"kubernetes.io/projected/cad0849d-53e9-4b65-a400-5eeeba9c533d-kube-api-access-mc6mf\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.454733 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.455884 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.557856 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.558095 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.558180 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc6mf\" (UniqueName: \"kubernetes.io/projected/cad0849d-53e9-4b65-a400-5eeeba9c533d-kube-api-access-mc6mf\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.558807 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: 
\"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.558920 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.588339 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc6mf\" (UniqueName: \"kubernetes.io/projected/cad0849d-53e9-4b65-a400-5eeeba9c533d-kube-api-access-mc6mf\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.680877 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.919070 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-859774b9d9-bgg24" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 20 08:57:55 crc kubenswrapper[4971]: I0320 08:57:55.919474 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:57:56 crc kubenswrapper[4971]: W0320 08:57:56.204869 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad0849d_53e9_4b65_a400_5eeeba9c533d.slice/crio-7a4e628e10a49a23786e59313f8f0a3b4bcbe87af60c770406f6acf7c6f32c52 WatchSource:0}: Error finding container 7a4e628e10a49a23786e59313f8f0a3b4bcbe87af60c770406f6acf7c6f32c52: Status 404 returned error can't find the container with id 7a4e628e10a49a23786e59313f8f0a3b4bcbe87af60c770406f6acf7c6f32c52 Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.210934 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x"] Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.776354 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.890147 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-horizon-secret-key\") pod \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.890443 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-config-data\") pod \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.890475 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j4ql\" (UniqueName: \"kubernetes.io/projected/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-kube-api-access-4j4ql\") pod \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.890536 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-logs\") pod \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.890586 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-scripts\") pod \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\" (UID: \"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7\") " Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.890906 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-logs" (OuterVolumeSpecName: "logs") pod "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" (UID: "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.891365 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.897964 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" (UID: "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.898001 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-kube-api-access-4j4ql" (OuterVolumeSpecName: "kube-api-access-4j4ql") pod "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" (UID: "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7"). InnerVolumeSpecName "kube-api-access-4j4ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.919764 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-scripts" (OuterVolumeSpecName: "scripts") pod "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" (UID: "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.930764 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-config-data" (OuterVolumeSpecName: "config-data") pod "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" (UID: "003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.993018 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.993060 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j4ql\" (UniqueName: \"kubernetes.io/projected/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-kube-api-access-4j4ql\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.993070 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:56 crc kubenswrapper[4971]: I0320 08:57:56.993079 4971 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.039760 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ks7nc"] Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.054101 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ks7nc"] Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.132524 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerID="ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71" exitCode=137 Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.132640 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859774b9d9-bgg24" Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.132640 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859774b9d9-bgg24" event={"ID":"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7","Type":"ContainerDied","Data":"ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71"} Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.132763 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859774b9d9-bgg24" event={"ID":"003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7","Type":"ContainerDied","Data":"3248c7877087fae9e1227abc847a6425343c152724f9d8b007bbc2191418fea0"} Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.132789 4971 scope.go:117] "RemoveContainer" containerID="cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3" Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.135010 4971 generic.go:334] "Generic (PLEG): container finished" podID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerID="6026856a7dd388c06c833579b0f733e0b83092b99edd772d6066c3bd876fabad" exitCode=0 Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.135078 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" event={"ID":"cad0849d-53e9-4b65-a400-5eeeba9c533d","Type":"ContainerDied","Data":"6026856a7dd388c06c833579b0f733e0b83092b99edd772d6066c3bd876fabad"} Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.135118 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" 
event={"ID":"cad0849d-53e9-4b65-a400-5eeeba9c533d","Type":"ContainerStarted","Data":"7a4e628e10a49a23786e59313f8f0a3b4bcbe87af60c770406f6acf7c6f32c52"} Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.198860 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-859774b9d9-bgg24"] Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.208481 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-859774b9d9-bgg24"] Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.351598 4971 scope.go:117] "RemoveContainer" containerID="ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71" Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.378960 4971 scope.go:117] "RemoveContainer" containerID="cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3" Mar 20 08:57:57 crc kubenswrapper[4971]: E0320 08:57:57.380331 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3\": container with ID starting with cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3 not found: ID does not exist" containerID="cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3" Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.380377 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3"} err="failed to get container status \"cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3\": rpc error: code = NotFound desc = could not find container \"cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3\": container with ID starting with cd5664fbe414be3892e03fb8aac0115e5833b4449389bbb196cfdfb905cd5ce3 not found: ID does not exist" Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.380408 4971 scope.go:117] "RemoveContainer" 
containerID="ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71" Mar 20 08:57:57 crc kubenswrapper[4971]: E0320 08:57:57.380942 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71\": container with ID starting with ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71 not found: ID does not exist" containerID="ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71" Mar 20 08:57:57 crc kubenswrapper[4971]: I0320 08:57:57.380973 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71"} err="failed to get container status \"ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71\": rpc error: code = NotFound desc = could not find container \"ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71\": container with ID starting with ae213f50770cd8f09e548bfa98d77803b409aefbf1d32f4970593d1ac8802c71 not found: ID does not exist" Mar 20 08:57:58 crc kubenswrapper[4971]: I0320 08:57:58.750483 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" path="/var/lib/kubelet/pods/003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7/volumes" Mar 20 08:57:58 crc kubenswrapper[4971]: I0320 08:57:58.751800 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debc2ab1-f7a9-4493-8258-4255eec956c5" path="/var/lib/kubelet/pods/debc2ab1-f7a9-4493-8258-4255eec956c5/volumes" Mar 20 08:57:59 crc kubenswrapper[4971]: I0320 08:57:59.162330 4971 generic.go:334] "Generic (PLEG): container finished" podID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerID="f8ad02bd49e4fad4ea1c2b660cfebe375c7cbf57827f61c097b022a9a195e694" exitCode=0 Mar 20 08:57:59 crc kubenswrapper[4971]: I0320 08:57:59.162414 4971 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" event={"ID":"cad0849d-53e9-4b65-a400-5eeeba9c533d","Type":"ContainerDied","Data":"f8ad02bd49e4fad4ea1c2b660cfebe375c7cbf57827f61c097b022a9a195e694"} Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.159838 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566618-kfv8f"] Mar 20 08:58:00 crc kubenswrapper[4971]: E0320 08:58:00.160865 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon-log" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.160883 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon-log" Mar 20 08:58:00 crc kubenswrapper[4971]: E0320 08:58:00.160901 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.160939 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.161271 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon-log" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.161291 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="003b4b52-2eb0-4fc1-be2c-282ea2c2f4f7" containerName="horizon" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.162079 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566618-kfv8f" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.175255 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.175936 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.176595 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.209467 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" event={"ID":"cad0849d-53e9-4b65-a400-5eeeba9c533d","Type":"ContainerDied","Data":"1ac74d05d85aa737e881a2b985e60dc40caa9adb31ee5f6a37884e0d4b657278"} Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.209643 4971 generic.go:334] "Generic (PLEG): container finished" podID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerID="1ac74d05d85aa737e881a2b985e60dc40caa9adb31ee5f6a37884e0d4b657278" exitCode=0 Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.223388 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566618-kfv8f"] Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.259235 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xszdg\" (UniqueName: \"kubernetes.io/projected/01f7b9ec-77f3-43f1-8b14-f6cdcc25e266-kube-api-access-xszdg\") pod \"auto-csr-approver-29566618-kfv8f\" (UID: \"01f7b9ec-77f3-43f1-8b14-f6cdcc25e266\") " pod="openshift-infra/auto-csr-approver-29566618-kfv8f" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.361156 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xszdg\" (UniqueName: \"kubernetes.io/projected/01f7b9ec-77f3-43f1-8b14-f6cdcc25e266-kube-api-access-xszdg\") pod \"auto-csr-approver-29566618-kfv8f\" (UID: \"01f7b9ec-77f3-43f1-8b14-f6cdcc25e266\") " pod="openshift-infra/auto-csr-approver-29566618-kfv8f" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.383112 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xszdg\" (UniqueName: \"kubernetes.io/projected/01f7b9ec-77f3-43f1-8b14-f6cdcc25e266-kube-api-access-xszdg\") pod \"auto-csr-approver-29566618-kfv8f\" (UID: \"01f7b9ec-77f3-43f1-8b14-f6cdcc25e266\") " pod="openshift-infra/auto-csr-approver-29566618-kfv8f" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.515747 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566618-kfv8f" Mar 20 08:58:00 crc kubenswrapper[4971]: I0320 08:58:00.992064 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566618-kfv8f"] Mar 20 08:58:00 crc kubenswrapper[4971]: W0320 08:58:00.998803 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f7b9ec_77f3_43f1_8b14_f6cdcc25e266.slice/crio-cb104146f7951f062b166215a24f960d494bb88ad02121d17cba22c28aa59029 WatchSource:0}: Error finding container cb104146f7951f062b166215a24f960d494bb88ad02121d17cba22c28aa59029: Status 404 returned error can't find the container with id cb104146f7951f062b166215a24f960d494bb88ad02121d17cba22c28aa59029 Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.224708 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566618-kfv8f" event={"ID":"01f7b9ec-77f3-43f1-8b14-f6cdcc25e266","Type":"ContainerStarted","Data":"cb104146f7951f062b166215a24f960d494bb88ad02121d17cba22c28aa59029"} Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.538620 4971 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.582719 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-bundle\") pod \"cad0849d-53e9-4b65-a400-5eeeba9c533d\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.582889 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-util\") pod \"cad0849d-53e9-4b65-a400-5eeeba9c533d\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.582928 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc6mf\" (UniqueName: \"kubernetes.io/projected/cad0849d-53e9-4b65-a400-5eeeba9c533d-kube-api-access-mc6mf\") pod \"cad0849d-53e9-4b65-a400-5eeeba9c533d\" (UID: \"cad0849d-53e9-4b65-a400-5eeeba9c533d\") " Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.584399 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-bundle" (OuterVolumeSpecName: "bundle") pod "cad0849d-53e9-4b65-a400-5eeeba9c533d" (UID: "cad0849d-53e9-4b65-a400-5eeeba9c533d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.589257 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad0849d-53e9-4b65-a400-5eeeba9c533d-kube-api-access-mc6mf" (OuterVolumeSpecName: "kube-api-access-mc6mf") pod "cad0849d-53e9-4b65-a400-5eeeba9c533d" (UID: "cad0849d-53e9-4b65-a400-5eeeba9c533d"). 
InnerVolumeSpecName "kube-api-access-mc6mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.594053 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-util" (OuterVolumeSpecName: "util") pod "cad0849d-53e9-4b65-a400-5eeeba9c533d" (UID: "cad0849d-53e9-4b65-a400-5eeeba9c533d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.684855 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-util\") on node \"crc\" DevicePath \"\"" Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.684888 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc6mf\" (UniqueName: \"kubernetes.io/projected/cad0849d-53e9-4b65-a400-5eeeba9c533d-kube-api-access-mc6mf\") on node \"crc\" DevicePath \"\"" Mar 20 08:58:01 crc kubenswrapper[4971]: I0320 08:58:01.684898 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cad0849d-53e9-4b65-a400-5eeeba9c533d-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:58:02 crc kubenswrapper[4971]: I0320 08:58:02.237306 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" event={"ID":"cad0849d-53e9-4b65-a400-5eeeba9c533d","Type":"ContainerDied","Data":"7a4e628e10a49a23786e59313f8f0a3b4bcbe87af60c770406f6acf7c6f32c52"} Mar 20 08:58:02 crc kubenswrapper[4971]: I0320 08:58:02.237722 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a4e628e10a49a23786e59313f8f0a3b4bcbe87af60c770406f6acf7c6f32c52" Mar 20 08:58:02 crc kubenswrapper[4971]: I0320 08:58:02.237374 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x" Mar 20 08:58:03 crc kubenswrapper[4971]: I0320 08:58:03.248703 4971 generic.go:334] "Generic (PLEG): container finished" podID="01f7b9ec-77f3-43f1-8b14-f6cdcc25e266" containerID="e6d4cbbd041a631af01e05256443dce237af6cb31b5bbf91ad15f99aa75fd18b" exitCode=0 Mar 20 08:58:03 crc kubenswrapper[4971]: I0320 08:58:03.248780 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566618-kfv8f" event={"ID":"01f7b9ec-77f3-43f1-8b14-f6cdcc25e266","Type":"ContainerDied","Data":"e6d4cbbd041a631af01e05256443dce237af6cb31b5bbf91ad15f99aa75fd18b"} Mar 20 08:58:04 crc kubenswrapper[4971]: I0320 08:58:04.613750 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566618-kfv8f" Mar 20 08:58:04 crc kubenswrapper[4971]: I0320 08:58:04.641153 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xszdg\" (UniqueName: \"kubernetes.io/projected/01f7b9ec-77f3-43f1-8b14-f6cdcc25e266-kube-api-access-xszdg\") pod \"01f7b9ec-77f3-43f1-8b14-f6cdcc25e266\" (UID: \"01f7b9ec-77f3-43f1-8b14-f6cdcc25e266\") " Mar 20 08:58:04 crc kubenswrapper[4971]: I0320 08:58:04.647469 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f7b9ec-77f3-43f1-8b14-f6cdcc25e266-kube-api-access-xszdg" (OuterVolumeSpecName: "kube-api-access-xszdg") pod "01f7b9ec-77f3-43f1-8b14-f6cdcc25e266" (UID: "01f7b9ec-77f3-43f1-8b14-f6cdcc25e266"). InnerVolumeSpecName "kube-api-access-xszdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:58:04 crc kubenswrapper[4971]: I0320 08:58:04.743126 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xszdg\" (UniqueName: \"kubernetes.io/projected/01f7b9ec-77f3-43f1-8b14-f6cdcc25e266-kube-api-access-xszdg\") on node \"crc\" DevicePath \"\"" Mar 20 08:58:04 crc kubenswrapper[4971]: E0320 08:58:04.864759 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f7b9ec_77f3_43f1_8b14_f6cdcc25e266.slice\": RecentStats: unable to find data in memory cache]" Mar 20 08:58:05 crc kubenswrapper[4971]: I0320 08:58:05.276884 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566618-kfv8f" event={"ID":"01f7b9ec-77f3-43f1-8b14-f6cdcc25e266","Type":"ContainerDied","Data":"cb104146f7951f062b166215a24f960d494bb88ad02121d17cba22c28aa59029"} Mar 20 08:58:05 crc kubenswrapper[4971]: I0320 08:58:05.276932 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb104146f7951f062b166215a24f960d494bb88ad02121d17cba22c28aa59029" Mar 20 08:58:05 crc kubenswrapper[4971]: I0320 08:58:05.276927 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566618-kfv8f" Mar 20 08:58:05 crc kubenswrapper[4971]: I0320 08:58:05.700972 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566612-z2km9"] Mar 20 08:58:05 crc kubenswrapper[4971]: I0320 08:58:05.710414 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566612-z2km9"] Mar 20 08:58:06 crc kubenswrapper[4971]: I0320 08:58:06.746258 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c478ce-43da-4806-b7ad-29e39c5dcc3d" path="/var/lib/kubelet/pods/c0c478ce-43da-4806-b7ad-29e39c5dcc3d/volumes" Mar 20 08:58:10 crc kubenswrapper[4971]: I0320 08:58:10.443174 4971 scope.go:117] "RemoveContainer" containerID="97987dcf1e2645cd3642dee79f84a45fc99d268530b4ae0b4e140fcf3cce45b8" Mar 20 08:58:10 crc kubenswrapper[4971]: I0320 08:58:10.496918 4971 scope.go:117] "RemoveContainer" containerID="393f437b8a985920f62d59397b36358f6019166f814347b0aaa53d62db017201" Mar 20 08:58:10 crc kubenswrapper[4971]: I0320 08:58:10.560594 4971 scope.go:117] "RemoveContainer" containerID="faabcdc160d09d964a7e7e4a4a3b1aaa5558df39eb1d4c85f3e74e6da072c101" Mar 20 08:58:10 crc kubenswrapper[4971]: I0320 08:58:10.592751 4971 scope.go:117] "RemoveContainer" containerID="f57d450671465e68aca75cef9ed3629f5047433caac4afa143514cb786ac66a6" Mar 20 08:58:10 crc kubenswrapper[4971]: I0320 08:58:10.630340 4971 scope.go:117] "RemoveContainer" containerID="53b83d7597aa24961704a25b8bef68c21f1c126d80db2427810041ea51630dea" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.561514 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq"] Mar 20 08:58:14 crc kubenswrapper[4971]: E0320 08:58:14.562382 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerName="extract" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 
08:58:14.562395 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerName="extract" Mar 20 08:58:14 crc kubenswrapper[4971]: E0320 08:58:14.562411 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerName="pull" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.562418 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerName="pull" Mar 20 08:58:14 crc kubenswrapper[4971]: E0320 08:58:14.562444 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f7b9ec-77f3-43f1-8b14-f6cdcc25e266" containerName="oc" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.562452 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f7b9ec-77f3-43f1-8b14-f6cdcc25e266" containerName="oc" Mar 20 08:58:14 crc kubenswrapper[4971]: E0320 08:58:14.562465 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerName="util" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.562472 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerName="util" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.562691 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f7b9ec-77f3-43f1-8b14-f6cdcc25e266" containerName="oc" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.562705 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad0849d-53e9-4b65-a400-5eeeba9c533d" containerName="extract" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.563349 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.565008 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-7rrhz" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.565316 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.565551 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.579340 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq"] Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.647065 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz7gm\" (UniqueName: \"kubernetes.io/projected/05730a75-1c9d-4199-bd62-176210c9207a-kube-api-access-dz7gm\") pod \"obo-prometheus-operator-8ff7d675-tqdrq\" (UID: \"05730a75-1c9d-4199-bd62-176210c9207a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.749548 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz7gm\" (UniqueName: \"kubernetes.io/projected/05730a75-1c9d-4199-bd62-176210c9207a-kube-api-access-dz7gm\") pod \"obo-prometheus-operator-8ff7d675-tqdrq\" (UID: \"05730a75-1c9d-4199-bd62-176210c9207a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.789546 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz7gm\" (UniqueName: \"kubernetes.io/projected/05730a75-1c9d-4199-bd62-176210c9207a-kube-api-access-dz7gm\") pod 
\"obo-prometheus-operator-8ff7d675-tqdrq\" (UID: \"05730a75-1c9d-4199-bd62-176210c9207a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq" Mar 20 08:58:14 crc kubenswrapper[4971]: I0320 08:58:14.892171 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.063854 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.065480 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.067884 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-66cft" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.068012 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.116479 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.118223 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.129750 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.155478 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.159066 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4aeb8dc6-151a-40dc-871e-6145135c7fdf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-b797l\" (UID: \"4aeb8dc6-151a-40dc-871e-6145135c7fdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.159251 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1be3085b-b74b-4086-ab50-b41ac948660c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2\" (UID: \"1be3085b-b74b-4086-ab50-b41ac948660c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.159287 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1be3085b-b74b-4086-ab50-b41ac948660c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2\" (UID: \"1be3085b-b74b-4086-ab50-b41ac948660c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.159310 
4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4aeb8dc6-151a-40dc-871e-6145135c7fdf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-b797l\" (UID: \"4aeb8dc6-151a-40dc-871e-6145135c7fdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.260791 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1be3085b-b74b-4086-ab50-b41ac948660c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2\" (UID: \"1be3085b-b74b-4086-ab50-b41ac948660c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.260860 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1be3085b-b74b-4086-ab50-b41ac948660c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2\" (UID: \"1be3085b-b74b-4086-ab50-b41ac948660c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.260896 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4aeb8dc6-151a-40dc-871e-6145135c7fdf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-b797l\" (UID: \"4aeb8dc6-151a-40dc-871e-6145135c7fdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.260983 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/4aeb8dc6-151a-40dc-871e-6145135c7fdf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-b797l\" (UID: \"4aeb8dc6-151a-40dc-871e-6145135c7fdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.318406 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4aeb8dc6-151a-40dc-871e-6145135c7fdf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-b797l\" (UID: \"4aeb8dc6-151a-40dc-871e-6145135c7fdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.320217 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1be3085b-b74b-4086-ab50-b41ac948660c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2\" (UID: \"1be3085b-b74b-4086-ab50-b41ac948660c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.330993 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4aeb8dc6-151a-40dc-871e-6145135c7fdf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-b797l\" (UID: \"4aeb8dc6-151a-40dc-871e-6145135c7fdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.336114 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1be3085b-b74b-4086-ab50-b41ac948660c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2\" (UID: \"1be3085b-b74b-4086-ab50-b41ac948660c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.438023 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-jt588"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.439214 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.444800 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-xvz55" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.449075 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-jt588"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.452924 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.462130 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.473396 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzpj6\" (UniqueName: \"kubernetes.io/projected/a64188ef-f360-424b-b0f7-4b6f3801e47e-kube-api-access-bzpj6\") pod \"observability-operator-6dd7dd855f-jt588\" (UID: \"a64188ef-f360-424b-b0f7-4b6f3801e47e\") " pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.473517 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a64188ef-f360-424b-b0f7-4b6f3801e47e-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-jt588\" (UID: \"a64188ef-f360-424b-b0f7-4b6f3801e47e\") " pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.487126 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.539366 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.575088 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a64188ef-f360-424b-b0f7-4b6f3801e47e-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-jt588\" (UID: \"a64188ef-f360-424b-b0f7-4b6f3801e47e\") " pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.575214 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzpj6\" (UniqueName: \"kubernetes.io/projected/a64188ef-f360-424b-b0f7-4b6f3801e47e-kube-api-access-bzpj6\") pod \"observability-operator-6dd7dd855f-jt588\" (UID: \"a64188ef-f360-424b-b0f7-4b6f3801e47e\") " pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.581255 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a64188ef-f360-424b-b0f7-4b6f3801e47e-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-jt588\" (UID: \"a64188ef-f360-424b-b0f7-4b6f3801e47e\") " pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.594229 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzpj6\" (UniqueName: \"kubernetes.io/projected/a64188ef-f360-424b-b0f7-4b6f3801e47e-kube-api-access-bzpj6\") pod \"observability-operator-6dd7dd855f-jt588\" (UID: \"a64188ef-f360-424b-b0f7-4b6f3801e47e\") " 
pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.765691 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.845791 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-86dc4f68fc-bsgd2"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.847294 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.849361 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zstkh" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.849504 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.861629 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-86dc4f68fc-bsgd2"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.883157 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d563a7f-a865-46b9-b775-cb3c61eae815-apiservice-cert\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.883221 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d563a7f-a865-46b9-b775-cb3c61eae815-openshift-service-ca\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " 
pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.883252 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d563a7f-a865-46b9-b775-cb3c61eae815-webhook-cert\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.883315 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgxrn\" (UniqueName: \"kubernetes.io/projected/7d563a7f-a865-46b9-b775-cb3c61eae815-kube-api-access-zgxrn\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.970693 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l"] Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.985761 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgxrn\" (UniqueName: \"kubernetes.io/projected/7d563a7f-a865-46b9-b775-cb3c61eae815-kube-api-access-zgxrn\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.985899 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d563a7f-a865-46b9-b775-cb3c61eae815-apiservice-cert\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 
08:58:15.985933 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d563a7f-a865-46b9-b775-cb3c61eae815-openshift-service-ca\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.985956 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d563a7f-a865-46b9-b775-cb3c61eae815-webhook-cert\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.993961 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d563a7f-a865-46b9-b775-cb3c61eae815-openshift-service-ca\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:15 crc kubenswrapper[4971]: I0320 08:58:15.996708 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d563a7f-a865-46b9-b775-cb3c61eae815-apiservice-cert\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:16 crc kubenswrapper[4971]: I0320 08:58:16.004683 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d563a7f-a865-46b9-b775-cb3c61eae815-webhook-cert\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:16 crc 
kubenswrapper[4971]: I0320 08:58:16.011364 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2"] Mar 20 08:58:16 crc kubenswrapper[4971]: W0320 08:58:16.012092 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aeb8dc6_151a_40dc_871e_6145135c7fdf.slice/crio-16e2e4f462602c97b72fcec9cbf9d598bfcc9ed5b59de305084242976f793029 WatchSource:0}: Error finding container 16e2e4f462602c97b72fcec9cbf9d598bfcc9ed5b59de305084242976f793029: Status 404 returned error can't find the container with id 16e2e4f462602c97b72fcec9cbf9d598bfcc9ed5b59de305084242976f793029 Mar 20 08:58:16 crc kubenswrapper[4971]: I0320 08:58:16.026705 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgxrn\" (UniqueName: \"kubernetes.io/projected/7d563a7f-a865-46b9-b775-cb3c61eae815-kube-api-access-zgxrn\") pod \"perses-operator-86dc4f68fc-bsgd2\" (UID: \"7d563a7f-a865-46b9-b775-cb3c61eae815\") " pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:16 crc kubenswrapper[4971]: I0320 08:58:16.243414 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:16 crc kubenswrapper[4971]: I0320 08:58:16.399743 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-jt588"] Mar 20 08:58:16 crc kubenswrapper[4971]: I0320 08:58:16.426496 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq" event={"ID":"05730a75-1c9d-4199-bd62-176210c9207a","Type":"ContainerStarted","Data":"2668d4ca824bef1902610d29103ae80d4f9ce30ff01fc2455a24fe5495b7ebb2"} Mar 20 08:58:16 crc kubenswrapper[4971]: I0320 08:58:16.432898 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" event={"ID":"4aeb8dc6-151a-40dc-871e-6145135c7fdf","Type":"ContainerStarted","Data":"16e2e4f462602c97b72fcec9cbf9d598bfcc9ed5b59de305084242976f793029"} Mar 20 08:58:16 crc kubenswrapper[4971]: I0320 08:58:16.435457 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" event={"ID":"1be3085b-b74b-4086-ab50-b41ac948660c","Type":"ContainerStarted","Data":"fcf789943cd207c89271e0fb1de585d496fcf75f3fa76248c847f5cae5278fb7"} Mar 20 08:58:16 crc kubenswrapper[4971]: I0320 08:58:16.846016 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-86dc4f68fc-bsgd2"] Mar 20 08:58:16 crc kubenswrapper[4971]: W0320 08:58:16.852630 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d563a7f_a865_46b9_b775_cb3c61eae815.slice/crio-c2e80dee53df9bcf576f78b58d9acf27ad1381d8f2623b90ac74414ccb9bab13 WatchSource:0}: Error finding container c2e80dee53df9bcf576f78b58d9acf27ad1381d8f2623b90ac74414ccb9bab13: Status 404 returned error can't find the container with id 
c2e80dee53df9bcf576f78b58d9acf27ad1381d8f2623b90ac74414ccb9bab13 Mar 20 08:58:17 crc kubenswrapper[4971]: I0320 08:58:17.445820 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" event={"ID":"7d563a7f-a865-46b9-b775-cb3c61eae815","Type":"ContainerStarted","Data":"c2e80dee53df9bcf576f78b58d9acf27ad1381d8f2623b90ac74414ccb9bab13"} Mar 20 08:58:17 crc kubenswrapper[4971]: I0320 08:58:17.446866 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-jt588" event={"ID":"a64188ef-f360-424b-b0f7-4b6f3801e47e","Type":"ContainerStarted","Data":"4dc9456ec8f8aa8ca96057cc004bbc1e4533808b68ebd7b95c8898aefc7554c2"} Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.162235 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.162719 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.162768 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.163479 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395"} 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.163529 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" gracePeriod=600 Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.533239 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" exitCode=0 Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.533326 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395"} Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.533376 4971 scope.go:117] "RemoveContainer" containerID="d3a14bb739a0b2e954975f7532ea81560227f071f1b801e07dff179ccac02bd7" Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.536478 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" event={"ID":"4aeb8dc6-151a-40dc-871e-6145135c7fdf","Type":"ContainerStarted","Data":"66d24e63408c0cb0edef085ec9f9861332274f2ec1c233bb9b41f871ebd6d032"} Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.555253 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" 
event={"ID":"1be3085b-b74b-4086-ab50-b41ac948660c","Type":"ContainerStarted","Data":"4ee01664df581f6ba36e07386c70b283d06243bdb2f6a21b0f172922641daff2"} Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.585248 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-b797l" podStartSLOduration=2.209508512 podStartE2EDuration="5.585224963s" podCreationTimestamp="2026-03-20 08:58:15 +0000 UTC" firstStartedPulling="2026-03-20 08:58:16.02615606 +0000 UTC m=+7718.006030188" lastFinishedPulling="2026-03-20 08:58:19.401872501 +0000 UTC m=+7721.381746639" observedRunningTime="2026-03-20 08:58:20.569705067 +0000 UTC m=+7722.549579225" watchObservedRunningTime="2026-03-20 08:58:20.585224963 +0000 UTC m=+7722.565099101" Mar 20 08:58:20 crc kubenswrapper[4971]: I0320 08:58:20.620072 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2" podStartSLOduration=2.231538988 podStartE2EDuration="5.620038624s" podCreationTimestamp="2026-03-20 08:58:15 +0000 UTC" firstStartedPulling="2026-03-20 08:58:16.014988838 +0000 UTC m=+7717.994862976" lastFinishedPulling="2026-03-20 08:58:19.403488474 +0000 UTC m=+7721.383362612" observedRunningTime="2026-03-20 08:58:20.601061968 +0000 UTC m=+7722.580936106" watchObservedRunningTime="2026-03-20 08:58:20.620038624 +0000 UTC m=+7722.599912762" Mar 20 08:58:21 crc kubenswrapper[4971]: E0320 08:58:21.404485 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:58:21 crc 
kubenswrapper[4971]: I0320 08:58:21.567055 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 08:58:21 crc kubenswrapper[4971]: E0320 08:58:21.567357 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.634784 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq" event={"ID":"05730a75-1c9d-4199-bd62-176210c9207a","Type":"ContainerStarted","Data":"c4a3b433ee5c920fbf3bae680bc07b180f1a0cdda7c464913f35e91ae89e4dc6"} Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.638844 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" event={"ID":"7d563a7f-a865-46b9-b775-cb3c61eae815","Type":"ContainerStarted","Data":"60f895e412b6f2dd4f9eab676418cd53802dfb9a19212898c26875c544db0750"} Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.639342 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.640793 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-jt588" event={"ID":"a64188ef-f360-424b-b0f7-4b6f3801e47e","Type":"ContainerStarted","Data":"ea5d0684d445528c4d97452a86b1850f47c43c46dcbe2098a9f68290412891f8"} Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.641491 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.668197 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tqdrq" podStartSLOduration=2.752956723 podStartE2EDuration="14.668175976s" podCreationTimestamp="2026-03-20 08:58:14 +0000 UTC" firstStartedPulling="2026-03-20 08:58:15.547000599 +0000 UTC m=+7717.526874737" lastFinishedPulling="2026-03-20 08:58:27.462219852 +0000 UTC m=+7729.442093990" observedRunningTime="2026-03-20 08:58:28.660863524 +0000 UTC m=+7730.640737672" watchObservedRunningTime="2026-03-20 08:58:28.668175976 +0000 UTC m=+7730.648050114" Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.697976 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-jt588" Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.719936 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-jt588" podStartSLOduration=2.662403655 podStartE2EDuration="13.71991974s" podCreationTimestamp="2026-03-20 08:58:15 +0000 UTC" firstStartedPulling="2026-03-20 08:58:16.42657843 +0000 UTC m=+7718.406452568" lastFinishedPulling="2026-03-20 08:58:27.484094525 +0000 UTC m=+7729.463968653" observedRunningTime="2026-03-20 08:58:28.697050241 +0000 UTC m=+7730.676924389" watchObservedRunningTime="2026-03-20 08:58:28.71991974 +0000 UTC m=+7730.699793878" Mar 20 08:58:28 crc kubenswrapper[4971]: I0320 08:58:28.750265 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" podStartSLOduration=3.149816333 podStartE2EDuration="13.750251044s" podCreationTimestamp="2026-03-20 08:58:15 +0000 UTC" firstStartedPulling="2026-03-20 08:58:16.854584323 +0000 UTC m=+7718.834458461" lastFinishedPulling="2026-03-20 
08:58:27.455019024 +0000 UTC m=+7729.434893172" observedRunningTime="2026-03-20 08:58:28.718708218 +0000 UTC m=+7730.698582366" watchObservedRunningTime="2026-03-20 08:58:28.750251044 +0000 UTC m=+7730.730125182" Mar 20 08:58:32 crc kubenswrapper[4971]: I0320 08:58:32.744775 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 08:58:32 crc kubenswrapper[4971]: E0320 08:58:32.745841 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:58:36 crc kubenswrapper[4971]: I0320 08:58:36.246932 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-86dc4f68fc-bsgd2" Mar 20 08:58:38 crc kubenswrapper[4971]: I0320 08:58:38.048660 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rfzsl"] Mar 20 08:58:38 crc kubenswrapper[4971]: I0320 08:58:38.056270 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rfzsl"] Mar 20 08:58:38 crc kubenswrapper[4971]: I0320 08:58:38.090958 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0606-account-create-update-7nmf4"] Mar 20 08:58:38 crc kubenswrapper[4971]: I0320 08:58:38.098376 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0606-account-create-update-7nmf4"] Mar 20 08:58:38 crc kubenswrapper[4971]: I0320 08:58:38.743948 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="114a8aed-e610-49e8-ba46-fcfcc4b7f0f7" path="/var/lib/kubelet/pods/114a8aed-e610-49e8-ba46-fcfcc4b7f0f7/volumes" 
Mar 20 08:58:38 crc kubenswrapper[4971]: I0320 08:58:38.745321 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79df42cd-73aa-4c95-833b-7b7d1a2043cd" path="/var/lib/kubelet/pods/79df42cd-73aa-4c95-833b-7b7d1a2043cd/volumes" Mar 20 08:58:38 crc kubenswrapper[4971]: I0320 08:58:38.995508 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 08:58:38 crc kubenswrapper[4971]: I0320 08:58:38.995751 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="9f3a3b9f-f195-436e-9097-d085880e71f8" containerName="openstackclient" containerID="cri-o://bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a" gracePeriod=2 Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.005940 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.048488 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 08:58:39 crc kubenswrapper[4971]: E0320 08:58:39.048973 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3a3b9f-f195-436e-9097-d085880e71f8" containerName="openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.048996 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3a3b9f-f195-436e-9097-d085880e71f8" containerName="openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.049233 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3a3b9f-f195-436e-9097-d085880e71f8" containerName="openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.049897 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.058436 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9f3a3b9f-f195-436e-9097-d085880e71f8" podUID="0a492260-faae-40fb-887d-46a5be1c4c5e" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.067456 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.148422 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a492260-faae-40fb-887d-46a5be1c4c5e-openstack-config-secret\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") " pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.148509 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a492260-faae-40fb-887d-46a5be1c4c5e-openstack-config\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") " pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.148584 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4sfn\" (UniqueName: \"kubernetes.io/projected/0a492260-faae-40fb-887d-46a5be1c4c5e-kube-api-access-l4sfn\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") " pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.250644 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a492260-faae-40fb-887d-46a5be1c4c5e-openstack-config\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") 
" pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.250713 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4sfn\" (UniqueName: \"kubernetes.io/projected/0a492260-faae-40fb-887d-46a5be1c4c5e-kube-api-access-l4sfn\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") " pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.250843 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a492260-faae-40fb-887d-46a5be1c4c5e-openstack-config-secret\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") " pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.251538 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a492260-faae-40fb-887d-46a5be1c4c5e-openstack-config\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") " pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.258208 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a492260-faae-40fb-887d-46a5be1c4c5e-openstack-config-secret\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") " pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.268757 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.270469 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.275155 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2w9df" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.285034 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.295900 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4sfn\" (UniqueName: \"kubernetes.io/projected/0a492260-faae-40fb-887d-46a5be1c4c5e-kube-api-access-l4sfn\") pod \"openstackclient\" (UID: \"0a492260-faae-40fb-887d-46a5be1c4c5e\") " pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.352811 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94cdj\" (UniqueName: \"kubernetes.io/projected/3663f86e-c27b-4585-8201-8115f0e04501-kube-api-access-94cdj\") pod \"kube-state-metrics-0\" (UID: \"3663f86e-c27b-4585-8201-8115f0e04501\") " pod="openstack/kube-state-metrics-0" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.370717 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.454165 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94cdj\" (UniqueName: \"kubernetes.io/projected/3663f86e-c27b-4585-8201-8115f0e04501-kube-api-access-94cdj\") pod \"kube-state-metrics-0\" (UID: \"3663f86e-c27b-4585-8201-8115f0e04501\") " pod="openstack/kube-state-metrics-0" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.484312 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94cdj\" (UniqueName: \"kubernetes.io/projected/3663f86e-c27b-4585-8201-8115f0e04501-kube-api-access-94cdj\") pod \"kube-state-metrics-0\" (UID: \"3663f86e-c27b-4585-8201-8115f0e04501\") " pod="openstack/kube-state-metrics-0" Mar 20 08:58:39 crc kubenswrapper[4971]: I0320 08:58:39.653122 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.122187 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.124068 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.129701 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-v5rcg" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.134484 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.134657 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.134764 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.145957 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.335474 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.343822 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3c261d6c-ad35-4558-a676-58606d0c78a5-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.343894 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.343977 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smvng\" (UniqueName: \"kubernetes.io/projected/3c261d6c-ad35-4558-a676-58606d0c78a5-kube-api-access-smvng\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.344002 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.344053 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.344084 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c261d6c-ad35-4558-a676-58606d0c78a5-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.344163 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c261d6c-ad35-4558-a676-58606d0c78a5-config-out\") pod \"alertmanager-metric-storage-0\" (UID: 
\"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.446839 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.447698 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.447760 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c261d6c-ad35-4558-a676-58606d0c78a5-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.447903 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c261d6c-ad35-4558-a676-58606d0c78a5-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.448053 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3c261d6c-ad35-4558-a676-58606d0c78a5-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.448107 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.448225 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smvng\" (UniqueName: \"kubernetes.io/projected/3c261d6c-ad35-4558-a676-58606d0c78a5-kube-api-access-smvng\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.468696 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/3c261d6c-ad35-4558-a676-58606d0c78a5-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.469348 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3c261d6c-ad35-4558-a676-58606d0c78a5-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.469484 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.481207 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.484092 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smvng\" (UniqueName: \"kubernetes.io/projected/3c261d6c-ad35-4558-a676-58606d0c78a5-kube-api-access-smvng\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.501782 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.508199 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3c261d6c-ad35-4558-a676-58606d0c78a5-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.520058 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3c261d6c-ad35-4558-a676-58606d0c78a5-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"3c261d6c-ad35-4558-a676-58606d0c78a5\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.722312 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.781838 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0a492260-faae-40fb-887d-46a5be1c4c5e","Type":"ContainerStarted","Data":"df0f8d9d0be92d34ce9ef3e564d30f902746834b78e31466e1a8cb4774266feb"} Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.790899 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3663f86e-c27b-4585-8201-8115f0e04501","Type":"ContainerStarted","Data":"82c42f2ac77c70f2fe848335608c54bc8d490d9b577acb027feba99b49371c14"} Mar 20 08:58:40 crc kubenswrapper[4971]: I0320 08:58:40.790984 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.140719 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.148576 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.167186 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.169266 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.169502 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.169508 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.169631 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 20 08:58:41 crc kubenswrapper[4971]: 
I0320 08:58:41.169686 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.169745 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.169835 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.169862 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4vffp" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.284975 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285033 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f3705f92-1eba-49dd-9654-288b96344486\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3705f92-1eba-49dd-9654-288b96344486\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285052 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " 
pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285202 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285334 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285396 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/903b0df1-cef0-4077-9033-c534831e6ca8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285482 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285523 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285579 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j5rw\" (UniqueName: \"kubernetes.io/projected/903b0df1-cef0-4077-9033-c534831e6ca8-kube-api-access-4j5rw\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.285925 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/903b0df1-cef0-4077-9033-c534831e6ca8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.375935 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389571 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389641 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389665 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/903b0df1-cef0-4077-9033-c534831e6ca8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389699 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389719 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-config\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " 
pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389738 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j5rw\" (UniqueName: \"kubernetes.io/projected/903b0df1-cef0-4077-9033-c534831e6ca8-kube-api-access-4j5rw\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389797 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/903b0df1-cef0-4077-9033-c534831e6ca8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389881 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f3705f92-1eba-49dd-9654-288b96344486\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3705f92-1eba-49dd-9654-288b96344486\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.389897 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-1\") pod 
\"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.390662 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.392168 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.392961 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/903b0df1-cef0-4077-9033-c534831e6ca8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.398956 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.399771 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-config\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.401008 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.401051 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f3705f92-1eba-49dd-9654-288b96344486\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3705f92-1eba-49dd-9654-288b96344486\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b79a392221e5b443cf461a696785c769d98ed70d813f1b54c39a4ac79afe5828/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.403970 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/903b0df1-cef0-4077-9033-c534831e6ca8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.407829 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/903b0df1-cef0-4077-9033-c534831e6ca8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.425942 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j5rw\" (UniqueName: \"kubernetes.io/projected/903b0df1-cef0-4077-9033-c534831e6ca8-kube-api-access-4j5rw\") 
pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.426291 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/903b0df1-cef0-4077-9033-c534831e6ca8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.433819 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.471884 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f3705f92-1eba-49dd-9654-288b96344486\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3705f92-1eba-49dd-9654-288b96344486\") pod \"prometheus-metric-storage-0\" (UID: \"903b0df1-cef0-4077-9033-c534831e6ca8\") " pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.491440 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config-secret\") pod \"9f3a3b9f-f195-436e-9097-d085880e71f8\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.491540 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7tvm\" (UniqueName: \"kubernetes.io/projected/9f3a3b9f-f195-436e-9097-d085880e71f8-kube-api-access-v7tvm\") pod \"9f3a3b9f-f195-436e-9097-d085880e71f8\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.491584 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config\") pod \"9f3a3b9f-f195-436e-9097-d085880e71f8\" (UID: \"9f3a3b9f-f195-436e-9097-d085880e71f8\") " Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.505026 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.511881 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3a3b9f-f195-436e-9097-d085880e71f8-kube-api-access-v7tvm" (OuterVolumeSpecName: "kube-api-access-v7tvm") pod "9f3a3b9f-f195-436e-9097-d085880e71f8" (UID: "9f3a3b9f-f195-436e-9097-d085880e71f8"). InnerVolumeSpecName "kube-api-access-v7tvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.550407 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9f3a3b9f-f195-436e-9097-d085880e71f8" (UID: "9f3a3b9f-f195-436e-9097-d085880e71f8"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.587544 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9f3a3b9f-f195-436e-9097-d085880e71f8" (UID: "9f3a3b9f-f195-436e-9097-d085880e71f8"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.594008 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7tvm\" (UniqueName: \"kubernetes.io/projected/9f3a3b9f-f195-436e-9097-d085880e71f8-kube-api-access-v7tvm\") on node \"crc\" DevicePath \"\"" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.594048 4971 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.594063 4971 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f3a3b9f-f195-436e-9097-d085880e71f8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.808311 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3663f86e-c27b-4585-8201-8115f0e04501","Type":"ContainerStarted","Data":"78b529d4e2aa698686fe9aef590e63fe62baa8301056523c7339e5659e5aa321"} Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.808553 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.810185 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f3a3b9f-f195-436e-9097-d085880e71f8" containerID="bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a" exitCode=137 Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.810236 4971 scope.go:117] "RemoveContainer" containerID="bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.810319 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.813421 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0a492260-faae-40fb-887d-46a5be1c4c5e","Type":"ContainerStarted","Data":"e143817381cc3b9c8cba6145e15fc3d94ef0bf66f0a3e7305f995a32b5eb6264"} Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.820285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3c261d6c-ad35-4558-a676-58606d0c78a5","Type":"ContainerStarted","Data":"fcedebf855af98bc62800f0ade8e7864819ad5cbe8a8cb00213f08b368cbdac8"} Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.834014 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.424048469 podStartE2EDuration="2.833992429s" podCreationTimestamp="2026-03-20 08:58:39 +0000 UTC" firstStartedPulling="2026-03-20 08:58:40.729428669 +0000 UTC m=+7742.709302807" lastFinishedPulling="2026-03-20 08:58:41.139372629 +0000 UTC m=+7743.119246767" observedRunningTime="2026-03-20 08:58:41.827179041 +0000 UTC m=+7743.807053179" watchObservedRunningTime="2026-03-20 08:58:41.833992429 +0000 UTC m=+7743.813866557" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.852097 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.852084683 podStartE2EDuration="2.852084683s" podCreationTimestamp="2026-03-20 08:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:58:41.849106155 +0000 UTC m=+7743.828980293" watchObservedRunningTime="2026-03-20 08:58:41.852084683 +0000 UTC m=+7743.831958811" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.853728 4971 scope.go:117] "RemoveContainer" 
containerID="bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.855103 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9f3a3b9f-f195-436e-9097-d085880e71f8" podUID="0a492260-faae-40fb-887d-46a5be1c4c5e" Mar 20 08:58:41 crc kubenswrapper[4971]: E0320 08:58:41.857592 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a\": container with ID starting with bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a not found: ID does not exist" containerID="bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a" Mar 20 08:58:41 crc kubenswrapper[4971]: I0320 08:58:41.857635 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a"} err="failed to get container status \"bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a\": rpc error: code = NotFound desc = could not find container \"bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a\": container with ID starting with bd95892b113df7a9165f0379707e216c41cb86d938f4c421d789927ccad60c6a not found: ID does not exist" Mar 20 08:58:42 crc kubenswrapper[4971]: I0320 08:58:42.054852 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 08:58:42 crc kubenswrapper[4971]: I0320 08:58:42.743327 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3a3b9f-f195-436e-9097-d085880e71f8" path="/var/lib/kubelet/pods/9f3a3b9f-f195-436e-9097-d085880e71f8/volumes" Mar 20 08:58:42 crc kubenswrapper[4971]: I0320 08:58:42.832141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"903b0df1-cef0-4077-9033-c534831e6ca8","Type":"ContainerStarted","Data":"9684678776bf807a76b021536d81fdc14cd66d2dcd0d5e2584c21a1829e5b58d"} Mar 20 08:58:44 crc kubenswrapper[4971]: I0320 08:58:44.732653 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 08:58:44 crc kubenswrapper[4971]: E0320 08:58:44.733644 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:58:46 crc kubenswrapper[4971]: I0320 08:58:46.884152 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3c261d6c-ad35-4558-a676-58606d0c78a5","Type":"ContainerStarted","Data":"a7ab62e06ffe062911889b98ac049b2cd6862261bd671d2756755a82ed8c70b5"} Mar 20 08:58:47 crc kubenswrapper[4971]: I0320 08:58:47.896867 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"903b0df1-cef0-4077-9033-c534831e6ca8","Type":"ContainerStarted","Data":"4fa529e89e3438d2d567775eea432b78115c59d428ac63c065bc1b64bc961dc1"} Mar 20 08:58:49 crc kubenswrapper[4971]: I0320 08:58:49.660043 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 08:58:53 crc kubenswrapper[4971]: I0320 08:58:53.980891 4971 generic.go:334] "Generic (PLEG): container finished" podID="903b0df1-cef0-4077-9033-c534831e6ca8" containerID="4fa529e89e3438d2d567775eea432b78115c59d428ac63c065bc1b64bc961dc1" exitCode=0 Mar 20 08:58:53 crc kubenswrapper[4971]: I0320 08:58:53.981028 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"903b0df1-cef0-4077-9033-c534831e6ca8","Type":"ContainerDied","Data":"4fa529e89e3438d2d567775eea432b78115c59d428ac63c065bc1b64bc961dc1"} Mar 20 08:58:53 crc kubenswrapper[4971]: I0320 08:58:53.983932 4971 generic.go:334] "Generic (PLEG): container finished" podID="3c261d6c-ad35-4558-a676-58606d0c78a5" containerID="a7ab62e06ffe062911889b98ac049b2cd6862261bd671d2756755a82ed8c70b5" exitCode=0 Mar 20 08:58:53 crc kubenswrapper[4971]: I0320 08:58:53.983970 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3c261d6c-ad35-4558-a676-58606d0c78a5","Type":"ContainerDied","Data":"a7ab62e06ffe062911889b98ac049b2cd6862261bd671d2756755a82ed8c70b5"} Mar 20 08:58:56 crc kubenswrapper[4971]: I0320 08:58:56.732727 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 08:58:56 crc kubenswrapper[4971]: E0320 08:58:56.733350 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:58:58 crc kubenswrapper[4971]: I0320 08:58:58.033366 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"3c261d6c-ad35-4558-a676-58606d0c78a5","Type":"ContainerStarted","Data":"10deaca2a9a5d16865465bb8cdf57ce593aac675b4ef1177ebbf74a0a9fe11b4"} Mar 20 08:59:01 crc kubenswrapper[4971]: I0320 08:59:01.065361 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"3c261d6c-ad35-4558-a676-58606d0c78a5","Type":"ContainerStarted","Data":"7cf3d5094e3c5972ddc2a1df3e96a9f378d9f10aeb53a450367a41ebce99b126"} Mar 20 08:59:01 crc kubenswrapper[4971]: I0320 08:59:01.065847 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 20 08:59:01 crc kubenswrapper[4971]: I0320 08:59:01.070008 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 20 08:59:01 crc kubenswrapper[4971]: I0320 08:59:01.095055 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=4.968821206 podStartE2EDuration="21.095025358s" podCreationTimestamp="2026-03-20 08:58:40 +0000 UTC" firstStartedPulling="2026-03-20 08:58:41.461301045 +0000 UTC m=+7743.441175183" lastFinishedPulling="2026-03-20 08:58:57.587505197 +0000 UTC m=+7759.567379335" observedRunningTime="2026-03-20 08:59:01.08634118 +0000 UTC m=+7763.066215348" watchObservedRunningTime="2026-03-20 08:59:01.095025358 +0000 UTC m=+7763.074899506" Mar 20 08:59:02 crc kubenswrapper[4971]: I0320 08:59:02.077867 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"903b0df1-cef0-4077-9033-c534831e6ca8","Type":"ContainerStarted","Data":"cd5de6518e92c7ab76709e5d9b28d4f4708533dd0fd107f4ddc0d3ba3f41dc1c"} Mar 20 08:59:03 crc kubenswrapper[4971]: I0320 08:59:03.055798 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tkdkl"] Mar 20 08:59:03 crc kubenswrapper[4971]: I0320 08:59:03.073417 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tkdkl"] Mar 20 08:59:04 crc kubenswrapper[4971]: I0320 08:59:04.746235 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0eb761-10ea-4b19-8b9b-deaa635a6393" 
path="/var/lib/kubelet/pods/7c0eb761-10ea-4b19-8b9b-deaa635a6393/volumes" Mar 20 08:59:05 crc kubenswrapper[4971]: I0320 08:59:05.106998 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"903b0df1-cef0-4077-9033-c534831e6ca8","Type":"ContainerStarted","Data":"9332d3909741afdaa0f3d22760fd5d16f0a9e033a2a6f7a25dec21016f4b6b59"} Mar 20 08:59:08 crc kubenswrapper[4971]: I0320 08:59:08.146217 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"903b0df1-cef0-4077-9033-c534831e6ca8","Type":"ContainerStarted","Data":"88e7fd583c608aa9dead3866a73de3e2b7be7d82d499e08dfecc41a036e25404"} Mar 20 08:59:08 crc kubenswrapper[4971]: I0320 08:59:08.197146 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.054155264 podStartE2EDuration="28.197127629s" podCreationTimestamp="2026-03-20 08:58:40 +0000 UTC" firstStartedPulling="2026-03-20 08:58:42.062592622 +0000 UTC m=+7744.042466760" lastFinishedPulling="2026-03-20 08:59:07.205564987 +0000 UTC m=+7769.185439125" observedRunningTime="2026-03-20 08:59:08.192997101 +0000 UTC m=+7770.172871239" watchObservedRunningTime="2026-03-20 08:59:08.197127629 +0000 UTC m=+7770.177001777" Mar 20 08:59:10 crc kubenswrapper[4971]: I0320 08:59:10.733145 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 08:59:10 crc kubenswrapper[4971]: E0320 08:59:10.733872 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" 
Mar 20 08:59:10 crc kubenswrapper[4971]: I0320 08:59:10.820048 4971 scope.go:117] "RemoveContainer" containerID="b214be5f31acbfdbe0402aee158630139f194b3e9b9063303293c20c51e092ce" Mar 20 08:59:10 crc kubenswrapper[4971]: I0320 08:59:10.853511 4971 scope.go:117] "RemoveContainer" containerID="163d8ccb757a1a02fc718a5f47fb20513fc37cbca66d847cba546e368b38588d" Mar 20 08:59:10 crc kubenswrapper[4971]: I0320 08:59:10.941199 4971 scope.go:117] "RemoveContainer" containerID="5d7d1a3cd6061f415d29124b2aff214ee5133e735f9d5cbfcaa020ed266f2e96" Mar 20 08:59:11 crc kubenswrapper[4971]: I0320 08:59:11.506186 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 20 08:59:11 crc kubenswrapper[4971]: I0320 08:59:11.506460 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 20 08:59:11 crc kubenswrapper[4971]: I0320 08:59:11.512377 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.182392 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.349072 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.357161 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.361106 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.361176 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.364958 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.446375 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.446450 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-log-httpd\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.446695 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.446754 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtnmx\" (UniqueName: \"kubernetes.io/projected/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-kube-api-access-vtnmx\") pod \"ceilometer-0\" (UID: 
\"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.446808 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-run-httpd\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.446946 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-config-data\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.447068 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-scripts\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.549034 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.549103 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-log-httpd\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.549156 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.549178 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtnmx\" (UniqueName: \"kubernetes.io/projected/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-kube-api-access-vtnmx\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.549195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-run-httpd\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.549234 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-config-data\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.549272 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-scripts\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.549850 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-log-httpd\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 
crc kubenswrapper[4971]: I0320 08:59:12.550034 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-run-httpd\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.556842 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-scripts\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.559096 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.563678 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.569383 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-config-data\") pod \"ceilometer-0\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.575804 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtnmx\" (UniqueName: \"kubernetes.io/projected/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-kube-api-access-vtnmx\") pod \"ceilometer-0\" (UID: 
\"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") " pod="openstack/ceilometer-0" Mar 20 08:59:12 crc kubenswrapper[4971]: I0320 08:59:12.678193 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:59:13 crc kubenswrapper[4971]: I0320 08:59:13.207632 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:59:14 crc kubenswrapper[4971]: I0320 08:59:14.197682 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerStarted","Data":"b26ee1c2d661a5fd8cdcc112d242a72add5814afc7c06481a8034b6dea90083f"} Mar 20 08:59:18 crc kubenswrapper[4971]: I0320 08:59:18.237025 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerStarted","Data":"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3"} Mar 20 08:59:19 crc kubenswrapper[4971]: I0320 08:59:19.247176 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerStarted","Data":"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c"} Mar 20 08:59:20 crc kubenswrapper[4971]: I0320 08:59:20.262254 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerStarted","Data":"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231"} Mar 20 08:59:21 crc kubenswrapper[4971]: I0320 08:59:21.732462 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 08:59:21 crc kubenswrapper[4971]: E0320 08:59:21.733030 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:59:22 crc kubenswrapper[4971]: I0320 08:59:22.284312 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerStarted","Data":"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9"} Mar 20 08:59:22 crc kubenswrapper[4971]: I0320 08:59:22.284708 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.599341 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.449893348 podStartE2EDuration="17.5993204s" podCreationTimestamp="2026-03-20 08:59:12 +0000 UTC" firstStartedPulling="2026-03-20 08:59:13.223225994 +0000 UTC m=+7775.203100132" lastFinishedPulling="2026-03-20 08:59:21.372653026 +0000 UTC m=+7783.352527184" observedRunningTime="2026-03-20 08:59:22.305644676 +0000 UTC m=+7784.285518814" watchObservedRunningTime="2026-03-20 08:59:29.5993204 +0000 UTC m=+7791.579194548" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.601599 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-z54jl"] Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.603625 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.613283 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-z54jl"] Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.628962 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-operator-scripts\") pod \"aodh-db-create-z54jl\" (UID: \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\") " pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.629325 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7pgt\" (UniqueName: \"kubernetes.io/projected/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-kube-api-access-d7pgt\") pod \"aodh-db-create-z54jl\" (UID: \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\") " pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.696054 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-8e23-account-create-update-kw8vm"] Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.697675 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.699927 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.709074 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-8e23-account-create-update-kw8vm"] Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.731139 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7pgt\" (UniqueName: \"kubernetes.io/projected/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-kube-api-access-d7pgt\") pod \"aodh-db-create-z54jl\" (UID: \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\") " pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.731208 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-operator-scripts\") pod \"aodh-db-create-z54jl\" (UID: \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\") " pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.731244 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrmzg\" (UniqueName: \"kubernetes.io/projected/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-kube-api-access-vrmzg\") pod \"aodh-8e23-account-create-update-kw8vm\" (UID: \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\") " pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.731348 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-operator-scripts\") pod \"aodh-8e23-account-create-update-kw8vm\" (UID: \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\") " 
pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.732201 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-operator-scripts\") pod \"aodh-db-create-z54jl\" (UID: \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\") " pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.749823 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7pgt\" (UniqueName: \"kubernetes.io/projected/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-kube-api-access-d7pgt\") pod \"aodh-db-create-z54jl\" (UID: \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\") " pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.833679 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-operator-scripts\") pod \"aodh-8e23-account-create-update-kw8vm\" (UID: \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\") " pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.833816 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrmzg\" (UniqueName: \"kubernetes.io/projected/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-kube-api-access-vrmzg\") pod \"aodh-8e23-account-create-update-kw8vm\" (UID: \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\") " pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.834706 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-operator-scripts\") pod \"aodh-8e23-account-create-update-kw8vm\" (UID: \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\") " 
pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.861523 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrmzg\" (UniqueName: \"kubernetes.io/projected/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-kube-api-access-vrmzg\") pod \"aodh-8e23-account-create-update-kw8vm\" (UID: \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\") " pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:29 crc kubenswrapper[4971]: I0320 08:59:29.978653 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:30 crc kubenswrapper[4971]: I0320 08:59:30.017128 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:30 crc kubenswrapper[4971]: I0320 08:59:30.645333 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-8e23-account-create-update-kw8vm"] Mar 20 08:59:30 crc kubenswrapper[4971]: I0320 08:59:30.664322 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-z54jl"] Mar 20 08:59:31 crc kubenswrapper[4971]: I0320 08:59:31.369729 4971 generic.go:334] "Generic (PLEG): container finished" podID="006fbf05-668e-4eef-97a3-d24b1ca7dbd0" containerID="8ea134c20974b4052f9194d6e61c23af291cf3d95c59ba0c9174f1c9ffde82e3" exitCode=0 Mar 20 08:59:31 crc kubenswrapper[4971]: I0320 08:59:31.370150 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z54jl" event={"ID":"006fbf05-668e-4eef-97a3-d24b1ca7dbd0","Type":"ContainerDied","Data":"8ea134c20974b4052f9194d6e61c23af291cf3d95c59ba0c9174f1c9ffde82e3"} Mar 20 08:59:31 crc kubenswrapper[4971]: I0320 08:59:31.370185 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z54jl" 
event={"ID":"006fbf05-668e-4eef-97a3-d24b1ca7dbd0","Type":"ContainerStarted","Data":"f6a3aa9e7e0743485753ea4e6c748051168ce3faa03fe112a065aae69cc960dc"} Mar 20 08:59:31 crc kubenswrapper[4971]: I0320 08:59:31.372411 4971 generic.go:334] "Generic (PLEG): container finished" podID="2e91c8be-1a90-4da8-a072-a4aaf36b2a89" containerID="b318a46490086f365352264c975e137f0bd04500a5e27c5284457adc42224eea" exitCode=0 Mar 20 08:59:31 crc kubenswrapper[4971]: I0320 08:59:31.372424 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8e23-account-create-update-kw8vm" event={"ID":"2e91c8be-1a90-4da8-a072-a4aaf36b2a89","Type":"ContainerDied","Data":"b318a46490086f365352264c975e137f0bd04500a5e27c5284457adc42224eea"} Mar 20 08:59:31 crc kubenswrapper[4971]: I0320 08:59:31.372509 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8e23-account-create-update-kw8vm" event={"ID":"2e91c8be-1a90-4da8-a072-a4aaf36b2a89","Type":"ContainerStarted","Data":"25119bf35d6c6cc3f8b05d2cc5772d7314a0d8a706fa89e3011a45f64ae03ae4"} Mar 20 08:59:32 crc kubenswrapper[4971]: I0320 08:59:32.944743 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:32 crc kubenswrapper[4971]: I0320 08:59:32.949957 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.122294 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-operator-scripts\") pod \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\" (UID: \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\") " Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.122431 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7pgt\" (UniqueName: \"kubernetes.io/projected/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-kube-api-access-d7pgt\") pod \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\" (UID: \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\") " Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.122456 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrmzg\" (UniqueName: \"kubernetes.io/projected/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-kube-api-access-vrmzg\") pod \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\" (UID: \"2e91c8be-1a90-4da8-a072-a4aaf36b2a89\") " Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.122644 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-operator-scripts\") pod \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\" (UID: \"006fbf05-668e-4eef-97a3-d24b1ca7dbd0\") " Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.123413 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "006fbf05-668e-4eef-97a3-d24b1ca7dbd0" (UID: "006fbf05-668e-4eef-97a3-d24b1ca7dbd0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.123459 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e91c8be-1a90-4da8-a072-a4aaf36b2a89" (UID: "2e91c8be-1a90-4da8-a072-a4aaf36b2a89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.128198 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-kube-api-access-d7pgt" (OuterVolumeSpecName: "kube-api-access-d7pgt") pod "006fbf05-668e-4eef-97a3-d24b1ca7dbd0" (UID: "006fbf05-668e-4eef-97a3-d24b1ca7dbd0"). InnerVolumeSpecName "kube-api-access-d7pgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.128968 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-kube-api-access-vrmzg" (OuterVolumeSpecName: "kube-api-access-vrmzg") pod "2e91c8be-1a90-4da8-a072-a4aaf36b2a89" (UID: "2e91c8be-1a90-4da8-a072-a4aaf36b2a89"). InnerVolumeSpecName "kube-api-access-vrmzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.225256 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.225312 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.225332 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7pgt\" (UniqueName: \"kubernetes.io/projected/006fbf05-668e-4eef-97a3-d24b1ca7dbd0-kube-api-access-d7pgt\") on node \"crc\" DevicePath \"\"" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.225352 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrmzg\" (UniqueName: \"kubernetes.io/projected/2e91c8be-1a90-4da8-a072-a4aaf36b2a89-kube-api-access-vrmzg\") on node \"crc\" DevicePath \"\"" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.397593 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-z54jl" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.397585 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z54jl" event={"ID":"006fbf05-668e-4eef-97a3-d24b1ca7dbd0","Type":"ContainerDied","Data":"f6a3aa9e7e0743485753ea4e6c748051168ce3faa03fe112a065aae69cc960dc"} Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.398308 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6a3aa9e7e0743485753ea4e6c748051168ce3faa03fe112a065aae69cc960dc" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.399562 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8e23-account-create-update-kw8vm" event={"ID":"2e91c8be-1a90-4da8-a072-a4aaf36b2a89","Type":"ContainerDied","Data":"25119bf35d6c6cc3f8b05d2cc5772d7314a0d8a706fa89e3011a45f64ae03ae4"} Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.399639 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25119bf35d6c6cc3f8b05d2cc5772d7314a0d8a706fa89e3011a45f64ae03ae4" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.399639 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-8e23-account-create-update-kw8vm" Mar 20 08:59:33 crc kubenswrapper[4971]: I0320 08:59:33.732703 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 08:59:33 crc kubenswrapper[4971]: E0320 08:59:33.733155 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:59:34 crc kubenswrapper[4971]: I0320 08:59:34.043189 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2hbqb"] Mar 20 08:59:34 crc kubenswrapper[4971]: I0320 08:59:34.054247 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b860-account-create-update-khxtz"] Mar 20 08:59:34 crc kubenswrapper[4971]: I0320 08:59:34.063711 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2hbqb"] Mar 20 08:59:34 crc kubenswrapper[4971]: I0320 08:59:34.071434 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b860-account-create-update-khxtz"] Mar 20 08:59:34 crc kubenswrapper[4971]: I0320 08:59:34.747066 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6599a56b-bc8f-42e6-8f6e-359727614c6e" path="/var/lib/kubelet/pods/6599a56b-bc8f-42e6-8f6e-359727614c6e/volumes" Mar 20 08:59:34 crc kubenswrapper[4971]: I0320 08:59:34.748259 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12" path="/var/lib/kubelet/pods/a650993b-e4f4-4d7c-9d90-e9ce7e0ddd12/volumes" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.004645 4971 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-zxlrh"] Mar 20 08:59:35 crc kubenswrapper[4971]: E0320 08:59:35.005033 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e91c8be-1a90-4da8-a072-a4aaf36b2a89" containerName="mariadb-account-create-update" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.005050 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e91c8be-1a90-4da8-a072-a4aaf36b2a89" containerName="mariadb-account-create-update" Mar 20 08:59:35 crc kubenswrapper[4971]: E0320 08:59:35.005061 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006fbf05-668e-4eef-97a3-d24b1ca7dbd0" containerName="mariadb-database-create" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.005067 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="006fbf05-668e-4eef-97a3-d24b1ca7dbd0" containerName="mariadb-database-create" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.007198 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="006fbf05-668e-4eef-97a3-d24b1ca7dbd0" containerName="mariadb-database-create" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.007218 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e91c8be-1a90-4da8-a072-a4aaf36b2a89" containerName="mariadb-account-create-update" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.007968 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.010991 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.011329 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fnmzn" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.013708 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.020685 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.026247 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-zxlrh"] Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.070888 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-scripts\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.071002 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrj66\" (UniqueName: \"kubernetes.io/projected/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-kube-api-access-qrj66\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.071046 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-combined-ca-bundle\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " 
pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.071078 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-config-data\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.171672 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-scripts\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.171806 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrj66\" (UniqueName: \"kubernetes.io/projected/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-kube-api-access-qrj66\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.171863 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-combined-ca-bundle\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.171900 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-config-data\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.177369 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-combined-ca-bundle\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.180044 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-scripts\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.186757 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-config-data\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.190863 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrj66\" (UniqueName: \"kubernetes.io/projected/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-kube-api-access-qrj66\") pod \"aodh-db-sync-zxlrh\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") " pod="openstack/aodh-db-sync-zxlrh" Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.332320 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zxlrh"
Mar 20 08:59:35 crc kubenswrapper[4971]: I0320 08:59:35.800175 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-zxlrh"]
Mar 20 08:59:36 crc kubenswrapper[4971]: I0320 08:59:36.434179 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zxlrh" event={"ID":"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e","Type":"ContainerStarted","Data":"ee479bb86d8424f4f25f1d3eb9ff5f4e1a63674944f61e6ed7463827b3acea66"}
Mar 20 08:59:40 crc kubenswrapper[4971]: I0320 08:59:40.683268 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 08:59:41 crc kubenswrapper[4971]: I0320 08:59:41.530369 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zxlrh" event={"ID":"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e","Type":"ContainerStarted","Data":"d800c2b63fef95189adca9cf7bf09d72aa824438506d2bbf8595742cfe0c3fc5"}
Mar 20 08:59:41 crc kubenswrapper[4971]: I0320 08:59:41.548278 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-zxlrh" podStartSLOduration=2.666352962 podStartE2EDuration="7.548254163s" podCreationTimestamp="2026-03-20 08:59:34 +0000 UTC" firstStartedPulling="2026-03-20 08:59:35.798547219 +0000 UTC m=+7797.778421357" lastFinishedPulling="2026-03-20 08:59:40.6804484 +0000 UTC m=+7802.660322558" observedRunningTime="2026-03-20 08:59:41.546775644 +0000 UTC m=+7803.526649792" watchObservedRunningTime="2026-03-20 08:59:41.548254163 +0000 UTC m=+7803.528128341"
Mar 20 08:59:42 crc kubenswrapper[4971]: I0320 08:59:42.685990 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 08:59:43 crc kubenswrapper[4971]: I0320 08:59:43.555301 4971 generic.go:334] "Generic (PLEG): container finished" podID="f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" containerID="d800c2b63fef95189adca9cf7bf09d72aa824438506d2bbf8595742cfe0c3fc5" exitCode=0
Mar 20 08:59:43 crc kubenswrapper[4971]: I0320 08:59:43.555694 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zxlrh" event={"ID":"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e","Type":"ContainerDied","Data":"d800c2b63fef95189adca9cf7bf09d72aa824438506d2bbf8595742cfe0c3fc5"}
Mar 20 08:59:44 crc kubenswrapper[4971]: I0320 08:59:44.039214 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-df4kp"]
Mar 20 08:59:44 crc kubenswrapper[4971]: I0320 08:59:44.049636 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-df4kp"]
Mar 20 08:59:44 crc kubenswrapper[4971]: I0320 08:59:44.733321 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395"
Mar 20 08:59:44 crc kubenswrapper[4971]: E0320 08:59:44.733907 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 08:59:44 crc kubenswrapper[4971]: I0320 08:59:44.747645 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f73e65-8c8e-4358-a2fc-7ad8e1583148" path="/var/lib/kubelet/pods/73f73e65-8c8e-4358-a2fc-7ad8e1583148/volumes"
Mar 20 08:59:44 crc kubenswrapper[4971]: I0320 08:59:44.963338 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-zxlrh"
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.068376 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrj66\" (UniqueName: \"kubernetes.io/projected/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-kube-api-access-qrj66\") pod \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") "
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.068722 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-scripts\") pod \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") "
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.068751 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-config-data\") pod \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") "
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.068862 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-combined-ca-bundle\") pod \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\" (UID: \"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e\") "
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.074234 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-scripts" (OuterVolumeSpecName: "scripts") pod "f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" (UID: "f76e60cb-33d8-4c46-9fec-c6e306e2aa0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.075296 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-kube-api-access-qrj66" (OuterVolumeSpecName: "kube-api-access-qrj66") pod "f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" (UID: "f76e60cb-33d8-4c46-9fec-c6e306e2aa0e"). InnerVolumeSpecName "kube-api-access-qrj66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.097938 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" (UID: "f76e60cb-33d8-4c46-9fec-c6e306e2aa0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.106499 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-config-data" (OuterVolumeSpecName: "config-data") pod "f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" (UID: "f76e60cb-33d8-4c46-9fec-c6e306e2aa0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.179577 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.179756 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.179858 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.179948 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrj66\" (UniqueName: \"kubernetes.io/projected/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e-kube-api-access-qrj66\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.572728 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zxlrh" event={"ID":"f76e60cb-33d8-4c46-9fec-c6e306e2aa0e","Type":"ContainerDied","Data":"ee479bb86d8424f4f25f1d3eb9ff5f4e1a63674944f61e6ed7463827b3acea66"}
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.572767 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee479bb86d8424f4f25f1d3eb9ff5f4e1a63674944f61e6ed7463827b3acea66"
Mar 20 08:59:45 crc kubenswrapper[4971]: I0320 08:59:45.572841 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-zxlrh"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.121872 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 20 08:59:50 crc kubenswrapper[4971]: E0320 08:59:50.122806 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" containerName="aodh-db-sync"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.122820 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" containerName="aodh-db-sync"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.122993 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" containerName="aodh-db-sync"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.124988 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.131482 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fnmzn"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.131911 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.135505 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.152351 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.271002 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-scripts\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.271343 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.271404 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm5rd\" (UniqueName: \"kubernetes.io/projected/05f6ce5b-1f18-46c8-92f8-984e7d98872e-kube-api-access-bm5rd\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.271512 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-config-data\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.373505 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-scripts\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.373567 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.373637 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm5rd\" (UniqueName: \"kubernetes.io/projected/05f6ce5b-1f18-46c8-92f8-984e7d98872e-kube-api-access-bm5rd\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.373714 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-config-data\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.380302 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-scripts\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.380346 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.380672 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f6ce5b-1f18-46c8-92f8-984e7d98872e-config-data\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.389890 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm5rd\" (UniqueName: \"kubernetes.io/projected/05f6ce5b-1f18-46c8-92f8-984e7d98872e-kube-api-access-bm5rd\") pod \"aodh-0\" (UID: \"05f6ce5b-1f18-46c8-92f8-984e7d98872e\") " pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.445591 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 20 08:59:50 crc kubenswrapper[4971]: I0320 08:59:50.909250 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 20 08:59:51 crc kubenswrapper[4971]: I0320 08:59:51.630095 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"05f6ce5b-1f18-46c8-92f8-984e7d98872e","Type":"ContainerStarted","Data":"f7157652414cc084a5668cb68bdd0606e88d9b1518decc39614631db1c220065"}
Mar 20 08:59:51 crc kubenswrapper[4971]: I0320 08:59:51.630411 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"05f6ce5b-1f18-46c8-92f8-984e7d98872e","Type":"ContainerStarted","Data":"77058d481290028a6ed2a03a54f0434145bca76c65ca1703a790b31275f59cf8"}
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.286717 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.287190 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="ceilometer-central-agent" containerID="cri-o://b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3" gracePeriod=30
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.287365 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="proxy-httpd" containerID="cri-o://5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9" gracePeriod=30
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.287411 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="ceilometer-notification-agent" containerID="cri-o://2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c" gracePeriod=30
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.287485 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="sg-core" containerID="cri-o://c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231" gracePeriod=30
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.647160 4971 generic.go:334] "Generic (PLEG): container finished" podID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerID="5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9" exitCode=0
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.648290 4971 generic.go:334] "Generic (PLEG): container finished" podID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerID="c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231" exitCode=2
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.647251 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerDied","Data":"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9"}
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.648397 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerDied","Data":"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231"}
Mar 20 08:59:52 crc kubenswrapper[4971]: I0320 08:59:52.650227 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"05f6ce5b-1f18-46c8-92f8-984e7d98872e","Type":"ContainerStarted","Data":"52f1ecab15d7f45d6ad2e0cea8556cdb9b529fcbc75b7da1f773a1e8a0e9343d"}
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.385648 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.546258 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-config-data\") pod \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") "
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.546352 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-scripts\") pod \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") "
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.546389 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-sg-core-conf-yaml\") pod \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") "
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.546450 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-run-httpd\") pod \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") "
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.546627 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-log-httpd\") pod \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") "
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.546654 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtnmx\" (UniqueName: \"kubernetes.io/projected/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-kube-api-access-vtnmx\") pod \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") "
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.546680 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-combined-ca-bundle\") pod \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\" (UID: \"c03996bc-d785-4fa5-8bab-5aff5f6b6d30\") "
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.547373 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c03996bc-d785-4fa5-8bab-5aff5f6b6d30" (UID: "c03996bc-d785-4fa5-8bab-5aff5f6b6d30"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.547506 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c03996bc-d785-4fa5-8bab-5aff5f6b6d30" (UID: "c03996bc-d785-4fa5-8bab-5aff5f6b6d30"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.555804 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-kube-api-access-vtnmx" (OuterVolumeSpecName: "kube-api-access-vtnmx") pod "c03996bc-d785-4fa5-8bab-5aff5f6b6d30" (UID: "c03996bc-d785-4fa5-8bab-5aff5f6b6d30"). InnerVolumeSpecName "kube-api-access-vtnmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.602350 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-scripts" (OuterVolumeSpecName: "scripts") pod "c03996bc-d785-4fa5-8bab-5aff5f6b6d30" (UID: "c03996bc-d785-4fa5-8bab-5aff5f6b6d30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.636886 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c03996bc-d785-4fa5-8bab-5aff5f6b6d30" (UID: "c03996bc-d785-4fa5-8bab-5aff5f6b6d30"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.649627 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.649667 4971 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.649678 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.649709 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.649718 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtnmx\" (UniqueName: \"kubernetes.io/projected/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-kube-api-access-vtnmx\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.667733 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c03996bc-d785-4fa5-8bab-5aff5f6b6d30" (UID: "c03996bc-d785-4fa5-8bab-5aff5f6b6d30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.675713 4971 generic.go:334] "Generic (PLEG): container finished" podID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerID="2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c" exitCode=0
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.675743 4971 generic.go:334] "Generic (PLEG): container finished" podID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerID="b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3" exitCode=0
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.675764 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerDied","Data":"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c"}
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.675791 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerDied","Data":"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3"}
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.675801 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c03996bc-d785-4fa5-8bab-5aff5f6b6d30","Type":"ContainerDied","Data":"b26ee1c2d661a5fd8cdcc112d242a72add5814afc7c06481a8034b6dea90083f"}
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.675816 4971 scope.go:117] "RemoveContainer" containerID="5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.675828 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.702702 4971 scope.go:117] "RemoveContainer" containerID="c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.722020 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-config-data" (OuterVolumeSpecName: "config-data") pod "c03996bc-d785-4fa5-8bab-5aff5f6b6d30" (UID: "c03996bc-d785-4fa5-8bab-5aff5f6b6d30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.722367 4971 scope.go:117] "RemoveContainer" containerID="2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.752513 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.752561 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03996bc-d785-4fa5-8bab-5aff5f6b6d30-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.759362 4971 scope.go:117] "RemoveContainer" containerID="b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.781288 4971 scope.go:117] "RemoveContainer" containerID="5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9"
Mar 20 08:59:53 crc kubenswrapper[4971]: E0320 08:59:53.781748 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9\": container with ID starting with 5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9 not found: ID does not exist" containerID="5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.781780 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9"} err="failed to get container status \"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9\": rpc error: code = NotFound desc = could not find container \"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9\": container with ID starting with 5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9 not found: ID does not exist"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.781810 4971 scope.go:117] "RemoveContainer" containerID="c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231"
Mar 20 08:59:53 crc kubenswrapper[4971]: E0320 08:59:53.783304 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231\": container with ID starting with c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231 not found: ID does not exist" containerID="c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.783359 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231"} err="failed to get container status \"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231\": rpc error: code = NotFound desc = could not find container \"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231\": container with ID starting with c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231 not found: ID does not exist"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.783380 4971 scope.go:117] "RemoveContainer" containerID="2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c"
Mar 20 08:59:53 crc kubenswrapper[4971]: E0320 08:59:53.783844 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c\": container with ID starting with 2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c not found: ID does not exist" containerID="2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.783866 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c"} err="failed to get container status \"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c\": rpc error: code = NotFound desc = could not find container \"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c\": container with ID starting with 2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c not found: ID does not exist"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.783882 4971 scope.go:117] "RemoveContainer" containerID="b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3"
Mar 20 08:59:53 crc kubenswrapper[4971]: E0320 08:59:53.784268 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3\": container with ID starting with b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3 not found: ID does not exist" containerID="b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.784319 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3"} err="failed to get container status \"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3\": rpc error: code = NotFound desc = could not find container \"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3\": container with ID starting with b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3 not found: ID does not exist"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.784341 4971 scope.go:117] "RemoveContainer" containerID="5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.784765 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9"} err="failed to get container status \"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9\": rpc error: code = NotFound desc = could not find container \"5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9\": container with ID starting with 5392e5b781468bc0ece268171017f67c76e2d425c1864f778bb72bf947c087b9 not found: ID does not exist"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.784788 4971 scope.go:117] "RemoveContainer" containerID="c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.788118 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231"} err="failed to get container status \"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231\": rpc error: code = NotFound desc = could not find container \"c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231\": container with ID starting with c2233374833a74fd5f9c830dbd156f93173676e6ced8b4e6640f346eec06a231 not found: ID does not exist"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.788174 4971 scope.go:117] "RemoveContainer" containerID="2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.788514 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c"} err="failed to get container status \"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c\": rpc error: code = NotFound desc = could not find container \"2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c\": container with ID starting with 2eb15b132720008af8d10569b55fa48ed92a487ba27a2d994e8b235703f4097c not found: ID does not exist"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.788551 4971 scope.go:117] "RemoveContainer" containerID="b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3"
Mar 20 08:59:53 crc kubenswrapper[4971]: I0320 08:59:53.788833 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3"} err="failed to get container status \"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3\": rpc error: code = NotFound desc = could not find container \"b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3\": container with ID starting with b49e78060c9e9b034b4ba17112eda6821eb91806e8b3d3e4c6b529d455bfe3d3 not found: ID does not exist"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.021169 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.032205 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072262 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 08:59:54 crc kubenswrapper[4971]: E0320 08:59:54.072667 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="ceilometer-central-agent"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072684 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="ceilometer-central-agent"
Mar 20 08:59:54 crc kubenswrapper[4971]: E0320 08:59:54.072698 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="sg-core"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072705 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="sg-core"
Mar 20 08:59:54 crc kubenswrapper[4971]: E0320 08:59:54.072720 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="proxy-httpd"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072727 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="proxy-httpd"
Mar 20 08:59:54 crc kubenswrapper[4971]: E0320 08:59:54.072752 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="ceilometer-notification-agent"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072759 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="ceilometer-notification-agent"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072940 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="ceilometer-central-agent"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072962 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="sg-core"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072981 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="ceilometer-notification-agent"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.072992 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" containerName="proxy-httpd"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.074681 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.076220 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.076763 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.084928 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.160355 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-run-httpd\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.160402 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.160449 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-config-data\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0"
Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.160665 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95mjl\" (UniqueName: \"kubernetes.io/projected/6ae5f048-0707-4363-80c0-10c41fa13e20-kube-api-access-95mjl\") pod \"ceilometer-0\" (UID:
\"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.160795 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.160950 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-scripts\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.160997 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-log-httpd\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.262632 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-config-data\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.262703 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95mjl\" (UniqueName: \"kubernetes.io/projected/6ae5f048-0707-4363-80c0-10c41fa13e20-kube-api-access-95mjl\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.262731 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.262804 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-scripts\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.262830 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-log-httpd\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.262866 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-run-httpd\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.262884 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.264330 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-log-httpd\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " 
pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.264423 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-run-httpd\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.267365 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.267556 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.270837 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-config-data\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.271017 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-scripts\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.281310 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95mjl\" (UniqueName: 
\"kubernetes.io/projected/6ae5f048-0707-4363-80c0-10c41fa13e20-kube-api-access-95mjl\") pod \"ceilometer-0\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.430820 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.688536 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"05f6ce5b-1f18-46c8-92f8-984e7d98872e","Type":"ContainerStarted","Data":"72e9e8076e3e03385a086c4988123b4492cea5f5316de91d3214d628a65e4eaa"} Mar 20 08:59:54 crc kubenswrapper[4971]: I0320 08:59:54.746652 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03996bc-d785-4fa5-8bab-5aff5f6b6d30" path="/var/lib/kubelet/pods/c03996bc-d785-4fa5-8bab-5aff5f6b6d30/volumes" Mar 20 08:59:55 crc kubenswrapper[4971]: I0320 08:59:55.282990 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:59:55 crc kubenswrapper[4971]: W0320 08:59:55.292254 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ae5f048_0707_4363_80c0_10c41fa13e20.slice/crio-53e6aaf1b22f76a2147371350f2ed1c9180be5622f19aaa8a2865a9cd1351a00 WatchSource:0}: Error finding container 53e6aaf1b22f76a2147371350f2ed1c9180be5622f19aaa8a2865a9cd1351a00: Status 404 returned error can't find the container with id 53e6aaf1b22f76a2147371350f2ed1c9180be5622f19aaa8a2865a9cd1351a00 Mar 20 08:59:55 crc kubenswrapper[4971]: I0320 08:59:55.698054 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerStarted","Data":"53e6aaf1b22f76a2147371350f2ed1c9180be5622f19aaa8a2865a9cd1351a00"} Mar 20 08:59:55 crc kubenswrapper[4971]: I0320 08:59:55.700168 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/aodh-0" event={"ID":"05f6ce5b-1f18-46c8-92f8-984e7d98872e","Type":"ContainerStarted","Data":"293a5075fffd3ea85639a00802816a56bebfd6ae1ad2dd4cbc8a43649f04e198"} Mar 20 08:59:55 crc kubenswrapper[4971]: I0320 08:59:55.719668 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.765271121 podStartE2EDuration="5.719651796s" podCreationTimestamp="2026-03-20 08:59:50 +0000 UTC" firstStartedPulling="2026-03-20 08:59:50.908455425 +0000 UTC m=+7812.888329563" lastFinishedPulling="2026-03-20 08:59:54.8628361 +0000 UTC m=+7816.842710238" observedRunningTime="2026-03-20 08:59:55.715943369 +0000 UTC m=+7817.695817507" watchObservedRunningTime="2026-03-20 08:59:55.719651796 +0000 UTC m=+7817.699525944" Mar 20 08:59:55 crc kubenswrapper[4971]: I0320 08:59:55.731916 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 08:59:55 crc kubenswrapper[4971]: E0320 08:59:55.732170 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 08:59:56 crc kubenswrapper[4971]: I0320 08:59:56.712458 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerStarted","Data":"9d9888fd6db46c24244bbbdcbb415b99973b57972ed434602ee00aa19b354691"} Mar 20 08:59:56 crc kubenswrapper[4971]: I0320 08:59:56.713964 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerStarted","Data":"cc54b061a22acd13673dff0af81259bbe8c24d4dc03134c3c293558d07f94298"} Mar 20 08:59:58 crc kubenswrapper[4971]: I0320 08:59:58.751419 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerStarted","Data":"00beae852ea0b2ca00ed7a4a01f6db85872d234cbd171829482bc03365e30080"} Mar 20 08:59:59 crc kubenswrapper[4971]: I0320 08:59:59.763583 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerStarted","Data":"9237bc293bcbb4328b39f08efa97f18cd769c99f12285b9939290bc6a9bd7d5c"} Mar 20 08:59:59 crc kubenswrapper[4971]: I0320 08:59:59.764143 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:59:59 crc kubenswrapper[4971]: I0320 08:59:59.819175 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.863960691 podStartE2EDuration="5.81915476s" podCreationTimestamp="2026-03-20 08:59:54 +0000 UTC" firstStartedPulling="2026-03-20 08:59:55.295472043 +0000 UTC m=+7817.275346181" lastFinishedPulling="2026-03-20 08:59:59.250666072 +0000 UTC m=+7821.230540250" observedRunningTime="2026-03-20 08:59:59.806491919 +0000 UTC m=+7821.786366067" watchObservedRunningTime="2026-03-20 08:59:59.81915476 +0000 UTC m=+7821.799028898" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.144717 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566620-9x5j8"] Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.145972 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566620-9x5j8" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.152111 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.152478 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.152751 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.171834 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l"] Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.173644 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.177927 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.178331 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.188169 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566620-9x5j8"] Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.263799 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l"] Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.275479 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5j5\" (UniqueName: 
\"kubernetes.io/projected/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-kube-api-access-zs5j5\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.275636 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjbk2\" (UniqueName: \"kubernetes.io/projected/e8a4a91f-3933-4117-bf14-288067efbab3-kube-api-access-hjbk2\") pod \"auto-csr-approver-29566620-9x5j8\" (UID: \"e8a4a91f-3933-4117-bf14-288067efbab3\") " pod="openshift-infra/auto-csr-approver-29566620-9x5j8" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.275743 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-config-volume\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.275795 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-secret-volume\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.363414 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-fskqk"] Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.364848 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-fskqk" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.374645 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-fskqk"] Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.386162 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjbk2\" (UniqueName: \"kubernetes.io/projected/e8a4a91f-3933-4117-bf14-288067efbab3-kube-api-access-hjbk2\") pod \"auto-csr-approver-29566620-9x5j8\" (UID: \"e8a4a91f-3933-4117-bf14-288067efbab3\") " pod="openshift-infra/auto-csr-approver-29566620-9x5j8" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.386302 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-config-volume\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.386337 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-secret-volume\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.386566 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5j5\" (UniqueName: \"kubernetes.io/projected/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-kube-api-access-zs5j5\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.387537 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-config-volume\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.391683 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-secret-volume\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.411001 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5j5\" (UniqueName: \"kubernetes.io/projected/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-kube-api-access-zs5j5\") pod \"collect-profiles-29566620-dkp2l\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.414761 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjbk2\" (UniqueName: \"kubernetes.io/projected/e8a4a91f-3933-4117-bf14-288067efbab3-kube-api-access-hjbk2\") pod \"auto-csr-approver-29566620-9x5j8\" (UID: \"e8a4a91f-3933-4117-bf14-288067efbab3\") " pod="openshift-infra/auto-csr-approver-29566620-9x5j8" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.467402 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566620-9x5j8" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.476677 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-1bf7-account-create-update-mtp5w"] Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.484411 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.486341 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.488893 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvwx2\" (UniqueName: \"kubernetes.io/projected/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-kube-api-access-wvwx2\") pod \"manila-db-create-fskqk\" (UID: \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\") " pod="openstack/manila-db-create-fskqk" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.489056 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-operator-scripts\") pod \"manila-db-create-fskqk\" (UID: \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\") " pod="openstack/manila-db-create-fskqk" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.491651 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1bf7-account-create-update-mtp5w"] Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.494459 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.596794 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-operator-scripts\") pod \"manila-db-create-fskqk\" (UID: \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\") " pod="openstack/manila-db-create-fskqk" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.597150 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-operator-scripts\") pod \"manila-1bf7-account-create-update-mtp5w\" (UID: \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\") " pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.597178 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n87mr\" (UniqueName: \"kubernetes.io/projected/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-kube-api-access-n87mr\") pod \"manila-1bf7-account-create-update-mtp5w\" (UID: \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\") " pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.597456 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvwx2\" (UniqueName: \"kubernetes.io/projected/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-kube-api-access-wvwx2\") pod \"manila-db-create-fskqk\" (UID: \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\") " pod="openstack/manila-db-create-fskqk" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.597936 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-operator-scripts\") pod \"manila-db-create-fskqk\" (UID: \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\") " pod="openstack/manila-db-create-fskqk" Mar 20 09:00:00 crc kubenswrapper[4971]: I0320 09:00:00.615155 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvwx2\" (UniqueName: \"kubernetes.io/projected/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-kube-api-access-wvwx2\") pod \"manila-db-create-fskqk\" (UID: \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\") " pod="openstack/manila-db-create-fskqk" Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:00.682257 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fskqk" Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:00.712903 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-operator-scripts\") pod \"manila-1bf7-account-create-update-mtp5w\" (UID: \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\") " pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:00.712949 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n87mr\" (UniqueName: \"kubernetes.io/projected/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-kube-api-access-n87mr\") pod \"manila-1bf7-account-create-update-mtp5w\" (UID: \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\") " pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:00.714199 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-operator-scripts\") pod \"manila-1bf7-account-create-update-mtp5w\" (UID: \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\") " 
pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:00.752362 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n87mr\" (UniqueName: \"kubernetes.io/projected/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-kube-api-access-n87mr\") pod \"manila-1bf7-account-create-update-mtp5w\" (UID: \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\") " pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:00.837288 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:01.729903 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-fskqk"] Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:01.754807 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566620-9x5j8"] Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:01.760776 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l"] Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:01.862273 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" event={"ID":"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48","Type":"ContainerStarted","Data":"44fbf1e81c536ba7b1d25cf6f11e71d30afa4bb418a02b6df28c65560d0a622d"} Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:01.871929 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fskqk" event={"ID":"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e","Type":"ContainerStarted","Data":"f57aa66b4c03c39ec2b3a8d329bef0e91830fbf1f9a02b40e27f39d2bec4e8c4"} Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:01.877326 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566620-9x5j8" event={"ID":"e8a4a91f-3933-4117-bf14-288067efbab3","Type":"ContainerStarted","Data":"c8bb7cffaa5b155e1fd65ca7c488571c70f3622cc60fc56a416243239331e0df"} Mar 20 09:00:01 crc kubenswrapper[4971]: I0320 09:00:01.946125 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-1bf7-account-create-update-mtp5w"] Mar 20 09:00:01 crc kubenswrapper[4971]: W0320 09:00:01.952120 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ddcc15d_2a56_46c2_9606_3ba1643f15b2.slice/crio-c7df908fa07bbb1bad1445cee64e4aed94bcb2561431792751ba0d270d25837e WatchSource:0}: Error finding container c7df908fa07bbb1bad1445cee64e4aed94bcb2561431792751ba0d270d25837e: Status 404 returned error can't find the container with id c7df908fa07bbb1bad1445cee64e4aed94bcb2561431792751ba0d270d25837e Mar 20 09:00:02 crc kubenswrapper[4971]: I0320 09:00:02.886922 4971 generic.go:334] "Generic (PLEG): container finished" podID="1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e" containerID="bc9baf15530f7cde8989555ae150708f2f6c2ebe8bc282d59f0f77a6cc5845bd" exitCode=0 Mar 20 09:00:02 crc kubenswrapper[4971]: I0320 09:00:02.887291 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fskqk" event={"ID":"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e","Type":"ContainerDied","Data":"bc9baf15530f7cde8989555ae150708f2f6c2ebe8bc282d59f0f77a6cc5845bd"} Mar 20 09:00:02 crc kubenswrapper[4971]: I0320 09:00:02.889676 4971 generic.go:334] "Generic (PLEG): container finished" podID="9ddcc15d-2a56-46c2-9606-3ba1643f15b2" containerID="e3a20d8cf980212f0b78cfb3a017b5516fe1824301f185a0bcb4b588448c42b9" exitCode=0 Mar 20 09:00:02 crc kubenswrapper[4971]: I0320 09:00:02.889747 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1bf7-account-create-update-mtp5w" 
event={"ID":"9ddcc15d-2a56-46c2-9606-3ba1643f15b2","Type":"ContainerDied","Data":"e3a20d8cf980212f0b78cfb3a017b5516fe1824301f185a0bcb4b588448c42b9"} Mar 20 09:00:02 crc kubenswrapper[4971]: I0320 09:00:02.889774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1bf7-account-create-update-mtp5w" event={"ID":"9ddcc15d-2a56-46c2-9606-3ba1643f15b2","Type":"ContainerStarted","Data":"c7df908fa07bbb1bad1445cee64e4aed94bcb2561431792751ba0d270d25837e"} Mar 20 09:00:02 crc kubenswrapper[4971]: I0320 09:00:02.900372 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48" containerID="efeda7f0dd06c042e0940b665b8dca14c55e437b58331b2f8b6860f19a6aa187" exitCode=0 Mar 20 09:00:02 crc kubenswrapper[4971]: I0320 09:00:02.900427 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" event={"ID":"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48","Type":"ContainerDied","Data":"efeda7f0dd06c042e0940b665b8dca14c55e437b58331b2f8b6860f19a6aa187"} Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.491289 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.616783 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs5j5\" (UniqueName: \"kubernetes.io/projected/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-kube-api-access-zs5j5\") pod \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.617049 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-config-volume\") pod \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.617136 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-secret-volume\") pod \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\" (UID: \"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48\") " Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.618554 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-config-volume" (OuterVolumeSpecName: "config-volume") pod "9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48" (UID: "9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.622230 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48" (UID: "9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.622414 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-kube-api-access-zs5j5" (OuterVolumeSpecName: "kube-api-access-zs5j5") pod "9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48" (UID: "9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48"). InnerVolumeSpecName "kube-api-access-zs5j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.672293 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-fskqk" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.678817 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.718901 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvwx2\" (UniqueName: \"kubernetes.io/projected/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-kube-api-access-wvwx2\") pod \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\" (UID: \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\") " Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.719005 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-operator-scripts\") pod \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\" (UID: \"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e\") " Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.719733 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs5j5\" (UniqueName: \"kubernetes.io/projected/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-kube-api-access-zs5j5\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.719760 4971 
reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.719773 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.719837 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e" (UID: "1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.728895 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-kube-api-access-wvwx2" (OuterVolumeSpecName: "kube-api-access-wvwx2") pod "1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e" (UID: "1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e"). InnerVolumeSpecName "kube-api-access-wvwx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.821551 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-operator-scripts\") pod \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\" (UID: \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\") " Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.821821 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n87mr\" (UniqueName: \"kubernetes.io/projected/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-kube-api-access-n87mr\") pod \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\" (UID: \"9ddcc15d-2a56-46c2-9606-3ba1643f15b2\") " Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.822095 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ddcc15d-2a56-46c2-9606-3ba1643f15b2" (UID: "9ddcc15d-2a56-46c2-9606-3ba1643f15b2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.822999 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.823025 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvwx2\" (UniqueName: \"kubernetes.io/projected/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-kube-api-access-wvwx2\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.823035 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.825457 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-kube-api-access-n87mr" (OuterVolumeSpecName: "kube-api-access-n87mr") pod "9ddcc15d-2a56-46c2-9606-3ba1643f15b2" (UID: "9ddcc15d-2a56-46c2-9606-3ba1643f15b2"). InnerVolumeSpecName "kube-api-access-n87mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.922926 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.922921 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l" event={"ID":"9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48","Type":"ContainerDied","Data":"44fbf1e81c536ba7b1d25cf6f11e71d30afa4bb418a02b6df28c65560d0a622d"} Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.923034 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44fbf1e81c536ba7b1d25cf6f11e71d30afa4bb418a02b6df28c65560d0a622d" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.924163 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n87mr\" (UniqueName: \"kubernetes.io/projected/9ddcc15d-2a56-46c2-9606-3ba1643f15b2-kube-api-access-n87mr\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.934303 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-fskqk" event={"ID":"1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e","Type":"ContainerDied","Data":"f57aa66b4c03c39ec2b3a8d329bef0e91830fbf1f9a02b40e27f39d2bec4e8c4"} Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.934345 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57aa66b4c03c39ec2b3a8d329bef0e91830fbf1f9a02b40e27f39d2bec4e8c4" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.934412 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-fskqk" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.947528 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-1bf7-account-create-update-mtp5w" event={"ID":"9ddcc15d-2a56-46c2-9606-3ba1643f15b2","Type":"ContainerDied","Data":"c7df908fa07bbb1bad1445cee64e4aed94bcb2561431792751ba0d270d25837e"} Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.947793 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7df908fa07bbb1bad1445cee64e4aed94bcb2561431792751ba0d270d25837e" Mar 20 09:00:04 crc kubenswrapper[4971]: I0320 09:00:04.947954 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-1bf7-account-create-update-mtp5w" Mar 20 09:00:05 crc kubenswrapper[4971]: I0320 09:00:05.593633 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"] Mar 20 09:00:05 crc kubenswrapper[4971]: I0320 09:00:05.605620 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-n89r2"] Mar 20 09:00:06 crc kubenswrapper[4971]: I0320 09:00:06.758078 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46196b48-8776-46c5-99e9-4136b414f740" path="/var/lib/kubelet/pods/46196b48-8776-46c5-99e9-4136b414f740/volumes" Mar 20 09:00:09 crc kubenswrapper[4971]: I0320 09:00:09.733158 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:00:09 crc kubenswrapper[4971]: E0320 09:00:09.735700 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.845998 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-8qtsq"] Mar 20 09:00:10 crc kubenswrapper[4971]: E0320 09:00:10.846754 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ddcc15d-2a56-46c2-9606-3ba1643f15b2" containerName="mariadb-account-create-update" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.846772 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ddcc15d-2a56-46c2-9606-3ba1643f15b2" containerName="mariadb-account-create-update" Mar 20 09:00:10 crc kubenswrapper[4971]: E0320 09:00:10.846819 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48" containerName="collect-profiles" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.846828 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48" containerName="collect-profiles" Mar 20 09:00:10 crc kubenswrapper[4971]: E0320 09:00:10.846848 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e" containerName="mariadb-database-create" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.846856 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e" containerName="mariadb-database-create" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.847125 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e" containerName="mariadb-database-create" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.847152 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ddcc15d-2a56-46c2-9606-3ba1643f15b2" 
containerName="mariadb-account-create-update" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.847173 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48" containerName="collect-profiles" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.848453 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.852642 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.853335 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-k78nk" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.860497 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-8qtsq"] Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.942240 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-config-data\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.942352 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-combined-ca-bundle\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.942383 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7qr5\" (UniqueName: \"kubernetes.io/projected/165bdcc8-662d-4d59-aebb-90e1bc533f38-kube-api-access-k7qr5\") pod 
\"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:10 crc kubenswrapper[4971]: I0320 09:00:10.942425 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-job-config-data\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.044146 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-combined-ca-bundle\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.044204 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7qr5\" (UniqueName: \"kubernetes.io/projected/165bdcc8-662d-4d59-aebb-90e1bc533f38-kube-api-access-k7qr5\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.044244 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-job-config-data\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.044323 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-config-data\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " 
pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.051862 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-job-config-data\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.051864 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-combined-ca-bundle\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.052141 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-config-data\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.058413 4971 scope.go:117] "RemoveContainer" containerID="6cc7d057c9f72b6b7f92f41e98c24f0875b9cfee36807c506d1fe3aa1ccac4f0" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.073577 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7qr5\" (UniqueName: \"kubernetes.io/projected/165bdcc8-662d-4d59-aebb-90e1bc533f38-kube-api-access-k7qr5\") pod \"manila-db-sync-8qtsq\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.178801 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.192776 4971 scope.go:117] "RemoveContainer" containerID="eb902b3800d51a0b8ec0ea5e03cfc27c7506f87e4f321c55d1eeb903bda03455" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.221962 4971 scope.go:117] "RemoveContainer" containerID="e6f5ba6b52a367733a0734c52b66393cd78add4857d1bd8f04c9c167a849c434" Mar 20 09:00:11 crc kubenswrapper[4971]: I0320 09:00:11.330758 4971 scope.go:117] "RemoveContainer" containerID="db0f8efa8537bff11b3dc5fc1c79852b330e9ac024ed998065ee5da9ff50df4e" Mar 20 09:00:12 crc kubenswrapper[4971]: I0320 09:00:12.170740 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-8qtsq"] Mar 20 09:00:12 crc kubenswrapper[4971]: W0320 09:00:12.171817 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod165bdcc8_662d_4d59_aebb_90e1bc533f38.slice/crio-6d16e9ba9bff866424f5f91a6ba7545e724fc0227ef060ea747ff206e28c0313 WatchSource:0}: Error finding container 6d16e9ba9bff866424f5f91a6ba7545e724fc0227ef060ea747ff206e28c0313: Status 404 returned error can't find the container with id 6d16e9ba9bff866424f5f91a6ba7545e724fc0227ef060ea747ff206e28c0313 Mar 20 09:00:13 crc kubenswrapper[4971]: I0320 09:00:13.034044 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8qtsq" event={"ID":"165bdcc8-662d-4d59-aebb-90e1bc533f38","Type":"ContainerStarted","Data":"6d16e9ba9bff866424f5f91a6ba7545e724fc0227ef060ea747ff206e28c0313"} Mar 20 09:00:14 crc kubenswrapper[4971]: I0320 09:00:14.044474 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566620-9x5j8" event={"ID":"e8a4a91f-3933-4117-bf14-288067efbab3","Type":"ContainerStarted","Data":"d62f7fb69148cfbd8323889418817c278a443fdad66d53f15d25cb3930c81d8f"} Mar 20 09:00:14 crc kubenswrapper[4971]: I0320 09:00:14.062728 4971 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566620-9x5j8" podStartSLOduration=2.685636334 podStartE2EDuration="14.062708411s" podCreationTimestamp="2026-03-20 09:00:00 +0000 UTC" firstStartedPulling="2026-03-20 09:00:01.751223768 +0000 UTC m=+7823.731097906" lastFinishedPulling="2026-03-20 09:00:13.128295845 +0000 UTC m=+7835.108169983" observedRunningTime="2026-03-20 09:00:14.057630178 +0000 UTC m=+7836.037504326" watchObservedRunningTime="2026-03-20 09:00:14.062708411 +0000 UTC m=+7836.042582549" Mar 20 09:00:15 crc kubenswrapper[4971]: I0320 09:00:15.068063 4971 generic.go:334] "Generic (PLEG): container finished" podID="e8a4a91f-3933-4117-bf14-288067efbab3" containerID="d62f7fb69148cfbd8323889418817c278a443fdad66d53f15d25cb3930c81d8f" exitCode=0 Mar 20 09:00:15 crc kubenswrapper[4971]: I0320 09:00:15.068360 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566620-9x5j8" event={"ID":"e8a4a91f-3933-4117-bf14-288067efbab3","Type":"ContainerDied","Data":"d62f7fb69148cfbd8323889418817c278a443fdad66d53f15d25cb3930c81d8f"} Mar 20 09:00:17 crc kubenswrapper[4971]: I0320 09:00:17.686288 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566620-9x5j8" Mar 20 09:00:17 crc kubenswrapper[4971]: I0320 09:00:17.737942 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjbk2\" (UniqueName: \"kubernetes.io/projected/e8a4a91f-3933-4117-bf14-288067efbab3-kube-api-access-hjbk2\") pod \"e8a4a91f-3933-4117-bf14-288067efbab3\" (UID: \"e8a4a91f-3933-4117-bf14-288067efbab3\") " Mar 20 09:00:17 crc kubenswrapper[4971]: I0320 09:00:17.741926 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a4a91f-3933-4117-bf14-288067efbab3-kube-api-access-hjbk2" (OuterVolumeSpecName: "kube-api-access-hjbk2") pod "e8a4a91f-3933-4117-bf14-288067efbab3" (UID: "e8a4a91f-3933-4117-bf14-288067efbab3"). InnerVolumeSpecName "kube-api-access-hjbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:17 crc kubenswrapper[4971]: I0320 09:00:17.840186 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjbk2\" (UniqueName: \"kubernetes.io/projected/e8a4a91f-3933-4117-bf14-288067efbab3-kube-api-access-hjbk2\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:18 crc kubenswrapper[4971]: I0320 09:00:18.105279 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566620-9x5j8" event={"ID":"e8a4a91f-3933-4117-bf14-288067efbab3","Type":"ContainerDied","Data":"c8bb7cffaa5b155e1fd65ca7c488571c70f3622cc60fc56a416243239331e0df"} Mar 20 09:00:18 crc kubenswrapper[4971]: I0320 09:00:18.105575 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8bb7cffaa5b155e1fd65ca7c488571c70f3622cc60fc56a416243239331e0df" Mar 20 09:00:18 crc kubenswrapper[4971]: I0320 09:00:18.105382 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566620-9x5j8" Mar 20 09:00:18 crc kubenswrapper[4971]: I0320 09:00:18.107941 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8qtsq" event={"ID":"165bdcc8-662d-4d59-aebb-90e1bc533f38","Type":"ContainerStarted","Data":"3d031600480bdb47b1aa93f05731505e830647afe1f385755a2fad41d9631320"} Mar 20 09:00:18 crc kubenswrapper[4971]: I0320 09:00:18.141144 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-8qtsq" podStartSLOduration=2.7736498640000002 podStartE2EDuration="8.141123384s" podCreationTimestamp="2026-03-20 09:00:10 +0000 UTC" firstStartedPulling="2026-03-20 09:00:12.173531037 +0000 UTC m=+7834.153405175" lastFinishedPulling="2026-03-20 09:00:17.541004557 +0000 UTC m=+7839.520878695" observedRunningTime="2026-03-20 09:00:18.132267602 +0000 UTC m=+7840.112141790" watchObservedRunningTime="2026-03-20 09:00:18.141123384 +0000 UTC m=+7840.120997522" Mar 20 09:00:18 crc kubenswrapper[4971]: I0320 09:00:18.777842 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566614-vkmv9"] Mar 20 09:00:18 crc kubenswrapper[4971]: I0320 09:00:18.791900 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566614-vkmv9"] Mar 20 09:00:20 crc kubenswrapper[4971]: I0320 09:00:20.144180 4971 generic.go:334] "Generic (PLEG): container finished" podID="165bdcc8-662d-4d59-aebb-90e1bc533f38" containerID="3d031600480bdb47b1aa93f05731505e830647afe1f385755a2fad41d9631320" exitCode=0 Mar 20 09:00:20 crc kubenswrapper[4971]: I0320 09:00:20.144283 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8qtsq" event={"ID":"165bdcc8-662d-4d59-aebb-90e1bc533f38","Type":"ContainerDied","Data":"3d031600480bdb47b1aa93f05731505e830647afe1f385755a2fad41d9631320"} Mar 20 09:00:20 crc kubenswrapper[4971]: I0320 09:00:20.743841 4971 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29aba505-fe6d-41ed-a14b-76ced70e80f8" path="/var/lib/kubelet/pods/29aba505-fe6d-41ed-a14b-76ced70e80f8/volumes" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.702809 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.721297 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-combined-ca-bundle\") pod \"165bdcc8-662d-4d59-aebb-90e1bc533f38\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.721425 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7qr5\" (UniqueName: \"kubernetes.io/projected/165bdcc8-662d-4d59-aebb-90e1bc533f38-kube-api-access-k7qr5\") pod \"165bdcc8-662d-4d59-aebb-90e1bc533f38\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.721470 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-job-config-data\") pod \"165bdcc8-662d-4d59-aebb-90e1bc533f38\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.721500 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-config-data\") pod \"165bdcc8-662d-4d59-aebb-90e1bc533f38\" (UID: \"165bdcc8-662d-4d59-aebb-90e1bc533f38\") " Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.735712 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "165bdcc8-662d-4d59-aebb-90e1bc533f38" (UID: "165bdcc8-662d-4d59-aebb-90e1bc533f38"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.738394 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165bdcc8-662d-4d59-aebb-90e1bc533f38-kube-api-access-k7qr5" (OuterVolumeSpecName: "kube-api-access-k7qr5") pod "165bdcc8-662d-4d59-aebb-90e1bc533f38" (UID: "165bdcc8-662d-4d59-aebb-90e1bc533f38"). InnerVolumeSpecName "kube-api-access-k7qr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.753957 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-config-data" (OuterVolumeSpecName: "config-data") pod "165bdcc8-662d-4d59-aebb-90e1bc533f38" (UID: "165bdcc8-662d-4d59-aebb-90e1bc533f38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.768291 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "165bdcc8-662d-4d59-aebb-90e1bc533f38" (UID: "165bdcc8-662d-4d59-aebb-90e1bc533f38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.827261 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.827313 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7qr5\" (UniqueName: \"kubernetes.io/projected/165bdcc8-662d-4d59-aebb-90e1bc533f38-kube-api-access-k7qr5\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.827336 4971 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:21 crc kubenswrapper[4971]: I0320 09:00:21.827358 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165bdcc8-662d-4d59-aebb-90e1bc533f38-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.169083 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8qtsq" event={"ID":"165bdcc8-662d-4d59-aebb-90e1bc533f38","Type":"ContainerDied","Data":"6d16e9ba9bff866424f5f91a6ba7545e724fc0227ef060ea747ff206e28c0313"} Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.169626 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d16e9ba9bff866424f5f91a6ba7545e724fc0227ef060ea747ff206e28c0313" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.169148 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-8qtsq" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.518283 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 20 09:00:22 crc kubenswrapper[4971]: E0320 09:00:22.518815 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165bdcc8-662d-4d59-aebb-90e1bc533f38" containerName="manila-db-sync" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.518832 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="165bdcc8-662d-4d59-aebb-90e1bc533f38" containerName="manila-db-sync" Mar 20 09:00:22 crc kubenswrapper[4971]: E0320 09:00:22.518873 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a4a91f-3933-4117-bf14-288067efbab3" containerName="oc" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.518881 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a4a91f-3933-4117-bf14-288067efbab3" containerName="oc" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.519114 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a4a91f-3933-4117-bf14-288067efbab3" containerName="oc" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.519155 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="165bdcc8-662d-4d59-aebb-90e1bc533f38" containerName="manila-db-sync" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.520291 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.524622 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.525527 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.528280 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-k78nk" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.531018 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.532666 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.534279 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.535700 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.549497 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.562599 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-scripts\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.562756 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.562821 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmb9\" (UniqueName: \"kubernetes.io/projected/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-kube-api-access-dgmb9\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.562847 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-config-data\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.562921 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f089194-8fcc-4c1f-9e2f-61103e076753-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.562957 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.562978 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.563052 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.563084 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.563101 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-ceph\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.563162 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhjmr\" (UniqueName: \"kubernetes.io/projected/6f089194-8fcc-4c1f-9e2f-61103e076753-kube-api-access-jhjmr\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.563213 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.563272 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-config-data\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.563292 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-scripts\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.573710 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.661695 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bccb6d999-n6dsj"] Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.663999 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.666691 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-ceph\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.666808 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-sb\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.666910 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-dns-svc\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.667037 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-nb\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.667170 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhjmr\" (UniqueName: \"kubernetes.io/projected/6f089194-8fcc-4c1f-9e2f-61103e076753-kube-api-access-jhjmr\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " 
pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.667282 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vln6g\" (UniqueName: \"kubernetes.io/projected/04966dc8-94d0-42f2-97b6-230806b07f54-kube-api-access-vln6g\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.667386 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-config\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.667507 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.667709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.672386 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-ceph\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 
09:00:22.672492 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-config-data\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.672592 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-scripts\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.672783 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-scripts\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.672964 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.673094 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmb9\" (UniqueName: \"kubernetes.io/projected/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-kube-api-access-dgmb9\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.673180 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-config-data\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.673340 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f089194-8fcc-4c1f-9e2f-61103e076753-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.673439 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.673523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.675548 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.675672 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.676372 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.679271 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f089194-8fcc-4c1f-9e2f-61103e076753-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.690738 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.690894 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-config-data\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.691007 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-scripts\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.691351 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-scripts\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.691440 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-config-data\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.702757 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.708763 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhjmr\" (UniqueName: \"kubernetes.io/projected/6f089194-8fcc-4c1f-9e2f-61103e076753-kube-api-access-jhjmr\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.708852 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bccb6d999-n6dsj"] Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.721966 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.732324 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-dgmb9\" (UniqueName: \"kubernetes.io/projected/22ef81f8-ccbc-4cbc-b638-cf3a998e2eca-kube-api-access-dgmb9\") pod \"manila-share-share1-0\" (UID: \"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca\") " pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.733668 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:00:22 crc kubenswrapper[4971]: E0320 09:00:22.743579 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.770099 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f089194-8fcc-4c1f-9e2f-61103e076753-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"6f089194-8fcc-4c1f-9e2f-61103e076753\") " pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.781072 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-sb\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.781133 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-dns-svc\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: 
\"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.781157 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-nb\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.781182 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vln6g\" (UniqueName: \"kubernetes.io/projected/04966dc8-94d0-42f2-97b6-230806b07f54-kube-api-access-vln6g\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.781201 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-config\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.786746 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-nb\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.787345 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-dns-svc\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " 
pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.788016 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-config\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.793144 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-sb\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.804056 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.805938 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.811462 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.813269 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.835362 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vln6g\" (UniqueName: \"kubernetes.io/projected/04966dc8-94d0-42f2-97b6-230806b07f54-kube-api-access-vln6g\") pod \"dnsmasq-dns-5bccb6d999-n6dsj\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.845995 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.871949 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.891475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea8432ff-5123-4390-99af-d909291447f0-etc-machine-id\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.891573 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.891598 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-config-data-custom\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.891687 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-config-data\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.891712 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-scripts\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.891763 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv5dp\" (UniqueName: \"kubernetes.io/projected/ea8432ff-5123-4390-99af-d909291447f0-kube-api-access-gv5dp\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.891859 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea8432ff-5123-4390-99af-d909291447f0-logs\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.959355 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.996098 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-config-data\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.996153 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-scripts\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.996228 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv5dp\" (UniqueName: \"kubernetes.io/projected/ea8432ff-5123-4390-99af-d909291447f0-kube-api-access-gv5dp\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.996289 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea8432ff-5123-4390-99af-d909291447f0-logs\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.996352 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea8432ff-5123-4390-99af-d909291447f0-etc-machine-id\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.996386 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:22 crc kubenswrapper[4971]: I0320 09:00:22.996415 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-config-data-custom\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:22.997389 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea8432ff-5123-4390-99af-d909291447f0-logs\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:22.997446 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea8432ff-5123-4390-99af-d909291447f0-etc-machine-id\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.003111 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-config-data\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.003760 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.005686 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-config-data-custom\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.007043 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea8432ff-5123-4390-99af-d909291447f0-scripts\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.032919 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv5dp\" (UniqueName: \"kubernetes.io/projected/ea8432ff-5123-4390-99af-d909291447f0-kube-api-access-gv5dp\") pod \"manila-api-0\" (UID: \"ea8432ff-5123-4390-99af-d909291447f0\") " pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.279478 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.569836 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.718001 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 20 09:00:23 crc kubenswrapper[4971]: I0320 09:00:23.731283 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bccb6d999-n6dsj"] Mar 20 09:00:23 crc kubenswrapper[4971]: W0320 09:00:23.752192 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04966dc8_94d0_42f2_97b6_230806b07f54.slice/crio-2529a6e0e54801d8cb451102721f399c2c97a58d23244b97f7b3bb4854ac2387 WatchSource:0}: Error finding container 2529a6e0e54801d8cb451102721f399c2c97a58d23244b97f7b3bb4854ac2387: Status 404 returned error can't find the container with id 2529a6e0e54801d8cb451102721f399c2c97a58d23244b97f7b3bb4854ac2387 Mar 20 09:00:24 crc kubenswrapper[4971]: I0320 09:00:24.112872 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 20 09:00:24 crc kubenswrapper[4971]: W0320 09:00:24.124701 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea8432ff_5123_4390_99af_d909291447f0.slice/crio-e9b83a7bb274c238d9f9de586273674cd6b56591633efc1710eed2c8d3f6c5f4 WatchSource:0}: Error finding container e9b83a7bb274c238d9f9de586273674cd6b56591633efc1710eed2c8d3f6c5f4: Status 404 returned error can't find the container with id e9b83a7bb274c238d9f9de586273674cd6b56591633efc1710eed2c8d3f6c5f4 Mar 20 09:00:24 crc kubenswrapper[4971]: I0320 09:00:24.188579 4971 generic.go:334] "Generic (PLEG): container finished" podID="04966dc8-94d0-42f2-97b6-230806b07f54" 
containerID="9396db1b4dada4564b30d93c3721740e2dd57449d4cba48153523b1e870a13d5" exitCode=0 Mar 20 09:00:24 crc kubenswrapper[4971]: I0320 09:00:24.188689 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" event={"ID":"04966dc8-94d0-42f2-97b6-230806b07f54","Type":"ContainerDied","Data":"9396db1b4dada4564b30d93c3721740e2dd57449d4cba48153523b1e870a13d5"} Mar 20 09:00:24 crc kubenswrapper[4971]: I0320 09:00:24.189040 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" event={"ID":"04966dc8-94d0-42f2-97b6-230806b07f54","Type":"ContainerStarted","Data":"2529a6e0e54801d8cb451102721f399c2c97a58d23244b97f7b3bb4854ac2387"} Mar 20 09:00:24 crc kubenswrapper[4971]: I0320 09:00:24.225700 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"6f089194-8fcc-4c1f-9e2f-61103e076753","Type":"ContainerStarted","Data":"7f190d0e93948c4623c1cb036a1afbc4088770e2895cbbb3a5f8744895c49d25"} Mar 20 09:00:24 crc kubenswrapper[4971]: I0320 09:00:24.228815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca","Type":"ContainerStarted","Data":"79da9893f698863f8b7d373cad78b02b964bebf8550a281be6734bbfed0508d1"} Mar 20 09:00:24 crc kubenswrapper[4971]: I0320 09:00:24.237491 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ea8432ff-5123-4390-99af-d909291447f0","Type":"ContainerStarted","Data":"e9b83a7bb274c238d9f9de586273674cd6b56591633efc1710eed2c8d3f6c5f4"} Mar 20 09:00:24 crc kubenswrapper[4971]: I0320 09:00:24.444087 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 09:00:25 crc kubenswrapper[4971]: I0320 09:00:25.253008 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" 
event={"ID":"04966dc8-94d0-42f2-97b6-230806b07f54","Type":"ContainerStarted","Data":"8cab53cbb62350a7410a974b0ac58b698e7aa2caa2224266edf03f3a24e43581"} Mar 20 09:00:25 crc kubenswrapper[4971]: I0320 09:00:25.253528 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:25 crc kubenswrapper[4971]: I0320 09:00:25.272441 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"6f089194-8fcc-4c1f-9e2f-61103e076753","Type":"ContainerStarted","Data":"091323a18990b2b819d375da47340942648cd5df467d0340c95bc4e8f548743d"} Mar 20 09:00:25 crc kubenswrapper[4971]: I0320 09:00:25.272508 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"6f089194-8fcc-4c1f-9e2f-61103e076753","Type":"ContainerStarted","Data":"844095c41a2b1833c43beafa8b53d0fa4b487cb3ff7e922114d1960a477efe84"} Mar 20 09:00:25 crc kubenswrapper[4971]: I0320 09:00:25.279479 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ea8432ff-5123-4390-99af-d909291447f0","Type":"ContainerStarted","Data":"f9baa82b2620f4a12a04de2061840ea9d54ccc0dedd9cbc3a77023b6e055f5f2"} Mar 20 09:00:25 crc kubenswrapper[4971]: I0320 09:00:25.291919 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" podStartSLOduration=3.291895749 podStartE2EDuration="3.291895749s" podCreationTimestamp="2026-03-20 09:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:00:25.276341241 +0000 UTC m=+7847.256215379" watchObservedRunningTime="2026-03-20 09:00:25.291895749 +0000 UTC m=+7847.271769887" Mar 20 09:00:25 crc kubenswrapper[4971]: I0320 09:00:25.304037 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" 
podStartSLOduration=2.951366097 podStartE2EDuration="3.304019256s" podCreationTimestamp="2026-03-20 09:00:22 +0000 UTC" firstStartedPulling="2026-03-20 09:00:23.588452945 +0000 UTC m=+7845.568327083" lastFinishedPulling="2026-03-20 09:00:23.941106104 +0000 UTC m=+7845.920980242" observedRunningTime="2026-03-20 09:00:25.296130359 +0000 UTC m=+7847.276004497" watchObservedRunningTime="2026-03-20 09:00:25.304019256 +0000 UTC m=+7847.283893394" Mar 20 09:00:26 crc kubenswrapper[4971]: I0320 09:00:26.290085 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ea8432ff-5123-4390-99af-d909291447f0","Type":"ContainerStarted","Data":"37fa0c59f014bd0cae7a7f62ea034b1b09e5f9e618da34a964ed5689d881090e"} Mar 20 09:00:26 crc kubenswrapper[4971]: I0320 09:00:26.310701 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.310597541 podStartE2EDuration="4.310597541s" podCreationTimestamp="2026-03-20 09:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:00:26.310048427 +0000 UTC m=+7848.289922565" watchObservedRunningTime="2026-03-20 09:00:26.310597541 +0000 UTC m=+7848.290471689" Mar 20 09:00:27 crc kubenswrapper[4971]: I0320 09:00:27.298053 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 20 09:00:32 crc kubenswrapper[4971]: I0320 09:00:32.353073 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca","Type":"ContainerStarted","Data":"e287525abc107734da9d9cda5445adc060ffe1310c0b5d0ca22cfcf7030e12b0"} Mar 20 09:00:32 crc kubenswrapper[4971]: I0320 09:00:32.847308 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 20 09:00:32 crc kubenswrapper[4971]: I0320 
09:00:32.961836 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.034702 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-697dc7fb99-tb6mt"] Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.035029 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" podUID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" containerName="dnsmasq-dns" containerID="cri-o://9bfaff52fcacbfea721e6561894459b21799086f9c6fdd6a4b86f509f6a735dd" gracePeriod=10 Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.371237 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"22ef81f8-ccbc-4cbc-b638-cf3a998e2eca","Type":"ContainerStarted","Data":"35d699412994d7ed030215174b175b58c37477decaf4bd91aca0e7fb3c94ef5e"} Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.382715 4971 generic.go:334] "Generic (PLEG): container finished" podID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" containerID="9bfaff52fcacbfea721e6561894459b21799086f9c6fdd6a4b86f509f6a735dd" exitCode=0 Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.382769 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" event={"ID":"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9","Type":"ContainerDied","Data":"9bfaff52fcacbfea721e6561894459b21799086f9c6fdd6a4b86f509f6a735dd"} Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.404901 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.387826119 podStartE2EDuration="11.404878917s" podCreationTimestamp="2026-03-20 09:00:22 +0000 UTC" firstStartedPulling="2026-03-20 09:00:23.744291063 +0000 UTC m=+7845.724165201" lastFinishedPulling="2026-03-20 09:00:31.761343861 +0000 UTC m=+7853.741217999" 
observedRunningTime="2026-03-20 09:00:33.397495373 +0000 UTC m=+7855.377369521" watchObservedRunningTime="2026-03-20 09:00:33.404878917 +0000 UTC m=+7855.384753055" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.649860 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.736596 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-config\") pod \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.736673 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-sb\") pod \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.736760 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-nb\") pod \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.736947 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spfjs\" (UniqueName: \"kubernetes.io/projected/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-kube-api-access-spfjs\") pod \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.736975 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-dns-svc\") pod \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\" (UID: \"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9\") " Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.744819 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-kube-api-access-spfjs" (OuterVolumeSpecName: "kube-api-access-spfjs") pod "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" (UID: "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9"). InnerVolumeSpecName "kube-api-access-spfjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.790167 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" (UID: "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.804766 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" (UID: "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.810632 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-config" (OuterVolumeSpecName: "config") pod "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" (UID: "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.818002 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" (UID: "fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.839560 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spfjs\" (UniqueName: \"kubernetes.io/projected/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-kube-api-access-spfjs\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.839597 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.839629 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.839640 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:33 crc kubenswrapper[4971]: I0320 09:00:33.839652 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:34 crc kubenswrapper[4971]: I0320 09:00:34.397258 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" 
event={"ID":"fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9","Type":"ContainerDied","Data":"70cd7918c0fd977f522da7deffab9a007f5a81554cee657ca3fd6599c74201a0"} Mar 20 09:00:34 crc kubenswrapper[4971]: I0320 09:00:34.397576 4971 scope.go:117] "RemoveContainer" containerID="9bfaff52fcacbfea721e6561894459b21799086f9c6fdd6a4b86f509f6a735dd" Mar 20 09:00:34 crc kubenswrapper[4971]: I0320 09:00:34.397306 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697dc7fb99-tb6mt" Mar 20 09:00:34 crc kubenswrapper[4971]: I0320 09:00:34.489811 4971 scope.go:117] "RemoveContainer" containerID="d4b921166b14c31ce2e366c331fb3059f4c0fa9065a79b7e204a3dc598f2013e" Mar 20 09:00:34 crc kubenswrapper[4971]: I0320 09:00:34.558654 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-697dc7fb99-tb6mt"] Mar 20 09:00:34 crc kubenswrapper[4971]: I0320 09:00:34.584968 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-697dc7fb99-tb6mt"] Mar 20 09:00:34 crc kubenswrapper[4971]: I0320 09:00:34.742018 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" path="/var/lib/kubelet/pods/fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9/volumes" Mar 20 09:00:35 crc kubenswrapper[4971]: I0320 09:00:35.750582 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 09:00:35 crc kubenswrapper[4971]: I0320 09:00:35.751237 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="ceilometer-central-agent" containerID="cri-o://cc54b061a22acd13673dff0af81259bbe8c24d4dc03134c3c293558d07f94298" gracePeriod=30 Mar 20 09:00:35 crc kubenswrapper[4971]: I0320 09:00:35.751322 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" 
containerName="proxy-httpd" containerID="cri-o://9237bc293bcbb4328b39f08efa97f18cd769c99f12285b9939290bc6a9bd7d5c" gracePeriod=30 Mar 20 09:00:35 crc kubenswrapper[4971]: I0320 09:00:35.751453 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="ceilometer-notification-agent" containerID="cri-o://9d9888fd6db46c24244bbbdcbb415b99973b57972ed434602ee00aa19b354691" gracePeriod=30 Mar 20 09:00:35 crc kubenswrapper[4971]: I0320 09:00:35.751566 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="sg-core" containerID="cri-o://00beae852ea0b2ca00ed7a4a01f6db85872d234cbd171829482bc03365e30080" gracePeriod=30 Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.442199 4971 generic.go:334] "Generic (PLEG): container finished" podID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerID="9237bc293bcbb4328b39f08efa97f18cd769c99f12285b9939290bc6a9bd7d5c" exitCode=0 Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.443897 4971 generic.go:334] "Generic (PLEG): container finished" podID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerID="00beae852ea0b2ca00ed7a4a01f6db85872d234cbd171829482bc03365e30080" exitCode=2 Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.443913 4971 generic.go:334] "Generic (PLEG): container finished" podID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerID="9d9888fd6db46c24244bbbdcbb415b99973b57972ed434602ee00aa19b354691" exitCode=0 Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.443923 4971 generic.go:334] "Generic (PLEG): container finished" podID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerID="cc54b061a22acd13673dff0af81259bbe8c24d4dc03134c3c293558d07f94298" exitCode=0 Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.442286 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerDied","Data":"9237bc293bcbb4328b39f08efa97f18cd769c99f12285b9939290bc6a9bd7d5c"} Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.443981 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerDied","Data":"00beae852ea0b2ca00ed7a4a01f6db85872d234cbd171829482bc03365e30080"} Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.444003 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerDied","Data":"9d9888fd6db46c24244bbbdcbb415b99973b57972ed434602ee00aa19b354691"} Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.444015 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerDied","Data":"cc54b061a22acd13673dff0af81259bbe8c24d4dc03134c3c293558d07f94298"} Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.663536 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.860884 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-combined-ca-bundle\") pod \"6ae5f048-0707-4363-80c0-10c41fa13e20\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.860971 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95mjl\" (UniqueName: \"kubernetes.io/projected/6ae5f048-0707-4363-80c0-10c41fa13e20-kube-api-access-95mjl\") pod \"6ae5f048-0707-4363-80c0-10c41fa13e20\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.861032 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-sg-core-conf-yaml\") pod \"6ae5f048-0707-4363-80c0-10c41fa13e20\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.861096 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-run-httpd\") pod \"6ae5f048-0707-4363-80c0-10c41fa13e20\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.861137 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-config-data\") pod \"6ae5f048-0707-4363-80c0-10c41fa13e20\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.861306 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-log-httpd\") pod \"6ae5f048-0707-4363-80c0-10c41fa13e20\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.861373 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-scripts\") pod \"6ae5f048-0707-4363-80c0-10c41fa13e20\" (UID: \"6ae5f048-0707-4363-80c0-10c41fa13e20\") " Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.875150 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ae5f048-0707-4363-80c0-10c41fa13e20" (UID: "6ae5f048-0707-4363-80c0-10c41fa13e20"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.876588 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ae5f048-0707-4363-80c0-10c41fa13e20" (UID: "6ae5f048-0707-4363-80c0-10c41fa13e20"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.884041 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-scripts" (OuterVolumeSpecName: "scripts") pod "6ae5f048-0707-4363-80c0-10c41fa13e20" (UID: "6ae5f048-0707-4363-80c0-10c41fa13e20"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.887810 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae5f048-0707-4363-80c0-10c41fa13e20-kube-api-access-95mjl" (OuterVolumeSpecName: "kube-api-access-95mjl") pod "6ae5f048-0707-4363-80c0-10c41fa13e20" (UID: "6ae5f048-0707-4363-80c0-10c41fa13e20"). InnerVolumeSpecName "kube-api-access-95mjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.908807 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ae5f048-0707-4363-80c0-10c41fa13e20" (UID: "6ae5f048-0707-4363-80c0-10c41fa13e20"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.951757 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ae5f048-0707-4363-80c0-10c41fa13e20" (UID: "6ae5f048-0707-4363-80c0-10c41fa13e20"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.963791 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.963823 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ae5f048-0707-4363-80c0-10c41fa13e20-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.963831 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.963840 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.963852 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95mjl\" (UniqueName: \"kubernetes.io/projected/6ae5f048-0707-4363-80c0-10c41fa13e20-kube-api-access-95mjl\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.963862 4971 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:36 crc kubenswrapper[4971]: I0320 09:00:36.990407 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-config-data" (OuterVolumeSpecName: "config-data") pod "6ae5f048-0707-4363-80c0-10c41fa13e20" (UID: "6ae5f048-0707-4363-80c0-10c41fa13e20"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.066144 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ae5f048-0707-4363-80c0-10c41fa13e20-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.457405 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ae5f048-0707-4363-80c0-10c41fa13e20","Type":"ContainerDied","Data":"53e6aaf1b22f76a2147371350f2ed1c9180be5622f19aaa8a2865a9cd1351a00"} Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.457452 4971 scope.go:117] "RemoveContainer" containerID="9237bc293bcbb4328b39f08efa97f18cd769c99f12285b9939290bc6a9bd7d5c" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.457555 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.497549 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.506132 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.514912 4971 scope.go:117] "RemoveContainer" containerID="00beae852ea0b2ca00ed7a4a01f6db85872d234cbd171829482bc03365e30080" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535158 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 09:00:37 crc kubenswrapper[4971]: E0320 09:00:37.535599 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="ceilometer-notification-agent" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535670 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" 
containerName="ceilometer-notification-agent" Mar 20 09:00:37 crc kubenswrapper[4971]: E0320 09:00:37.535685 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="sg-core" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535693 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="sg-core" Mar 20 09:00:37 crc kubenswrapper[4971]: E0320 09:00:37.535701 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" containerName="init" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535708 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" containerName="init" Mar 20 09:00:37 crc kubenswrapper[4971]: E0320 09:00:37.535730 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="proxy-httpd" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535736 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="proxy-httpd" Mar 20 09:00:37 crc kubenswrapper[4971]: E0320 09:00:37.535754 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="ceilometer-central-agent" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535761 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="ceilometer-central-agent" Mar 20 09:00:37 crc kubenswrapper[4971]: E0320 09:00:37.535777 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" containerName="dnsmasq-dns" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535784 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" containerName="dnsmasq-dns" Mar 
20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535958 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="proxy-httpd" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535975 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="ceilometer-notification-agent" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.535991 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="sg-core" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.536000 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa85aa04-fcb2-469e-99c2-31ec9cfbc0f9" containerName="dnsmasq-dns" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.536013 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" containerName="ceilometer-central-agent" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.537803 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.538314 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.540838 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.541073 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.591522 4971 scope.go:117] "RemoveContainer" containerID="9d9888fd6db46c24244bbbdcbb415b99973b57972ed434602ee00aa19b354691" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.619345 4971 scope.go:117] "RemoveContainer" containerID="cc54b061a22acd13673dff0af81259bbe8c24d4dc03134c3c293558d07f94298" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.678058 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/454e3092-59bc-4593-becc-e3aad3af2f78-log-httpd\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.678121 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/454e3092-59bc-4593-becc-e3aad3af2f78-run-httpd\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.678370 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " 
pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.678472 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.678670 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-config-data\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.678735 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-scripts\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.678994 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnmf\" (UniqueName: \"kubernetes.io/projected/454e3092-59bc-4593-becc-e3aad3af2f78-kube-api-access-8fnmf\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.732769 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:00:37 crc kubenswrapper[4971]: E0320 09:00:37.733023 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.780674 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.780731 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.780785 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-config-data\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.780819 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-scripts\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.780867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fnmf\" (UniqueName: \"kubernetes.io/projected/454e3092-59bc-4593-becc-e3aad3af2f78-kube-api-access-8fnmf\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " 
pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.780902 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/454e3092-59bc-4593-becc-e3aad3af2f78-log-httpd\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.780924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/454e3092-59bc-4593-becc-e3aad3af2f78-run-httpd\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.781518 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/454e3092-59bc-4593-becc-e3aad3af2f78-run-httpd\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.782002 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/454e3092-59bc-4593-becc-e3aad3af2f78-log-httpd\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.785170 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.785311 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-config-data\") pod 
\"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.785856 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.786334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454e3092-59bc-4593-becc-e3aad3af2f78-scripts\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.798952 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fnmf\" (UniqueName: \"kubernetes.io/projected/454e3092-59bc-4593-becc-e3aad3af2f78-kube-api-access-8fnmf\") pod \"ceilometer-0\" (UID: \"454e3092-59bc-4593-becc-e3aad3af2f78\") " pod="openstack/ceilometer-0" Mar 20 09:00:37 crc kubenswrapper[4971]: I0320 09:00:37.875024 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 09:00:38 crc kubenswrapper[4971]: I0320 09:00:38.377427 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 09:00:38 crc kubenswrapper[4971]: I0320 09:00:38.470981 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"454e3092-59bc-4593-becc-e3aad3af2f78","Type":"ContainerStarted","Data":"2b6508a0b508d70108396756eb8a2b199eb8d04e057026e5ca90fcaf4ffd56cd"} Mar 20 09:00:38 crc kubenswrapper[4971]: I0320 09:00:38.752310 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae5f048-0707-4363-80c0-10c41fa13e20" path="/var/lib/kubelet/pods/6ae5f048-0707-4363-80c0-10c41fa13e20/volumes" Mar 20 09:00:39 crc kubenswrapper[4971]: I0320 09:00:39.491506 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"454e3092-59bc-4593-becc-e3aad3af2f78","Type":"ContainerStarted","Data":"17f034cdc7c9504cfcc6462588fa50d73942dfe66897d9108fc1cbb9bdda274e"} Mar 20 09:00:40 crc kubenswrapper[4971]: I0320 09:00:40.503725 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"454e3092-59bc-4593-becc-e3aad3af2f78","Type":"ContainerStarted","Data":"c5b888ebeb09c1c0017cb76650c906a2390d808b614e7939b3cf48fa4ff86e5b"} Mar 20 09:00:40 crc kubenswrapper[4971]: I0320 09:00:40.504050 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"454e3092-59bc-4593-becc-e3aad3af2f78","Type":"ContainerStarted","Data":"324ab12499388915a7d0d22ed7b703767a61dfe6224afc3c24ee4a038ce7c1dc"} Mar 20 09:00:42 crc kubenswrapper[4971]: I0320 09:00:42.534174 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"454e3092-59bc-4593-becc-e3aad3af2f78","Type":"ContainerStarted","Data":"870f31467a7297f289f104df46e43fe867fcdb39ade4c58b7672424c526083fa"} Mar 20 09:00:42 crc kubenswrapper[4971]: I0320 
09:00:42.535679 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 09:00:42 crc kubenswrapper[4971]: I0320 09:00:42.557449 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.087433044 podStartE2EDuration="5.557429942s" podCreationTimestamp="2026-03-20 09:00:37 +0000 UTC" firstStartedPulling="2026-03-20 09:00:38.383158991 +0000 UTC m=+7860.363033129" lastFinishedPulling="2026-03-20 09:00:41.853155889 +0000 UTC m=+7863.833030027" observedRunningTime="2026-03-20 09:00:42.55352827 +0000 UTC m=+7864.533402408" watchObservedRunningTime="2026-03-20 09:00:42.557429942 +0000 UTC m=+7864.537304100" Mar 20 09:00:42 crc kubenswrapper[4971]: I0320 09:00:42.875625 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 20 09:00:43 crc kubenswrapper[4971]: I0320 09:00:43.066500 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6lts7"] Mar 20 09:00:43 crc kubenswrapper[4971]: I0320 09:00:43.081299 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jbwbl"] Mar 20 09:00:43 crc kubenswrapper[4971]: I0320 09:00:43.090376 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9831-account-create-update-wfn6w"] Mar 20 09:00:43 crc kubenswrapper[4971]: I0320 09:00:43.099408 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-q448g"] Mar 20 09:00:43 crc kubenswrapper[4971]: I0320 09:00:43.108796 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jbwbl"] Mar 20 09:00:43 crc kubenswrapper[4971]: I0320 09:00:43.117070 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9831-account-create-update-wfn6w"] Mar 20 09:00:43 crc kubenswrapper[4971]: I0320 09:00:43.129180 4971 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6lts7"] Mar 20 09:00:43 crc kubenswrapper[4971]: I0320 09:00:43.137287 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-q448g"] Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.042933 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8a03-account-create-update-k5dtm"] Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.050780 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-33f2-account-create-update-z8tbj"] Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.062872 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8a03-account-create-update-k5dtm"] Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.074827 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-33f2-account-create-update-z8tbj"] Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.320755 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.470278 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.525087 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.769226 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095b7bd0-1af9-4d2a-ac8f-82c4446c5746" path="/var/lib/kubelet/pods/095b7bd0-1af9-4d2a-ac8f-82c4446c5746/volumes" Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.770769 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330d1914-0337-4d4f-a8ea-2a7c9f18942a" path="/var/lib/kubelet/pods/330d1914-0337-4d4f-a8ea-2a7c9f18942a/volumes" Mar 20 09:00:44 crc kubenswrapper[4971]: 
I0320 09:00:44.776005 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75476996-b18e-4254-a5ad-5afa1e533ecf" path="/var/lib/kubelet/pods/75476996-b18e-4254-a5ad-5afa1e533ecf/volumes" Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.777415 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90424d47-8eee-4fce-b17a-1fb1250ab67d" path="/var/lib/kubelet/pods/90424d47-8eee-4fce-b17a-1fb1250ab67d/volumes" Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.780806 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef92901-a49d-4e95-9491-53378694d160" path="/var/lib/kubelet/pods/9ef92901-a49d-4e95-9491-53378694d160/volumes" Mar 20 09:00:44 crc kubenswrapper[4971]: I0320 09:00:44.781921 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf84f04-f715-49cd-b077-66df82ec4ea2" path="/var/lib/kubelet/pods/ddf84f04-f715-49cd-b077-66df82ec4ea2/volumes" Mar 20 09:00:51 crc kubenswrapper[4971]: I0320 09:00:51.733338 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:00:51 crc kubenswrapper[4971]: E0320 09:00:51.734392 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.148305 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566621-kzfm9"] Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.150180 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.157746 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566621-kzfm9"] Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.254879 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-fernet-keys\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.254925 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-config-data\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.255088 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-combined-ca-bundle\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.255315 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkjwc\" (UniqueName: \"kubernetes.io/projected/4231087a-ce1e-4bde-b686-0f7045eeb5d8-kube-api-access-wkjwc\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.357308 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-wkjwc\" (UniqueName: \"kubernetes.io/projected/4231087a-ce1e-4bde-b686-0f7045eeb5d8-kube-api-access-wkjwc\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.357474 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-fernet-keys\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.357495 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-config-data\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.357519 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-combined-ca-bundle\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.363881 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-fernet-keys\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.363929 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-config-data\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.365379 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-combined-ca-bundle\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.374086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkjwc\" (UniqueName: \"kubernetes.io/projected/4231087a-ce1e-4bde-b686-0f7045eeb5d8-kube-api-access-wkjwc\") pod \"keystone-cron-29566621-kzfm9\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.471543 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:00 crc kubenswrapper[4971]: I0320 09:01:00.912315 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566621-kzfm9"] Mar 20 09:01:01 crc kubenswrapper[4971]: I0320 09:01:01.719807 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566621-kzfm9" event={"ID":"4231087a-ce1e-4bde-b686-0f7045eeb5d8","Type":"ContainerStarted","Data":"6a3b75df4bae0ed4d637459627bdb0b57830652a9a1edd540331bd9dbbf47907"} Mar 20 09:01:01 crc kubenswrapper[4971]: I0320 09:01:01.720149 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566621-kzfm9" event={"ID":"4231087a-ce1e-4bde-b686-0f7045eeb5d8","Type":"ContainerStarted","Data":"410d24ef4f4f2bd70a565cff9f86fe991e903e1f77696599ad099c6924463246"} Mar 20 09:01:01 crc kubenswrapper[4971]: I0320 09:01:01.742624 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566621-kzfm9" podStartSLOduration=1.742586015 podStartE2EDuration="1.742586015s" podCreationTimestamp="2026-03-20 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:01:01.737924303 +0000 UTC m=+7883.717798441" watchObservedRunningTime="2026-03-20 09:01:01.742586015 +0000 UTC m=+7883.722460153" Mar 20 09:01:03 crc kubenswrapper[4971]: I0320 09:01:03.051542 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z8jrm"] Mar 20 09:01:03 crc kubenswrapper[4971]: I0320 09:01:03.061139 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z8jrm"] Mar 20 09:01:03 crc kubenswrapper[4971]: I0320 09:01:03.732468 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:01:03 crc kubenswrapper[4971]: E0320 
09:01:03.733197 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:01:04 crc kubenswrapper[4971]: I0320 09:01:04.743152 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb841e1b-1731-434d-8ffb-6c135a174b4a" path="/var/lib/kubelet/pods/fb841e1b-1731-434d-8ffb-6c135a174b4a/volumes" Mar 20 09:01:04 crc kubenswrapper[4971]: I0320 09:01:04.760964 4971 generic.go:334] "Generic (PLEG): container finished" podID="4231087a-ce1e-4bde-b686-0f7045eeb5d8" containerID="6a3b75df4bae0ed4d637459627bdb0b57830652a9a1edd540331bd9dbbf47907" exitCode=0 Mar 20 09:01:04 crc kubenswrapper[4971]: I0320 09:01:04.761017 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566621-kzfm9" event={"ID":"4231087a-ce1e-4bde-b686-0f7045eeb5d8","Type":"ContainerDied","Data":"6a3b75df4bae0ed4d637459627bdb0b57830652a9a1edd540331bd9dbbf47907"} Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.154312 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.304202 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-fernet-keys\") pod \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.304385 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkjwc\" (UniqueName: \"kubernetes.io/projected/4231087a-ce1e-4bde-b686-0f7045eeb5d8-kube-api-access-wkjwc\") pod \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.304569 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-config-data\") pod \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.304654 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-combined-ca-bundle\") pod \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\" (UID: \"4231087a-ce1e-4bde-b686-0f7045eeb5d8\") " Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.326392 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4231087a-ce1e-4bde-b686-0f7045eeb5d8-kube-api-access-wkjwc" (OuterVolumeSpecName: "kube-api-access-wkjwc") pod "4231087a-ce1e-4bde-b686-0f7045eeb5d8" (UID: "4231087a-ce1e-4bde-b686-0f7045eeb5d8"). InnerVolumeSpecName "kube-api-access-wkjwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.327038 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4231087a-ce1e-4bde-b686-0f7045eeb5d8" (UID: "4231087a-ce1e-4bde-b686-0f7045eeb5d8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.336862 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4231087a-ce1e-4bde-b686-0f7045eeb5d8" (UID: "4231087a-ce1e-4bde-b686-0f7045eeb5d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.348412 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-config-data" (OuterVolumeSpecName: "config-data") pod "4231087a-ce1e-4bde-b686-0f7045eeb5d8" (UID: "4231087a-ce1e-4bde-b686-0f7045eeb5d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.406782 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkjwc\" (UniqueName: \"kubernetes.io/projected/4231087a-ce1e-4bde-b686-0f7045eeb5d8-kube-api-access-wkjwc\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.406814 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.406825 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.406834 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4231087a-ce1e-4bde-b686-0f7045eeb5d8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.785919 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566621-kzfm9" event={"ID":"4231087a-ce1e-4bde-b686-0f7045eeb5d8","Type":"ContainerDied","Data":"410d24ef4f4f2bd70a565cff9f86fe991e903e1f77696599ad099c6924463246"} Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.785954 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566621-kzfm9" Mar 20 09:01:06 crc kubenswrapper[4971]: I0320 09:01:06.785965 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="410d24ef4f4f2bd70a565cff9f86fe991e903e1f77696599ad099c6924463246" Mar 20 09:01:07 crc kubenswrapper[4971]: I0320 09:01:07.881703 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 09:01:11 crc kubenswrapper[4971]: I0320 09:01:11.490310 4971 scope.go:117] "RemoveContainer" containerID="0ccfe317b073edb2cee1f335ccef0124d18becdf156b787592a97cbc86b74aca" Mar 20 09:01:11 crc kubenswrapper[4971]: I0320 09:01:11.515926 4971 scope.go:117] "RemoveContainer" containerID="dd903e8c825771c10e2b6e69674ba5b9e108f1d49a7a90b4724f2f23c659d74b" Mar 20 09:01:11 crc kubenswrapper[4971]: I0320 09:01:11.556018 4971 scope.go:117] "RemoveContainer" containerID="e59db8128262a6ce828d9dbe0932907eeeecee248c59467a85b08d4a125111ca" Mar 20 09:01:11 crc kubenswrapper[4971]: I0320 09:01:11.593242 4971 scope.go:117] "RemoveContainer" containerID="b91d4df69c14a4f5548b65c843781e5b7c499782ff29add7573cdc17b7adb5e8" Mar 20 09:01:11 crc kubenswrapper[4971]: I0320 09:01:11.653905 4971 scope.go:117] "RemoveContainer" containerID="9ed536a707d208b5391f0833ed540467f50206dffb41395c012fe6ffba0586f3" Mar 20 09:01:11 crc kubenswrapper[4971]: I0320 09:01:11.681467 4971 scope.go:117] "RemoveContainer" containerID="22ced0c9a1dad73f05d4a4b4683533579ec93a52ca7526e1d544d63a469a39ea" Mar 20 09:01:11 crc kubenswrapper[4971]: I0320 09:01:11.740953 4971 scope.go:117] "RemoveContainer" containerID="d482b887b43776d9e1d09fe44e3d545aa63f832fbebbe436bd11899f4997f2d4" Mar 20 09:01:11 crc kubenswrapper[4971]: I0320 09:01:11.775522 4971 scope.go:117] "RemoveContainer" containerID="400998e0e738f9a7e7a96c4eb797c377b61ff9ab1c5b31b66113136a3ea4a217" Mar 20 09:01:17 crc kubenswrapper[4971]: I0320 09:01:17.732325 4971 scope.go:117] "RemoveContainer" 
containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:01:17 crc kubenswrapper[4971]: E0320 09:01:17.733202 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:01:22 crc kubenswrapper[4971]: I0320 09:01:22.036325 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-v48pn"] Mar 20 09:01:22 crc kubenswrapper[4971]: I0320 09:01:22.047692 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-v48pn"] Mar 20 09:01:22 crc kubenswrapper[4971]: I0320 09:01:22.744802 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bedda968-4c4c-4bea-a384-4695a8169a96" path="/var/lib/kubelet/pods/bedda968-4c4c-4bea-a384-4695a8169a96/volumes" Mar 20 09:01:23 crc kubenswrapper[4971]: I0320 09:01:23.047438 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cvhsb"] Mar 20 09:01:23 crc kubenswrapper[4971]: I0320 09:01:23.067823 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cvhsb"] Mar 20 09:01:24 crc kubenswrapper[4971]: I0320 09:01:24.768246 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6bf0dd-87ab-4d12-8d00-fd3907b69533" path="/var/lib/kubelet/pods/0f6bf0dd-87ab-4d12-8d00-fd3907b69533/volumes" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.338045 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fb654d449-5rnb4"] Mar 20 09:01:25 crc kubenswrapper[4971]: E0320 09:01:25.338906 4971 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4231087a-ce1e-4bde-b686-0f7045eeb5d8" containerName="keystone-cron" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.338929 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4231087a-ce1e-4bde-b686-0f7045eeb5d8" containerName="keystone-cron" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.339175 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4231087a-ce1e-4bde-b686-0f7045eeb5d8" containerName="keystone-cron" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.340491 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.345430 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.368892 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fb654d449-5rnb4"] Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.395036 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-openstack-cell1\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.395368 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-nb\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.395482 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x5z2f\" (UniqueName: \"kubernetes.io/projected/783dfe21-136b-4efa-ba14-f6fe5a9c4499-kube-api-access-x5z2f\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.395647 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-config\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.395802 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-dns-svc\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.395933 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-sb\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.497974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-config\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.498093 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-dns-svc\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.498154 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-sb\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.498208 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-openstack-cell1\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.498276 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-nb\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.498306 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5z2f\" (UniqueName: \"kubernetes.io/projected/783dfe21-136b-4efa-ba14-f6fe5a9c4499-kube-api-access-x5z2f\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.499109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-config\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.499109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-sb\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.499396 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-openstack-cell1\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.499450 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-nb\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.500065 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-dns-svc\") pod \"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.548428 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5z2f\" (UniqueName: \"kubernetes.io/projected/783dfe21-136b-4efa-ba14-f6fe5a9c4499-kube-api-access-x5z2f\") pod 
\"dnsmasq-dns-7fb654d449-5rnb4\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:25 crc kubenswrapper[4971]: I0320 09:01:25.678398 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:26 crc kubenswrapper[4971]: I0320 09:01:26.214227 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fb654d449-5rnb4"] Mar 20 09:01:26 crc kubenswrapper[4971]: I0320 09:01:26.978957 4971 generic.go:334] "Generic (PLEG): container finished" podID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" containerID="a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0" exitCode=0 Mar 20 09:01:26 crc kubenswrapper[4971]: I0320 09:01:26.979028 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" event={"ID":"783dfe21-136b-4efa-ba14-f6fe5a9c4499","Type":"ContainerDied","Data":"a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0"} Mar 20 09:01:26 crc kubenswrapper[4971]: I0320 09:01:26.979325 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" event={"ID":"783dfe21-136b-4efa-ba14-f6fe5a9c4499","Type":"ContainerStarted","Data":"32136f0117183da974f131ab4542366f662052adf6a4ebb6465f75fe0ccd9a1a"} Mar 20 09:01:27 crc kubenswrapper[4971]: I0320 09:01:27.988014 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" event={"ID":"783dfe21-136b-4efa-ba14-f6fe5a9c4499","Type":"ContainerStarted","Data":"b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273"} Mar 20 09:01:27 crc kubenswrapper[4971]: I0320 09:01:27.989094 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:28 crc kubenswrapper[4971]: I0320 09:01:28.014870 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" podStartSLOduration=3.014848988 podStartE2EDuration="3.014848988s" podCreationTimestamp="2026-03-20 09:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:01:28.003649665 +0000 UTC m=+7909.983523823" watchObservedRunningTime="2026-03-20 09:01:28.014848988 +0000 UTC m=+7909.994723126" Mar 20 09:01:30 crc kubenswrapper[4971]: I0320 09:01:30.733445 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:01:30 crc kubenswrapper[4971]: E0320 09:01:30.734352 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:01:35 crc kubenswrapper[4971]: I0320 09:01:35.680846 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:35 crc kubenswrapper[4971]: I0320 09:01:35.769929 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bccb6d999-n6dsj"] Mar 20 09:01:35 crc kubenswrapper[4971]: I0320 09:01:35.770262 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" podUID="04966dc8-94d0-42f2-97b6-230806b07f54" containerName="dnsmasq-dns" containerID="cri-o://8cab53cbb62350a7410a974b0ac58b698e7aa2caa2224266edf03f3a24e43581" gracePeriod=10 Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.008692 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d564d794f-k49kw"] Mar 20 09:01:36 crc 
kubenswrapper[4971]: I0320 09:01:36.010862 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.023637 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d564d794f-k49kw"] Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.081651 4971 generic.go:334] "Generic (PLEG): container finished" podID="04966dc8-94d0-42f2-97b6-230806b07f54" containerID="8cab53cbb62350a7410a974b0ac58b698e7aa2caa2224266edf03f3a24e43581" exitCode=0 Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.081689 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" event={"ID":"04966dc8-94d0-42f2-97b6-230806b07f54","Type":"ContainerDied","Data":"8cab53cbb62350a7410a974b0ac58b698e7aa2caa2224266edf03f3a24e43581"} Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.138837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-dns-svc\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.138916 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-sb\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.139044 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-openstack-cell1\") pod 
\"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.139180 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s728j\" (UniqueName: \"kubernetes.io/projected/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-kube-api-access-s728j\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.139280 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-nb\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.139304 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-config\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.246642 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s728j\" (UniqueName: \"kubernetes.io/projected/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-kube-api-access-s728j\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.246778 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-nb\") 
pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.246806 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-config\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.246870 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-dns-svc\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.246912 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-sb\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.246965 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-openstack-cell1\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.248810 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-sb\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " 
pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.248827 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-dns-svc\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.249009 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-config\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.249541 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-nb\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.250271 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-openstack-cell1\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.266619 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s728j\" (UniqueName: \"kubernetes.io/projected/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-kube-api-access-s728j\") pod \"dnsmasq-dns-d564d794f-k49kw\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.337732 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.348468 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.449737 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-nb\") pod \"04966dc8-94d0-42f2-97b6-230806b07f54\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.449889 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-config\") pod \"04966dc8-94d0-42f2-97b6-230806b07f54\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.449942 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-sb\") pod \"04966dc8-94d0-42f2-97b6-230806b07f54\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.450009 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-dns-svc\") pod \"04966dc8-94d0-42f2-97b6-230806b07f54\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.450047 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vln6g\" (UniqueName: \"kubernetes.io/projected/04966dc8-94d0-42f2-97b6-230806b07f54-kube-api-access-vln6g\") pod 
\"04966dc8-94d0-42f2-97b6-230806b07f54\" (UID: \"04966dc8-94d0-42f2-97b6-230806b07f54\") " Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.457423 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04966dc8-94d0-42f2-97b6-230806b07f54-kube-api-access-vln6g" (OuterVolumeSpecName: "kube-api-access-vln6g") pod "04966dc8-94d0-42f2-97b6-230806b07f54" (UID: "04966dc8-94d0-42f2-97b6-230806b07f54"). InnerVolumeSpecName "kube-api-access-vln6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.504352 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04966dc8-94d0-42f2-97b6-230806b07f54" (UID: "04966dc8-94d0-42f2-97b6-230806b07f54"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.507888 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04966dc8-94d0-42f2-97b6-230806b07f54" (UID: "04966dc8-94d0-42f2-97b6-230806b07f54"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.515316 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-config" (OuterVolumeSpecName: "config") pod "04966dc8-94d0-42f2-97b6-230806b07f54" (UID: "04966dc8-94d0-42f2-97b6-230806b07f54"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.537835 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04966dc8-94d0-42f2-97b6-230806b07f54" (UID: "04966dc8-94d0-42f2-97b6-230806b07f54"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.552394 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.552424 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vln6g\" (UniqueName: \"kubernetes.io/projected/04966dc8-94d0-42f2-97b6-230806b07f54-kube-api-access-vln6g\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.552434 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.552444 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.552451 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04966dc8-94d0-42f2-97b6-230806b07f54-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:36 crc kubenswrapper[4971]: I0320 09:01:36.872267 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d564d794f-k49kw"] Mar 20 
09:01:37 crc kubenswrapper[4971]: I0320 09:01:37.091819 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d564d794f-k49kw" event={"ID":"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021","Type":"ContainerStarted","Data":"c0f17ea6d14ee5e18674c83893d9fdf1622a4f5e39c17b9f970839239f80ac60"} Mar 20 09:01:37 crc kubenswrapper[4971]: I0320 09:01:37.094879 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" event={"ID":"04966dc8-94d0-42f2-97b6-230806b07f54","Type":"ContainerDied","Data":"2529a6e0e54801d8cb451102721f399c2c97a58d23244b97f7b3bb4854ac2387"} Mar 20 09:01:37 crc kubenswrapper[4971]: I0320 09:01:37.094936 4971 scope.go:117] "RemoveContainer" containerID="8cab53cbb62350a7410a974b0ac58b698e7aa2caa2224266edf03f3a24e43581" Mar 20 09:01:37 crc kubenswrapper[4971]: I0320 09:01:37.095017 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bccb6d999-n6dsj" Mar 20 09:01:37 crc kubenswrapper[4971]: I0320 09:01:37.128418 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bccb6d999-n6dsj"] Mar 20 09:01:37 crc kubenswrapper[4971]: I0320 09:01:37.128946 4971 scope.go:117] "RemoveContainer" containerID="9396db1b4dada4564b30d93c3721740e2dd57449d4cba48153523b1e870a13d5" Mar 20 09:01:37 crc kubenswrapper[4971]: I0320 09:01:37.145269 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bccb6d999-n6dsj"] Mar 20 09:01:38 crc kubenswrapper[4971]: I0320 09:01:38.135188 4971 generic.go:334] "Generic (PLEG): container finished" podID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" containerID="523c0d2b33699c4febac6d7913cb731d04701c0102da468c18efb5ca04f12d38" exitCode=0 Mar 20 09:01:38 crc kubenswrapper[4971]: I0320 09:01:38.135278 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d564d794f-k49kw" 
event={"ID":"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021","Type":"ContainerDied","Data":"523c0d2b33699c4febac6d7913cb731d04701c0102da468c18efb5ca04f12d38"} Mar 20 09:01:38 crc kubenswrapper[4971]: I0320 09:01:38.742255 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04966dc8-94d0-42f2-97b6-230806b07f54" path="/var/lib/kubelet/pods/04966dc8-94d0-42f2-97b6-230806b07f54/volumes" Mar 20 09:01:39 crc kubenswrapper[4971]: I0320 09:01:39.147766 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d564d794f-k49kw" event={"ID":"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021","Type":"ContainerStarted","Data":"85eb740749c867f3a20e8c214b0b8a73c7fa071f0a587fa79645ac205ed4374c"} Mar 20 09:01:39 crc kubenswrapper[4971]: I0320 09:01:39.148203 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:43 crc kubenswrapper[4971]: I0320 09:01:43.044034 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d564d794f-k49kw" podStartSLOduration=8.04401657 podStartE2EDuration="8.04401657s" podCreationTimestamp="2026-03-20 09:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:01:39.180066651 +0000 UTC m=+7921.159940799" watchObservedRunningTime="2026-03-20 09:01:43.04401657 +0000 UTC m=+7925.023890718" Mar 20 09:01:43 crc kubenswrapper[4971]: I0320 09:01:43.048868 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-9lwb2"] Mar 20 09:01:43 crc kubenswrapper[4971]: I0320 09:01:43.061854 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-9lwb2"] Mar 20 09:01:43 crc kubenswrapper[4971]: I0320 09:01:43.732483 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:01:43 crc 
kubenswrapper[4971]: E0320 09:01:43.733191 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:01:44 crc kubenswrapper[4971]: I0320 09:01:44.823550 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949d0358-d7fd-45a3-9c86-50b430830c78" path="/var/lib/kubelet/pods/949d0358-d7fd-45a3-9c86-50b430830c78/volumes" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.340101 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.443670 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fb654d449-5rnb4"] Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.444118 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" podUID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" containerName="dnsmasq-dns" containerID="cri-o://b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273" gracePeriod=10 Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.602578 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d56f8b6fc-rxkhz"] Mar 20 09:01:46 crc kubenswrapper[4971]: E0320 09:01:46.603277 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04966dc8-94d0-42f2-97b6-230806b07f54" containerName="init" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.603302 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="04966dc8-94d0-42f2-97b6-230806b07f54" containerName="init" Mar 20 09:01:46 crc 
kubenswrapper[4971]: E0320 09:01:46.603321 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04966dc8-94d0-42f2-97b6-230806b07f54" containerName="dnsmasq-dns" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.603330 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="04966dc8-94d0-42f2-97b6-230806b07f54" containerName="dnsmasq-dns" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.603621 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="04966dc8-94d0-42f2-97b6-230806b07f54" containerName="dnsmasq-dns" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.604683 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.609109 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-networker" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.654291 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d56f8b6fc-rxkhz"] Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.760127 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2pp\" (UniqueName: \"kubernetes.io/projected/11e35d68-eda6-4edb-a201-b267628ce07f-kube-api-access-6q2pp\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.760268 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-openstack-networker\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.760345 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-ovsdbserver-nb\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.760368 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-ovsdbserver-sb\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.760545 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-openstack-cell1\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.760623 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-config\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.760672 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-dns-svc\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 
09:01:46.867071 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-openstack-cell1\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.867948 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-openstack-cell1\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.868115 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-config\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.868928 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-config\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.868975 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-dns-svc\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.869150 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6q2pp\" (UniqueName: \"kubernetes.io/projected/11e35d68-eda6-4edb-a201-b267628ce07f-kube-api-access-6q2pp\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.869200 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-openstack-networker\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.869368 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-ovsdbserver-nb\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.869401 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-ovsdbserver-sb\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.870004 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-ovsdbserver-sb\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.870545 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: 
\"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-openstack-networker\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.870678 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-ovsdbserver-nb\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.870940 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e35d68-eda6-4edb-a201-b267628ce07f-dns-svc\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.902952 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2pp\" (UniqueName: \"kubernetes.io/projected/11e35d68-eda6-4edb-a201-b267628ce07f-kube-api-access-6q2pp\") pod \"dnsmasq-dns-5d56f8b6fc-rxkhz\" (UID: \"11e35d68-eda6-4edb-a201-b267628ce07f\") " pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:46 crc kubenswrapper[4971]: I0320 09:01:46.939505 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.093155 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.237645 4971 generic.go:334] "Generic (PLEG): container finished" podID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" containerID="b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273" exitCode=0 Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.237694 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" event={"ID":"783dfe21-136b-4efa-ba14-f6fe5a9c4499","Type":"ContainerDied","Data":"b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273"} Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.237734 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.237751 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb654d449-5rnb4" event={"ID":"783dfe21-136b-4efa-ba14-f6fe5a9c4499","Type":"ContainerDied","Data":"32136f0117183da974f131ab4542366f662052adf6a4ebb6465f75fe0ccd9a1a"} Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.237770 4971 scope.go:117] "RemoveContainer" containerID="b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.268308 4971 scope.go:117] "RemoveContainer" containerID="a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.276549 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-config\") pod \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.276713 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-dns-svc\") pod \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.276745 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-sb\") pod \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.276828 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5z2f\" (UniqueName: \"kubernetes.io/projected/783dfe21-136b-4efa-ba14-f6fe5a9c4499-kube-api-access-x5z2f\") pod \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.276886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-openstack-cell1\") pod \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.276923 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-nb\") pod \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\" (UID: \"783dfe21-136b-4efa-ba14-f6fe5a9c4499\") " Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.281714 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783dfe21-136b-4efa-ba14-f6fe5a9c4499-kube-api-access-x5z2f" (OuterVolumeSpecName: "kube-api-access-x5z2f") pod "783dfe21-136b-4efa-ba14-f6fe5a9c4499" (UID: "783dfe21-136b-4efa-ba14-f6fe5a9c4499"). 
InnerVolumeSpecName "kube-api-access-x5z2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.302895 4971 scope.go:117] "RemoveContainer" containerID="b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273" Mar 20 09:01:47 crc kubenswrapper[4971]: E0320 09:01:47.304592 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273\": container with ID starting with b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273 not found: ID does not exist" containerID="b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.304659 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273"} err="failed to get container status \"b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273\": rpc error: code = NotFound desc = could not find container \"b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273\": container with ID starting with b086200f67674b893f2a11b3e8ee8eee953695f9283742325148d22e4d6ff273 not found: ID does not exist" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.304694 4971 scope.go:117] "RemoveContainer" containerID="a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0" Mar 20 09:01:47 crc kubenswrapper[4971]: E0320 09:01:47.306034 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0\": container with ID starting with a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0 not found: ID does not exist" containerID="a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0" Mar 20 09:01:47 
crc kubenswrapper[4971]: I0320 09:01:47.306066 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0"} err="failed to get container status \"a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0\": rpc error: code = NotFound desc = could not find container \"a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0\": container with ID starting with a1a6ab16fb11ebc9a78c5d0d4e1458d8fe003d64c87da30f12bfc350965820c0 not found: ID does not exist" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.328622 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "783dfe21-136b-4efa-ba14-f6fe5a9c4499" (UID: "783dfe21-136b-4efa-ba14-f6fe5a9c4499"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.332523 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "783dfe21-136b-4efa-ba14-f6fe5a9c4499" (UID: "783dfe21-136b-4efa-ba14-f6fe5a9c4499"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.340155 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "783dfe21-136b-4efa-ba14-f6fe5a9c4499" (UID: "783dfe21-136b-4efa-ba14-f6fe5a9c4499"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.366034 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "783dfe21-136b-4efa-ba14-f6fe5a9c4499" (UID: "783dfe21-136b-4efa-ba14-f6fe5a9c4499"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.366384 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-config" (OuterVolumeSpecName: "config") pod "783dfe21-136b-4efa-ba14-f6fe5a9c4499" (UID: "783dfe21-136b-4efa-ba14-f6fe5a9c4499"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.378531 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.378560 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.378570 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.378582 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5z2f\" (UniqueName: \"kubernetes.io/projected/783dfe21-136b-4efa-ba14-f6fe5a9c4499-kube-api-access-x5z2f\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:47 crc 
kubenswrapper[4971]: I0320 09:01:47.378590 4971 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.378598 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783dfe21-136b-4efa-ba14-f6fe5a9c4499-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.383574 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d56f8b6fc-rxkhz"] Mar 20 09:01:47 crc kubenswrapper[4971]: W0320 09:01:47.385735 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11e35d68_eda6_4edb_a201_b267628ce07f.slice/crio-fc0f10bc7e2d97335a83cd036f83765fa4dbffbf25f6c82adab534281bf8d49b WatchSource:0}: Error finding container fc0f10bc7e2d97335a83cd036f83765fa4dbffbf25f6c82adab534281bf8d49b: Status 404 returned error can't find the container with id fc0f10bc7e2d97335a83cd036f83765fa4dbffbf25f6c82adab534281bf8d49b Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.726252 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fb654d449-5rnb4"] Mar 20 09:01:47 crc kubenswrapper[4971]: I0320 09:01:47.740699 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fb654d449-5rnb4"] Mar 20 09:01:48 crc kubenswrapper[4971]: I0320 09:01:48.251679 4971 generic.go:334] "Generic (PLEG): container finished" podID="11e35d68-eda6-4edb-a201-b267628ce07f" containerID="5bfd19cc84d848f95f6617385d51f1d8268b09c9b148b86bb55cbeec44abde89" exitCode=0 Mar 20 09:01:48 crc kubenswrapper[4971]: I0320 09:01:48.251765 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" 
event={"ID":"11e35d68-eda6-4edb-a201-b267628ce07f","Type":"ContainerDied","Data":"5bfd19cc84d848f95f6617385d51f1d8268b09c9b148b86bb55cbeec44abde89"} Mar 20 09:01:48 crc kubenswrapper[4971]: I0320 09:01:48.251833 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" event={"ID":"11e35d68-eda6-4edb-a201-b267628ce07f","Type":"ContainerStarted","Data":"fc0f10bc7e2d97335a83cd036f83765fa4dbffbf25f6c82adab534281bf8d49b"} Mar 20 09:01:48 crc kubenswrapper[4971]: I0320 09:01:48.749710 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" path="/var/lib/kubelet/pods/783dfe21-136b-4efa-ba14-f6fe5a9c4499/volumes" Mar 20 09:01:49 crc kubenswrapper[4971]: I0320 09:01:49.264560 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" event={"ID":"11e35d68-eda6-4edb-a201-b267628ce07f","Type":"ContainerStarted","Data":"e1d2f037ccda75e40873c5ec8be99f1b46aa067f031324ec60df868990f45b44"} Mar 20 09:01:49 crc kubenswrapper[4971]: I0320 09:01:49.264814 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:49 crc kubenswrapper[4971]: I0320 09:01:49.289777 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" podStartSLOduration=3.289758937 podStartE2EDuration="3.289758937s" podCreationTimestamp="2026-03-20 09:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:01:49.279921541 +0000 UTC m=+7931.259795689" watchObservedRunningTime="2026-03-20 09:01:49.289758937 +0000 UTC m=+7931.269633075" Mar 20 09:01:56 crc kubenswrapper[4971]: I0320 09:01:56.941742 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d56f8b6fc-rxkhz" Mar 20 09:01:57 crc kubenswrapper[4971]: 
I0320 09:01:57.028136 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d564d794f-k49kw"] Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.028439 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d564d794f-k49kw" podUID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" containerName="dnsmasq-dns" containerID="cri-o://85eb740749c867f3a20e8c214b0b8a73c7fa071f0a587fa79645ac205ed4374c" gracePeriod=10 Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.365276 4971 generic.go:334] "Generic (PLEG): container finished" podID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" containerID="85eb740749c867f3a20e8c214b0b8a73c7fa071f0a587fa79645ac205ed4374c" exitCode=0 Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.365440 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d564d794f-k49kw" event={"ID":"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021","Type":"ContainerDied","Data":"85eb740749c867f3a20e8c214b0b8a73c7fa071f0a587fa79645ac205ed4374c"} Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.518638 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.617419 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s728j\" (UniqueName: \"kubernetes.io/projected/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-kube-api-access-s728j\") pod \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.617491 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-config\") pod \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.617555 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-sb\") pod \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.617677 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-nb\") pod \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.617718 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-dns-svc\") pod \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.617741 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" 
(UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-openstack-cell1\") pod \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\" (UID: \"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021\") " Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.633875 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-kube-api-access-s728j" (OuterVolumeSpecName: "kube-api-access-s728j") pod "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" (UID: "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021"). InnerVolumeSpecName "kube-api-access-s728j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.675655 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" (UID: "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.681381 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" (UID: "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.686439 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" (UID: "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.689091 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-config" (OuterVolumeSpecName: "config") pod "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" (UID: "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.691339 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" (UID: "dcca18bf-a5ae-44bd-9de4-23a9ebe7b021"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.720249 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.720280 4971 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.720291 4971 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.720301 4971 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:57 crc kubenswrapper[4971]: 
I0320 09:01:57.720310 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s728j\" (UniqueName: \"kubernetes.io/projected/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-kube-api-access-s728j\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:57 crc kubenswrapper[4971]: I0320 09:01:57.720319 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:58 crc kubenswrapper[4971]: I0320 09:01:58.376259 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d564d794f-k49kw" event={"ID":"dcca18bf-a5ae-44bd-9de4-23a9ebe7b021","Type":"ContainerDied","Data":"c0f17ea6d14ee5e18674c83893d9fdf1622a4f5e39c17b9f970839239f80ac60"} Mar 20 09:01:58 crc kubenswrapper[4971]: I0320 09:01:58.376345 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d564d794f-k49kw" Mar 20 09:01:58 crc kubenswrapper[4971]: I0320 09:01:58.376638 4971 scope.go:117] "RemoveContainer" containerID="85eb740749c867f3a20e8c214b0b8a73c7fa071f0a587fa79645ac205ed4374c" Mar 20 09:01:58 crc kubenswrapper[4971]: I0320 09:01:58.417872 4971 scope.go:117] "RemoveContainer" containerID="523c0d2b33699c4febac6d7913cb731d04701c0102da468c18efb5ca04f12d38" Mar 20 09:01:58 crc kubenswrapper[4971]: I0320 09:01:58.418347 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d564d794f-k49kw"] Mar 20 09:01:58 crc kubenswrapper[4971]: I0320 09:01:58.429965 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d564d794f-k49kw"] Mar 20 09:01:58 crc kubenswrapper[4971]: I0320 09:01:58.739977 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:01:58 crc kubenswrapper[4971]: E0320 09:01:58.740515 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:01:58 crc kubenswrapper[4971]: I0320 09:01:58.755901 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" path="/var/lib/kubelet/pods/dcca18bf-a5ae-44bd-9de4-23a9ebe7b021/volumes" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.142153 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566622-6qfhw"] Mar 20 09:02:00 crc kubenswrapper[4971]: E0320 09:02:00.142882 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" containerName="dnsmasq-dns" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.142893 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" containerName="dnsmasq-dns" Mar 20 09:02:00 crc kubenswrapper[4971]: E0320 09:02:00.142914 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" containerName="init" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.142922 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" containerName="init" Mar 20 09:02:00 crc kubenswrapper[4971]: E0320 09:02:00.142942 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" containerName="init" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.142948 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" containerName="init" Mar 20 09:02:00 crc kubenswrapper[4971]: E0320 09:02:00.142981 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" containerName="dnsmasq-dns" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.142990 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" containerName="dnsmasq-dns" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.143180 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcca18bf-a5ae-44bd-9de4-23a9ebe7b021" containerName="dnsmasq-dns" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.143194 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="783dfe21-136b-4efa-ba14-f6fe5a9c4499" containerName="dnsmasq-dns" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.143916 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-6qfhw" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.145815 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.146111 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.146514 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.166491 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-6qfhw"] Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.293801 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5mz\" (UniqueName: \"kubernetes.io/projected/12b15227-d749-42df-ba0b-acad0bc05557-kube-api-access-mc5mz\") pod \"auto-csr-approver-29566622-6qfhw\" (UID: \"12b15227-d749-42df-ba0b-acad0bc05557\") " 
pod="openshift-infra/auto-csr-approver-29566622-6qfhw" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.396255 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5mz\" (UniqueName: \"kubernetes.io/projected/12b15227-d749-42df-ba0b-acad0bc05557-kube-api-access-mc5mz\") pod \"auto-csr-approver-29566622-6qfhw\" (UID: \"12b15227-d749-42df-ba0b-acad0bc05557\") " pod="openshift-infra/auto-csr-approver-29566622-6qfhw" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.416386 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5mz\" (UniqueName: \"kubernetes.io/projected/12b15227-d749-42df-ba0b-acad0bc05557-kube-api-access-mc5mz\") pod \"auto-csr-approver-29566622-6qfhw\" (UID: \"12b15227-d749-42df-ba0b-acad0bc05557\") " pod="openshift-infra/auto-csr-approver-29566622-6qfhw" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.471412 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-6qfhw" Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.933686 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-6qfhw"] Mar 20 09:02:00 crc kubenswrapper[4971]: I0320 09:02:00.955301 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:02:01 crc kubenswrapper[4971]: I0320 09:02:01.424547 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-6qfhw" event={"ID":"12b15227-d749-42df-ba0b-acad0bc05557","Type":"ContainerStarted","Data":"98f0075f8ce6222d9181549699a77070b942eb03d213da94f038b599504977f6"} Mar 20 09:02:03 crc kubenswrapper[4971]: I0320 09:02:03.446828 4971 generic.go:334] "Generic (PLEG): container finished" podID="12b15227-d749-42df-ba0b-acad0bc05557" containerID="1033e4cd2cbf5412dc93fdb7b7eb501d71c34eeeae84f23fce2832c2cf9ac8fc" exitCode=0 Mar 
20 09:02:03 crc kubenswrapper[4971]: I0320 09:02:03.446960 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-6qfhw" event={"ID":"12b15227-d749-42df-ba0b-acad0bc05557","Type":"ContainerDied","Data":"1033e4cd2cbf5412dc93fdb7b7eb501d71c34eeeae84f23fce2832c2cf9ac8fc"} Mar 20 09:02:04 crc kubenswrapper[4971]: I0320 09:02:04.793724 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-6qfhw" Mar 20 09:02:04 crc kubenswrapper[4971]: I0320 09:02:04.801292 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc5mz\" (UniqueName: \"kubernetes.io/projected/12b15227-d749-42df-ba0b-acad0bc05557-kube-api-access-mc5mz\") pod \"12b15227-d749-42df-ba0b-acad0bc05557\" (UID: \"12b15227-d749-42df-ba0b-acad0bc05557\") " Mar 20 09:02:04 crc kubenswrapper[4971]: I0320 09:02:04.809160 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b15227-d749-42df-ba0b-acad0bc05557-kube-api-access-mc5mz" (OuterVolumeSpecName: "kube-api-access-mc5mz") pod "12b15227-d749-42df-ba0b-acad0bc05557" (UID: "12b15227-d749-42df-ba0b-acad0bc05557"). InnerVolumeSpecName "kube-api-access-mc5mz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:02:04 crc kubenswrapper[4971]: I0320 09:02:04.903835 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc5mz\" (UniqueName: \"kubernetes.io/projected/12b15227-d749-42df-ba0b-acad0bc05557-kube-api-access-mc5mz\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:05 crc kubenswrapper[4971]: I0320 09:02:05.467990 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-6qfhw" event={"ID":"12b15227-d749-42df-ba0b-acad0bc05557","Type":"ContainerDied","Data":"98f0075f8ce6222d9181549699a77070b942eb03d213da94f038b599504977f6"} Mar 20 09:02:05 crc kubenswrapper[4971]: I0320 09:02:05.468036 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98f0075f8ce6222d9181549699a77070b942eb03d213da94f038b599504977f6" Mar 20 09:02:05 crc kubenswrapper[4971]: I0320 09:02:05.468070 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-6qfhw" Mar 20 09:02:05 crc kubenswrapper[4971]: I0320 09:02:05.869531 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566616-f5hbx"] Mar 20 09:02:05 crc kubenswrapper[4971]: I0320 09:02:05.880319 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566616-f5hbx"] Mar 20 09:02:06 crc kubenswrapper[4971]: I0320 09:02:06.752779 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d320012-274b-452a-92d8-a4cc549704cf" path="/var/lib/kubelet/pods/7d320012-274b-452a-92d8-a4cc549704cf/volumes" Mar 20 09:02:11 crc kubenswrapper[4971]: I0320 09:02:11.962256 4971 scope.go:117] "RemoveContainer" containerID="aa5baf9afd5b7d1411c3fc977d9855538f13f336bb7e4b82ac9bc989cc27b547" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.032077 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd"] Mar 20 09:02:12 crc kubenswrapper[4971]: E0320 09:02:12.032861 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b15227-d749-42df-ba0b-acad0bc05557" containerName="oc" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.032896 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b15227-d749-42df-ba0b-acad0bc05557" containerName="oc" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.033229 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b15227-d749-42df-ba0b-acad0bc05557" containerName="oc" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.034430 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.040182 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.041168 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.041640 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.041845 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.056453 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg"] Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.058480 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.061265 4971 scope.go:117] "RemoveContainer" containerID="24e4e7ddba7159ae28607f81be76f7f328af812dac3ad28b92a91d885b95ed44" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.061561 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.062269 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.071777 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd"] Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.085685 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg"] Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.113326 4971 scope.go:117] "RemoveContainer" containerID="31dc4ef88a1409cbba7b1e7cdc7efcdd59fa94f13bfae269cce863ffd1f1b2ac" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.154457 4971 scope.go:117] "RemoveContainer" containerID="06d38ec29fd6ba7b896188c16b18091b32c3a892967cc4831fb7d355a89ea790" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.162205 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.162267 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.162454 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.162707 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.162922 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.162980 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5dnr\" (UniqueName: \"kubernetes.io/projected/9373f38e-bed2-4e2e-b4de-0ec24546a79c-kube-api-access-r5dnr\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.163076 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.163164 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.163310 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxfxv\" (UniqueName: \"kubernetes.io/projected/d8021a6e-4275-43c1-9054-97946bdec61f-kube-api-access-wxfxv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265411 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wxfxv\" (UniqueName: \"kubernetes.io/projected/d8021a6e-4275-43c1-9054-97946bdec61f-kube-api-access-wxfxv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265469 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265509 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265555 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265624 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ssh-key-openstack-cell1\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265677 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265699 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5dnr\" (UniqueName: \"kubernetes.io/projected/9373f38e-bed2-4e2e-b4de-0ec24546a79c-kube-api-access-r5dnr\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265717 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.265743 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: 
\"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.275520 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.276077 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.289222 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.289367 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.289511 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.289925 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.290235 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.293696 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxfxv\" (UniqueName: \"kubernetes.io/projected/d8021a6e-4275-43c1-9054-97946bdec61f-kube-api-access-wxfxv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.296806 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5dnr\" (UniqueName: \"kubernetes.io/projected/9373f38e-bed2-4e2e-b4de-0ec24546a79c-kube-api-access-r5dnr\") pod \"pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.423552 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.518270 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.736796 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:02:12 crc kubenswrapper[4971]: E0320 09:02:12.739929 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:02:12 crc kubenswrapper[4971]: I0320 09:02:12.939265 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd"] Mar 20 09:02:13 crc kubenswrapper[4971]: W0320 09:02:13.033900 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9373f38e_bed2_4e2e_b4de_0ec24546a79c.slice/crio-6a3220967787e516f11e797a5b80bee229eb065b430858a4aba8067f6a5840d8 WatchSource:0}: Error finding container 6a3220967787e516f11e797a5b80bee229eb065b430858a4aba8067f6a5840d8: Status 404 returned error can't find the container with id 6a3220967787e516f11e797a5b80bee229eb065b430858a4aba8067f6a5840d8 Mar 20 09:02:13 crc kubenswrapper[4971]: I0320 09:02:13.034510 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg"] Mar 20 09:02:13 crc kubenswrapper[4971]: I0320 09:02:13.563186 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" event={"ID":"9373f38e-bed2-4e2e-b4de-0ec24546a79c","Type":"ContainerStarted","Data":"6a3220967787e516f11e797a5b80bee229eb065b430858a4aba8067f6a5840d8"} Mar 20 09:02:13 crc kubenswrapper[4971]: I0320 09:02:13.564539 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" event={"ID":"d8021a6e-4275-43c1-9054-97946bdec61f","Type":"ContainerStarted","Data":"0b207824df22ea80967fb1159b8911fe8fe8991192c2ebcb276b77a9c047b7ba"} Mar 20 09:02:23 crc kubenswrapper[4971]: I0320 09:02:23.666782 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" event={"ID":"d8021a6e-4275-43c1-9054-97946bdec61f","Type":"ContainerStarted","Data":"a999df733af9576303761e3dfc43f50aca54c4058c4a8973b0ca097c589d5ab6"} Mar 20 09:02:23 crc kubenswrapper[4971]: I0320 09:02:23.670218 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" 
event={"ID":"9373f38e-bed2-4e2e-b4de-0ec24546a79c","Type":"ContainerStarted","Data":"e7ab89d5533f31ec372532793be3a981e00f50d37d85f3eafacc83301c752809"} Mar 20 09:02:23 crc kubenswrapper[4971]: I0320 09:02:23.694480 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" podStartSLOduration=3.2286788140000002 podStartE2EDuration="12.694463018s" podCreationTimestamp="2026-03-20 09:02:11 +0000 UTC" firstStartedPulling="2026-03-20 09:02:12.936202216 +0000 UTC m=+7954.916076354" lastFinishedPulling="2026-03-20 09:02:22.40198642 +0000 UTC m=+7964.381860558" observedRunningTime="2026-03-20 09:02:23.681450238 +0000 UTC m=+7965.661324376" watchObservedRunningTime="2026-03-20 09:02:23.694463018 +0000 UTC m=+7965.674337156" Mar 20 09:02:26 crc kubenswrapper[4971]: I0320 09:02:26.020986 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" podStartSLOduration=4.669335652 podStartE2EDuration="14.020957018s" podCreationTimestamp="2026-03-20 09:02:12 +0000 UTC" firstStartedPulling="2026-03-20 09:02:13.036504602 +0000 UTC m=+7955.016378750" lastFinishedPulling="2026-03-20 09:02:22.388125978 +0000 UTC m=+7964.368000116" observedRunningTime="2026-03-20 09:02:23.708803153 +0000 UTC m=+7965.688677311" watchObservedRunningTime="2026-03-20 09:02:26.020957018 +0000 UTC m=+7968.000831196" Mar 20 09:02:26 crc kubenswrapper[4971]: I0320 09:02:26.031094 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f82-account-create-update-x85jj"] Mar 20 09:02:26 crc kubenswrapper[4971]: I0320 09:02:26.039458 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nh6zs"] Mar 20 09:02:26 crc kubenswrapper[4971]: I0320 09:02:26.047740 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9f82-account-create-update-x85jj"] Mar 20 
09:02:26 crc kubenswrapper[4971]: I0320 09:02:26.054920 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nh6zs"] Mar 20 09:02:26 crc kubenswrapper[4971]: I0320 09:02:26.732769 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:02:26 crc kubenswrapper[4971]: E0320 09:02:26.733271 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:02:26 crc kubenswrapper[4971]: I0320 09:02:26.746894 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbb5e33-a994-47a2-8574-a19b8220e0be" path="/var/lib/kubelet/pods/8fbb5e33-a994-47a2-8574-a19b8220e0be/volumes" Mar 20 09:02:26 crc kubenswrapper[4971]: I0320 09:02:26.747978 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca1bba8-bc6c-4454-afab-398104d09e1e" path="/var/lib/kubelet/pods/eca1bba8-bc6c-4454-afab-398104d09e1e/volumes" Mar 20 09:02:31 crc kubenswrapper[4971]: E0320 09:02:31.989841 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9373f38e_bed2_4e2e_b4de_0ec24546a79c.slice/crio-e7ab89d5533f31ec372532793be3a981e00f50d37d85f3eafacc83301c752809.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9373f38e_bed2_4e2e_b4de_0ec24546a79c.slice/crio-conmon-e7ab89d5533f31ec372532793be3a981e00f50d37d85f3eafacc83301c752809.scope\": RecentStats: unable to find data in memory cache]" Mar 20 
09:02:32 crc kubenswrapper[4971]: I0320 09:02:32.761163 4971 generic.go:334] "Generic (PLEG): container finished" podID="d8021a6e-4275-43c1-9054-97946bdec61f" containerID="a999df733af9576303761e3dfc43f50aca54c4058c4a8973b0ca097c589d5ab6" exitCode=0 Mar 20 09:02:32 crc kubenswrapper[4971]: I0320 09:02:32.761230 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" event={"ID":"d8021a6e-4275-43c1-9054-97946bdec61f","Type":"ContainerDied","Data":"a999df733af9576303761e3dfc43f50aca54c4058c4a8973b0ca097c589d5ab6"} Mar 20 09:02:32 crc kubenswrapper[4971]: I0320 09:02:32.762915 4971 generic.go:334] "Generic (PLEG): container finished" podID="9373f38e-bed2-4e2e-b4de-0ec24546a79c" containerID="e7ab89d5533f31ec372532793be3a981e00f50d37d85f3eafacc83301c752809" exitCode=0 Mar 20 09:02:32 crc kubenswrapper[4971]: I0320 09:02:32.762939 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" event={"ID":"9373f38e-bed2-4e2e-b4de-0ec24546a79c","Type":"ContainerDied","Data":"e7ab89d5533f31ec372532793be3a981e00f50d37d85f3eafacc83301c752809"} Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.264087 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.310633 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ceph\") pod \"d8021a6e-4275-43c1-9054-97946bdec61f\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.310744 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-inventory\") pod \"d8021a6e-4275-43c1-9054-97946bdec61f\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.310830 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ssh-key-openstack-cell1\") pod \"d8021a6e-4275-43c1-9054-97946bdec61f\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.310847 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxfxv\" (UniqueName: \"kubernetes.io/projected/d8021a6e-4275-43c1-9054-97946bdec61f-kube-api-access-wxfxv\") pod \"d8021a6e-4275-43c1-9054-97946bdec61f\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.310870 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-pre-adoption-validation-combined-ca-bundle\") pod \"d8021a6e-4275-43c1-9054-97946bdec61f\" (UID: \"d8021a6e-4275-43c1-9054-97946bdec61f\") " Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.316203 4971 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8021a6e-4275-43c1-9054-97946bdec61f-kube-api-access-wxfxv" (OuterVolumeSpecName: "kube-api-access-wxfxv") pod "d8021a6e-4275-43c1-9054-97946bdec61f" (UID: "d8021a6e-4275-43c1-9054-97946bdec61f"). InnerVolumeSpecName "kube-api-access-wxfxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.320498 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "d8021a6e-4275-43c1-9054-97946bdec61f" (UID: "d8021a6e-4275-43c1-9054-97946bdec61f"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.330272 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ceph" (OuterVolumeSpecName: "ceph") pod "d8021a6e-4275-43c1-9054-97946bdec61f" (UID: "d8021a6e-4275-43c1-9054-97946bdec61f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.339852 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d8021a6e-4275-43c1-9054-97946bdec61f" (UID: "d8021a6e-4275-43c1-9054-97946bdec61f"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.339985 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-inventory" (OuterVolumeSpecName: "inventory") pod "d8021a6e-4275-43c1-9054-97946bdec61f" (UID: "d8021a6e-4275-43c1-9054-97946bdec61f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.413262 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.413292 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxfxv\" (UniqueName: \"kubernetes.io/projected/d8021a6e-4275-43c1-9054-97946bdec61f-kube-api-access-wxfxv\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.413303 4971 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.413315 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.413326 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8021a6e-4275-43c1-9054-97946bdec61f-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.788635 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" event={"ID":"d8021a6e-4275-43c1-9054-97946bdec61f","Type":"ContainerDied","Data":"0b207824df22ea80967fb1159b8911fe8fe8991192c2ebcb276b77a9c047b7ba"} Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.789034 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b207824df22ea80967fb1159b8911fe8fe8991192c2ebcb276b77a9c047b7ba" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.788717 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd" Mar 20 09:02:34 crc kubenswrapper[4971]: I0320 09:02:34.909191 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.030107 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-ssh-key-openstack-networker\") pod \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.030315 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-inventory\") pod \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.030387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5dnr\" (UniqueName: \"kubernetes.io/projected/9373f38e-bed2-4e2e-b4de-0ec24546a79c-kube-api-access-r5dnr\") pod \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " Mar 20 09:02:35 
crc kubenswrapper[4971]: I0320 09:02:35.030449 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-pre-adoption-validation-combined-ca-bundle\") pod \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\" (UID: \"9373f38e-bed2-4e2e-b4de-0ec24546a79c\") " Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.034442 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "9373f38e-bed2-4e2e-b4de-0ec24546a79c" (UID: "9373f38e-bed2-4e2e-b4de-0ec24546a79c"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.034584 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9373f38e-bed2-4e2e-b4de-0ec24546a79c-kube-api-access-r5dnr" (OuterVolumeSpecName: "kube-api-access-r5dnr") pod "9373f38e-bed2-4e2e-b4de-0ec24546a79c" (UID: "9373f38e-bed2-4e2e-b4de-0ec24546a79c"). InnerVolumeSpecName "kube-api-access-r5dnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.054791 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "9373f38e-bed2-4e2e-b4de-0ec24546a79c" (UID: "9373f38e-bed2-4e2e-b4de-0ec24546a79c"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.056827 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-inventory" (OuterVolumeSpecName: "inventory") pod "9373f38e-bed2-4e2e-b4de-0ec24546a79c" (UID: "9373f38e-bed2-4e2e-b4de-0ec24546a79c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.132463 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.132491 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.132500 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5dnr\" (UniqueName: \"kubernetes.io/projected/9373f38e-bed2-4e2e-b4de-0ec24546a79c-kube-api-access-r5dnr\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.132509 4971 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9373f38e-bed2-4e2e-b4de-0ec24546a79c-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.809507 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" event={"ID":"9373f38e-bed2-4e2e-b4de-0ec24546a79c","Type":"ContainerDied","Data":"6a3220967787e516f11e797a5b80bee229eb065b430858a4aba8067f6a5840d8"} Mar 20 09:02:35 crc 
kubenswrapper[4971]: I0320 09:02:35.810035 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a3220967787e516f11e797a5b80bee229eb065b430858a4aba8067f6a5840d8" Mar 20 09:02:35 crc kubenswrapper[4971]: I0320 09:02:35.809700 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg" Mar 20 09:02:37 crc kubenswrapper[4971]: I0320 09:02:37.733394 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:02:37 crc kubenswrapper[4971]: E0320 09:02:37.734058 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.771101 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb"] Mar 20 09:02:39 crc kubenswrapper[4971]: E0320 09:02:39.772049 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8021a6e-4275-43c1-9054-97946bdec61f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.772080 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8021a6e-4275-43c1-9054-97946bdec61f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 20 09:02:39 crc kubenswrapper[4971]: E0320 09:02:39.772147 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9373f38e-bed2-4e2e-b4de-0ec24546a79c" 
containerName="pre-adoption-validation-openstack-pre-adoption-opensta-6af7d7e4" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.772169 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9373f38e-bed2-4e2e-b4de-0ec24546a79c" containerName="pre-adoption-validation-openstack-pre-adoption-opensta-6af7d7e4" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.772525 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9373f38e-bed2-4e2e-b4de-0ec24546a79c" containerName="pre-adoption-validation-openstack-pre-adoption-opensta-6af7d7e4" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.772565 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8021a6e-4275-43c1-9054-97946bdec61f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.773988 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.776430 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.778478 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.780497 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.782221 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.786933 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb"] Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.828293 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5blpg\" (UniqueName: \"kubernetes.io/projected/3f14a6df-ffd2-484e-9097-4e30054d8f42-kube-api-access-5blpg\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.828368 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.828479 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.828571 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.828607 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.932562 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.932644 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.932730 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5blpg\" (UniqueName: \"kubernetes.io/projected/3f14a6df-ffd2-484e-9097-4e30054d8f42-kube-api-access-5blpg\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.932754 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.932810 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.940823 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd"] Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.942210 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.973470 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.984574 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.990958 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.991555 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:02:39 crc kubenswrapper[4971]: I0320 09:02:39.991817 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:39.999263 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.037315 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd"] Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.062703 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5blpg\" (UniqueName: \"kubernetes.io/projected/3f14a6df-ffd2-484e-9097-4e30054d8f42-kube-api-access-5blpg\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.108870 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.144275 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.144405 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.145682 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wzd\" (UniqueName: \"kubernetes.io/projected/004105ec-05ab-464c-ac28-5a3f5b8ef32c-kube-api-access-87wzd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.145747 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.247827 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.248251 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87wzd\" (UniqueName: \"kubernetes.io/projected/004105ec-05ab-464c-ac28-5a3f5b8ef32c-kube-api-access-87wzd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.248331 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.248543 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.253442 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.255698 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.256751 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.267524 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wzd\" (UniqueName: \"kubernetes.io/projected/004105ec-05ab-464c-ac28-5a3f5b8ef32c-kube-api-access-87wzd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.530710 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.748914 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb"] Mar 20 09:02:40 crc kubenswrapper[4971]: I0320 09:02:40.930821 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" event={"ID":"3f14a6df-ffd2-484e-9097-4e30054d8f42","Type":"ContainerStarted","Data":"df34ad1fefe48460efc93bda8c8862101cb66fd09e3ebfa9c2c33cb05be71dde"} Mar 20 09:02:41 crc kubenswrapper[4971]: W0320 09:02:41.054053 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod004105ec_05ab_464c_ac28_5a3f5b8ef32c.slice/crio-b08f8eb75f988c8819930533921daca7b58fe312e79fab4ea153f9c5308ee5eb WatchSource:0}: Error finding container b08f8eb75f988c8819930533921daca7b58fe312e79fab4ea153f9c5308ee5eb: Status 404 returned error can't find the container with id b08f8eb75f988c8819930533921daca7b58fe312e79fab4ea153f9c5308ee5eb Mar 20 09:02:41 crc kubenswrapper[4971]: I0320 09:02:41.054193 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd"] Mar 20 09:02:41 crc kubenswrapper[4971]: I0320 09:02:41.942901 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" event={"ID":"004105ec-05ab-464c-ac28-5a3f5b8ef32c","Type":"ContainerStarted","Data":"cd6d56ac15c16e99bd5a47b5ff9b2418b0471e9a898a4431f00ade020accbf47"} Mar 20 09:02:41 crc kubenswrapper[4971]: I0320 09:02:41.943210 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" 
event={"ID":"004105ec-05ab-464c-ac28-5a3f5b8ef32c","Type":"ContainerStarted","Data":"b08f8eb75f988c8819930533921daca7b58fe312e79fab4ea153f9c5308ee5eb"} Mar 20 09:02:41 crc kubenswrapper[4971]: I0320 09:02:41.944637 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" event={"ID":"3f14a6df-ffd2-484e-9097-4e30054d8f42","Type":"ContainerStarted","Data":"37bd3c4080897c9ef7fd21fdab6f33c8be131dba1c497294b6cf001d3212cca1"} Mar 20 09:02:41 crc kubenswrapper[4971]: I0320 09:02:41.963892 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" podStartSLOduration=2.464361831 podStartE2EDuration="2.963873205s" podCreationTimestamp="2026-03-20 09:02:39 +0000 UTC" firstStartedPulling="2026-03-20 09:02:41.05701319 +0000 UTC m=+7983.036887328" lastFinishedPulling="2026-03-20 09:02:41.556524554 +0000 UTC m=+7983.536398702" observedRunningTime="2026-03-20 09:02:41.959022308 +0000 UTC m=+7983.938896456" watchObservedRunningTime="2026-03-20 09:02:41.963873205 +0000 UTC m=+7983.943747343" Mar 20 09:02:42 crc kubenswrapper[4971]: I0320 09:02:42.012129 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" podStartSLOduration=2.088404001 podStartE2EDuration="3.012101948s" podCreationTimestamp="2026-03-20 09:02:39 +0000 UTC" firstStartedPulling="2026-03-20 09:02:40.751795421 +0000 UTC m=+7982.731669559" lastFinishedPulling="2026-03-20 09:02:41.675493368 +0000 UTC m=+7983.655367506" observedRunningTime="2026-03-20 09:02:41.974648257 +0000 UTC m=+7983.954522405" watchObservedRunningTime="2026-03-20 09:02:42.012101948 +0000 UTC m=+7983.991976096" Mar 20 09:02:52 crc kubenswrapper[4971]: I0320 09:02:52.732035 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:02:52 crc 
kubenswrapper[4971]: E0320 09:02:52.732675 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:03:06 crc kubenswrapper[4971]: I0320 09:03:06.733387 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:03:06 crc kubenswrapper[4971]: E0320 09:03:06.734267 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:03:12 crc kubenswrapper[4971]: I0320 09:03:12.339800 4971 scope.go:117] "RemoveContainer" containerID="81b8470c91cb51a284777f066b129acbd03ea2d7d0a225be6953968a1de86fdb" Mar 20 09:03:12 crc kubenswrapper[4971]: I0320 09:03:12.377230 4971 scope.go:117] "RemoveContainer" containerID="77ba289c4840ef807399fc320c939664efefc46aa001ddb3e09a65c214ae7370" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.014024 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c24sd"] Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.016918 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.022505 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c24sd"] Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.064115 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tt6jv"] Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.078074 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-utilities\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.078162 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqqdc\" (UniqueName: \"kubernetes.io/projected/4a2d354a-112d-4be4-bf79-63cee222cc3b-kube-api-access-fqqdc\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.078195 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-catalog-content\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.084642 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tt6jv"] Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.180641 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-utilities\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.180738 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqqdc\" (UniqueName: \"kubernetes.io/projected/4a2d354a-112d-4be4-bf79-63cee222cc3b-kube-api-access-fqqdc\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.180766 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-catalog-content\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.181393 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-catalog-content\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.181640 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-utilities\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.203672 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqqdc\" (UniqueName: 
\"kubernetes.io/projected/4a2d354a-112d-4be4-bf79-63cee222cc3b-kube-api-access-fqqdc\") pod \"certified-operators-c24sd\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.350158 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:13 crc kubenswrapper[4971]: I0320 09:03:13.785783 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c24sd"] Mar 20 09:03:14 crc kubenswrapper[4971]: I0320 09:03:14.726389 4971 generic.go:334] "Generic (PLEG): container finished" podID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerID="69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2" exitCode=0 Mar 20 09:03:14 crc kubenswrapper[4971]: I0320 09:03:14.726434 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24sd" event={"ID":"4a2d354a-112d-4be4-bf79-63cee222cc3b","Type":"ContainerDied","Data":"69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2"} Mar 20 09:03:14 crc kubenswrapper[4971]: I0320 09:03:14.726461 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24sd" event={"ID":"4a2d354a-112d-4be4-bf79-63cee222cc3b","Type":"ContainerStarted","Data":"821104a81fd7334e0e7c149e5f81a99b47b98d0b0e9093aacd0f3cd879f9b111"} Mar 20 09:03:14 crc kubenswrapper[4971]: I0320 09:03:14.744327 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="993ffc7e-64d4-455b-84ba-134ffefa83ea" path="/var/lib/kubelet/pods/993ffc7e-64d4-455b-84ba-134ffefa83ea/volumes" Mar 20 09:03:15 crc kubenswrapper[4971]: I0320 09:03:15.736787 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24sd" 
event={"ID":"4a2d354a-112d-4be4-bf79-63cee222cc3b","Type":"ContainerStarted","Data":"afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43"} Mar 20 09:03:17 crc kubenswrapper[4971]: I0320 09:03:17.733159 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:03:17 crc kubenswrapper[4971]: E0320 09:03:17.733944 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:03:17 crc kubenswrapper[4971]: I0320 09:03:17.759720 4971 generic.go:334] "Generic (PLEG): container finished" podID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerID="afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43" exitCode=0 Mar 20 09:03:17 crc kubenswrapper[4971]: I0320 09:03:17.759787 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24sd" event={"ID":"4a2d354a-112d-4be4-bf79-63cee222cc3b","Type":"ContainerDied","Data":"afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43"} Mar 20 09:03:18 crc kubenswrapper[4971]: I0320 09:03:18.772474 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24sd" event={"ID":"4a2d354a-112d-4be4-bf79-63cee222cc3b","Type":"ContainerStarted","Data":"fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc"} Mar 20 09:03:18 crc kubenswrapper[4971]: I0320 09:03:18.798303 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c24sd" podStartSLOduration=3.35894607 podStartE2EDuration="6.798284156s" 
podCreationTimestamp="2026-03-20 09:03:12 +0000 UTC" firstStartedPulling="2026-03-20 09:03:14.729062825 +0000 UTC m=+8016.708936963" lastFinishedPulling="2026-03-20 09:03:18.168400901 +0000 UTC m=+8020.148275049" observedRunningTime="2026-03-20 09:03:18.795748939 +0000 UTC m=+8020.775623097" watchObservedRunningTime="2026-03-20 09:03:18.798284156 +0000 UTC m=+8020.778158294" Mar 20 09:03:23 crc kubenswrapper[4971]: I0320 09:03:23.350420 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:23 crc kubenswrapper[4971]: I0320 09:03:23.351296 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:23 crc kubenswrapper[4971]: I0320 09:03:23.405142 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:23 crc kubenswrapper[4971]: I0320 09:03:23.884219 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:23 crc kubenswrapper[4971]: I0320 09:03:23.957238 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c24sd"] Mar 20 09:03:25 crc kubenswrapper[4971]: I0320 09:03:25.850582 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c24sd" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerName="registry-server" containerID="cri-o://fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc" gracePeriod=2 Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.317671 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.395762 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-utilities\") pod \"4a2d354a-112d-4be4-bf79-63cee222cc3b\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.395891 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-catalog-content\") pod \"4a2d354a-112d-4be4-bf79-63cee222cc3b\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.396043 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqqdc\" (UniqueName: \"kubernetes.io/projected/4a2d354a-112d-4be4-bf79-63cee222cc3b-kube-api-access-fqqdc\") pod \"4a2d354a-112d-4be4-bf79-63cee222cc3b\" (UID: \"4a2d354a-112d-4be4-bf79-63cee222cc3b\") " Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.396539 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-utilities" (OuterVolumeSpecName: "utilities") pod "4a2d354a-112d-4be4-bf79-63cee222cc3b" (UID: "4a2d354a-112d-4be4-bf79-63cee222cc3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.403562 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2d354a-112d-4be4-bf79-63cee222cc3b-kube-api-access-fqqdc" (OuterVolumeSpecName: "kube-api-access-fqqdc") pod "4a2d354a-112d-4be4-bf79-63cee222cc3b" (UID: "4a2d354a-112d-4be4-bf79-63cee222cc3b"). InnerVolumeSpecName "kube-api-access-fqqdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.443790 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a2d354a-112d-4be4-bf79-63cee222cc3b" (UID: "4a2d354a-112d-4be4-bf79-63cee222cc3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.498246 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqqdc\" (UniqueName: \"kubernetes.io/projected/4a2d354a-112d-4be4-bf79-63cee222cc3b-kube-api-access-fqqdc\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.498280 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.498290 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2d354a-112d-4be4-bf79-63cee222cc3b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.864945 4971 generic.go:334] "Generic (PLEG): container finished" podID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerID="fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc" exitCode=0 Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.865049 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c24sd" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.865046 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24sd" event={"ID":"4a2d354a-112d-4be4-bf79-63cee222cc3b","Type":"ContainerDied","Data":"fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc"} Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.865934 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c24sd" event={"ID":"4a2d354a-112d-4be4-bf79-63cee222cc3b","Type":"ContainerDied","Data":"821104a81fd7334e0e7c149e5f81a99b47b98d0b0e9093aacd0f3cd879f9b111"} Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.865983 4971 scope.go:117] "RemoveContainer" containerID="fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.899700 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c24sd"] Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.900839 4971 scope.go:117] "RemoveContainer" containerID="afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.910268 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c24sd"] Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.943689 4971 scope.go:117] "RemoveContainer" containerID="69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.973895 4971 scope.go:117] "RemoveContainer" containerID="fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc" Mar 20 09:03:26 crc kubenswrapper[4971]: E0320 09:03:26.974890 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc\": container with ID starting with fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc not found: ID does not exist" containerID="fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.974952 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc"} err="failed to get container status \"fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc\": rpc error: code = NotFound desc = could not find container \"fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc\": container with ID starting with fb2431cdbcf6ad085c096e367b310c95506fc1044f8eb8201c8d67f38a23a8dc not found: ID does not exist" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.974990 4971 scope.go:117] "RemoveContainer" containerID="afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43" Mar 20 09:03:26 crc kubenswrapper[4971]: E0320 09:03:26.975349 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43\": container with ID starting with afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43 not found: ID does not exist" containerID="afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.975399 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43"} err="failed to get container status \"afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43\": rpc error: code = NotFound desc = could not find container \"afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43\": container with ID 
starting with afb73543d13a387da58fed9bccb78e1c1fed1092998d1ee08eb4cadc457efa43 not found: ID does not exist" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.975431 4971 scope.go:117] "RemoveContainer" containerID="69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2" Mar 20 09:03:26 crc kubenswrapper[4971]: E0320 09:03:26.975776 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2\": container with ID starting with 69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2 not found: ID does not exist" containerID="69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2" Mar 20 09:03:26 crc kubenswrapper[4971]: I0320 09:03:26.975801 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2"} err="failed to get container status \"69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2\": rpc error: code = NotFound desc = could not find container \"69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2\": container with ID starting with 69f68286f3d61284969ff7178b9daeec28f8a1bd13431dc93728534786f989b2 not found: ID does not exist" Mar 20 09:03:28 crc kubenswrapper[4971]: I0320 09:03:28.746005 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:03:28 crc kubenswrapper[4971]: I0320 09:03:28.755171 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" path="/var/lib/kubelet/pods/4a2d354a-112d-4be4-bf79-63cee222cc3b/volumes" Mar 20 09:03:29 crc kubenswrapper[4971]: I0320 09:03:29.910162 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"ba1ddb4578d4cb1dc77ee5b85f04ad39f4f2172118cb49b973658b5d06d4d7e6"} Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.163697 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566624-l8469"] Mar 20 09:04:00 crc kubenswrapper[4971]: E0320 09:04:00.165175 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerName="extract-content" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.165202 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerName="extract-content" Mar 20 09:04:00 crc kubenswrapper[4971]: E0320 09:04:00.165242 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerName="registry-server" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.165256 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerName="registry-server" Mar 20 09:04:00 crc kubenswrapper[4971]: E0320 09:04:00.165293 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerName="extract-utilities" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.165307 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerName="extract-utilities" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.165679 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2d354a-112d-4be4-bf79-63cee222cc3b" containerName="registry-server" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.166584 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-l8469" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.168902 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.169293 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.177078 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-l8469"] Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.179961 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.317719 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvd2h\" (UniqueName: \"kubernetes.io/projected/f2cbaa6d-6665-4d54-a3e9-8356327e8814-kube-api-access-lvd2h\") pod \"auto-csr-approver-29566624-l8469\" (UID: \"f2cbaa6d-6665-4d54-a3e9-8356327e8814\") " pod="openshift-infra/auto-csr-approver-29566624-l8469" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.420413 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvd2h\" (UniqueName: \"kubernetes.io/projected/f2cbaa6d-6665-4d54-a3e9-8356327e8814-kube-api-access-lvd2h\") pod \"auto-csr-approver-29566624-l8469\" (UID: \"f2cbaa6d-6665-4d54-a3e9-8356327e8814\") " pod="openshift-infra/auto-csr-approver-29566624-l8469" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.440969 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvd2h\" (UniqueName: \"kubernetes.io/projected/f2cbaa6d-6665-4d54-a3e9-8356327e8814-kube-api-access-lvd2h\") pod \"auto-csr-approver-29566624-l8469\" (UID: \"f2cbaa6d-6665-4d54-a3e9-8356327e8814\") " 
pod="openshift-infra/auto-csr-approver-29566624-l8469" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.497958 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-l8469" Mar 20 09:04:00 crc kubenswrapper[4971]: I0320 09:04:00.960267 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-l8469"] Mar 20 09:04:01 crc kubenswrapper[4971]: I0320 09:04:01.218421 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-l8469" event={"ID":"f2cbaa6d-6665-4d54-a3e9-8356327e8814","Type":"ContainerStarted","Data":"3d78b31da6e06a6627778943216f4c1db7784f2963c4ee0e8f3c681cd55d93a0"} Mar 20 09:04:03 crc kubenswrapper[4971]: I0320 09:04:03.238323 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-l8469" event={"ID":"f2cbaa6d-6665-4d54-a3e9-8356327e8814","Type":"ContainerStarted","Data":"43740f103d019ea35f569908920f5230c2fef27d15a89152e4a753894f99fda1"} Mar 20 09:04:04 crc kubenswrapper[4971]: I0320 09:04:04.251904 4971 generic.go:334] "Generic (PLEG): container finished" podID="f2cbaa6d-6665-4d54-a3e9-8356327e8814" containerID="43740f103d019ea35f569908920f5230c2fef27d15a89152e4a753894f99fda1" exitCode=0 Mar 20 09:04:04 crc kubenswrapper[4971]: I0320 09:04:04.251999 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-l8469" event={"ID":"f2cbaa6d-6665-4d54-a3e9-8356327e8814","Type":"ContainerDied","Data":"43740f103d019ea35f569908920f5230c2fef27d15a89152e4a753894f99fda1"} Mar 20 09:04:05 crc kubenswrapper[4971]: I0320 09:04:05.608730 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-l8469" Mar 20 09:04:05 crc kubenswrapper[4971]: I0320 09:04:05.755566 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvd2h\" (UniqueName: \"kubernetes.io/projected/f2cbaa6d-6665-4d54-a3e9-8356327e8814-kube-api-access-lvd2h\") pod \"f2cbaa6d-6665-4d54-a3e9-8356327e8814\" (UID: \"f2cbaa6d-6665-4d54-a3e9-8356327e8814\") " Mar 20 09:04:05 crc kubenswrapper[4971]: I0320 09:04:05.766955 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cbaa6d-6665-4d54-a3e9-8356327e8814-kube-api-access-lvd2h" (OuterVolumeSpecName: "kube-api-access-lvd2h") pod "f2cbaa6d-6665-4d54-a3e9-8356327e8814" (UID: "f2cbaa6d-6665-4d54-a3e9-8356327e8814"). InnerVolumeSpecName "kube-api-access-lvd2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:05 crc kubenswrapper[4971]: I0320 09:04:05.859592 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvd2h\" (UniqueName: \"kubernetes.io/projected/f2cbaa6d-6665-4d54-a3e9-8356327e8814-kube-api-access-lvd2h\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:06 crc kubenswrapper[4971]: I0320 09:04:06.270539 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-l8469" event={"ID":"f2cbaa6d-6665-4d54-a3e9-8356327e8814","Type":"ContainerDied","Data":"3d78b31da6e06a6627778943216f4c1db7784f2963c4ee0e8f3c681cd55d93a0"} Mar 20 09:04:06 crc kubenswrapper[4971]: I0320 09:04:06.270592 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d78b31da6e06a6627778943216f4c1db7784f2963c4ee0e8f3c681cd55d93a0" Mar 20 09:04:06 crc kubenswrapper[4971]: I0320 09:04:06.270601 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-l8469" Mar 20 09:04:06 crc kubenswrapper[4971]: I0320 09:04:06.676265 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566618-kfv8f"] Mar 20 09:04:06 crc kubenswrapper[4971]: I0320 09:04:06.684680 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566618-kfv8f"] Mar 20 09:04:06 crc kubenswrapper[4971]: I0320 09:04:06.752390 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f7b9ec-77f3-43f1-8b14-f6cdcc25e266" path="/var/lib/kubelet/pods/01f7b9ec-77f3-43f1-8b14-f6cdcc25e266/volumes" Mar 20 09:04:12 crc kubenswrapper[4971]: I0320 09:04:12.456229 4971 scope.go:117] "RemoveContainer" containerID="e6d4cbbd041a631af01e05256443dce237af6cb31b5bbf91ad15f99aa75fd18b" Mar 20 09:04:12 crc kubenswrapper[4971]: I0320 09:04:12.504707 4971 scope.go:117] "RemoveContainer" containerID="cb659b4d9bb5d6b0240f06da910addfcc1ce38b3748625de83ef55c78cfffca4" Mar 20 09:05:50 crc kubenswrapper[4971]: I0320 09:05:50.162665 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:05:50 crc kubenswrapper[4971]: I0320 09:05:50.163291 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.158541 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566626-n2qgp"] Mar 20 09:06:00 crc kubenswrapper[4971]: 
E0320 09:06:00.159592 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cbaa6d-6665-4d54-a3e9-8356327e8814" containerName="oc" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.159621 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cbaa6d-6665-4d54-a3e9-8356327e8814" containerName="oc" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.159804 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2cbaa6d-6665-4d54-a3e9-8356327e8814" containerName="oc" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.160566 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-n2qgp" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.161940 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.163212 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.163496 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.168013 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-n2qgp"] Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.252229 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzwhx\" (UniqueName: \"kubernetes.io/projected/5204b4e3-1a71-4ebb-9f95-9ca4edc4728a-kube-api-access-qzwhx\") pod \"auto-csr-approver-29566626-n2qgp\" (UID: \"5204b4e3-1a71-4ebb-9f95-9ca4edc4728a\") " pod="openshift-infra/auto-csr-approver-29566626-n2qgp" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.353362 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qzwhx\" (UniqueName: \"kubernetes.io/projected/5204b4e3-1a71-4ebb-9f95-9ca4edc4728a-kube-api-access-qzwhx\") pod \"auto-csr-approver-29566626-n2qgp\" (UID: \"5204b4e3-1a71-4ebb-9f95-9ca4edc4728a\") " pod="openshift-infra/auto-csr-approver-29566626-n2qgp" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.375589 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzwhx\" (UniqueName: \"kubernetes.io/projected/5204b4e3-1a71-4ebb-9f95-9ca4edc4728a-kube-api-access-qzwhx\") pod \"auto-csr-approver-29566626-n2qgp\" (UID: \"5204b4e3-1a71-4ebb-9f95-9ca4edc4728a\") " pod="openshift-infra/auto-csr-approver-29566626-n2qgp" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.479753 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-n2qgp" Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.928401 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-n2qgp"] Mar 20 09:06:00 crc kubenswrapper[4971]: I0320 09:06:00.966482 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-n2qgp" event={"ID":"5204b4e3-1a71-4ebb-9f95-9ca4edc4728a","Type":"ContainerStarted","Data":"306e13495cae48143622e885355e241c4b4f6a9f69666208924c20ad590086a3"} Mar 20 09:06:02 crc kubenswrapper[4971]: I0320 09:06:02.983501 4971 generic.go:334] "Generic (PLEG): container finished" podID="5204b4e3-1a71-4ebb-9f95-9ca4edc4728a" containerID="51474ca5240bdd3ffa85632a8b4d1b939f069832334df155523cb989c24e72ea" exitCode=0 Mar 20 09:06:02 crc kubenswrapper[4971]: I0320 09:06:02.983753 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-n2qgp" event={"ID":"5204b4e3-1a71-4ebb-9f95-9ca4edc4728a","Type":"ContainerDied","Data":"51474ca5240bdd3ffa85632a8b4d1b939f069832334df155523cb989c24e72ea"} Mar 20 09:06:04 crc kubenswrapper[4971]: I0320 
09:06:04.373187 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-n2qgp" Mar 20 09:06:04 crc kubenswrapper[4971]: I0320 09:06:04.532096 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzwhx\" (UniqueName: \"kubernetes.io/projected/5204b4e3-1a71-4ebb-9f95-9ca4edc4728a-kube-api-access-qzwhx\") pod \"5204b4e3-1a71-4ebb-9f95-9ca4edc4728a\" (UID: \"5204b4e3-1a71-4ebb-9f95-9ca4edc4728a\") " Mar 20 09:06:04 crc kubenswrapper[4971]: I0320 09:06:04.544347 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5204b4e3-1a71-4ebb-9f95-9ca4edc4728a-kube-api-access-qzwhx" (OuterVolumeSpecName: "kube-api-access-qzwhx") pod "5204b4e3-1a71-4ebb-9f95-9ca4edc4728a" (UID: "5204b4e3-1a71-4ebb-9f95-9ca4edc4728a"). InnerVolumeSpecName "kube-api-access-qzwhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:04 crc kubenswrapper[4971]: I0320 09:06:04.635182 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzwhx\" (UniqueName: \"kubernetes.io/projected/5204b4e3-1a71-4ebb-9f95-9ca4edc4728a-kube-api-access-qzwhx\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:05 crc kubenswrapper[4971]: I0320 09:06:05.003630 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-n2qgp" event={"ID":"5204b4e3-1a71-4ebb-9f95-9ca4edc4728a","Type":"ContainerDied","Data":"306e13495cae48143622e885355e241c4b4f6a9f69666208924c20ad590086a3"} Mar 20 09:06:05 crc kubenswrapper[4971]: I0320 09:06:05.003668 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306e13495cae48143622e885355e241c4b4f6a9f69666208924c20ad590086a3" Mar 20 09:06:05 crc kubenswrapper[4971]: I0320 09:06:05.003678 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-n2qgp" Mar 20 09:06:05 crc kubenswrapper[4971]: I0320 09:06:05.447961 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566620-9x5j8"] Mar 20 09:06:05 crc kubenswrapper[4971]: I0320 09:06:05.457729 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566620-9x5j8"] Mar 20 09:06:06 crc kubenswrapper[4971]: I0320 09:06:06.744680 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a4a91f-3933-4117-bf14-288067efbab3" path="/var/lib/kubelet/pods/e8a4a91f-3933-4117-bf14-288067efbab3/volumes" Mar 20 09:06:20 crc kubenswrapper[4971]: I0320 09:06:20.162762 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:06:20 crc kubenswrapper[4971]: I0320 09:06:20.163510 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:06:34 crc kubenswrapper[4971]: I0320 09:06:34.412879 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" podUID="fe139df5-ee59-46ea-a07d-6a995dccdc8e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.52:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:06:34 crc kubenswrapper[4971]: I0320 09:06:34.413736 4971 prober.go:107] "Probe failed" probeType="Liveness" 
pod="metallb-system/metallb-operator-webhook-server-b54b5b68f-hq9kg" podUID="fe139df5-ee59-46ea-a07d-6a995dccdc8e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.52:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:06:34 crc kubenswrapper[4971]: I0320 09:06:34.577473 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" podUID="1725120f-c2b6-4438-8dc0-0758d75b0ead" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:06:34 crc kubenswrapper[4971]: I0320 09:06:34.577564 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-2g4lc" podUID="1725120f-c2b6-4438-8dc0-0758d75b0ead" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:06:34 crc kubenswrapper[4971]: I0320 09:06:34.586822 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" podUID="f03668cf-d39b-496d-84cf-2f1c80162661" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.70:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:06:34 crc kubenswrapper[4971]: I0320 09:06:34.586900 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-z4rvv" podUID="f03668cf-d39b-496d-84cf-2f1c80162661" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.70:8081/readyz\": dial tcp 10.217.0.70:8081: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 20 09:06:50 crc 
kubenswrapper[4971]: I0320 09:06:50.162017 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:06:50 crc kubenswrapper[4971]: I0320 09:06:50.162650 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:06:50 crc kubenswrapper[4971]: I0320 09:06:50.162706 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:06:50 crc kubenswrapper[4971]: I0320 09:06:50.163538 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba1ddb4578d4cb1dc77ee5b85f04ad39f4f2172118cb49b973658b5d06d4d7e6"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:06:50 crc kubenswrapper[4971]: I0320 09:06:50.163621 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://ba1ddb4578d4cb1dc77ee5b85f04ad39f4f2172118cb49b973658b5d06d4d7e6" gracePeriod=600 Mar 20 09:06:51 crc kubenswrapper[4971]: I0320 09:06:51.018263 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" 
containerID="ba1ddb4578d4cb1dc77ee5b85f04ad39f4f2172118cb49b973658b5d06d4d7e6" exitCode=0 Mar 20 09:06:51 crc kubenswrapper[4971]: I0320 09:06:51.018334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"ba1ddb4578d4cb1dc77ee5b85f04ad39f4f2172118cb49b973658b5d06d4d7e6"} Mar 20 09:06:51 crc kubenswrapper[4971]: I0320 09:06:51.018600 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0"} Mar 20 09:06:51 crc kubenswrapper[4971]: I0320 09:06:51.018633 4971 scope.go:117] "RemoveContainer" containerID="28296d931187d2e7d4e287e10e7dff5f848527abf9621b1204e0a061cbc3d395" Mar 20 09:07:06 crc kubenswrapper[4971]: I0320 09:07:06.159029 4971 generic.go:334] "Generic (PLEG): container finished" podID="004105ec-05ab-464c-ac28-5a3f5b8ef32c" containerID="cd6d56ac15c16e99bd5a47b5ff9b2418b0471e9a898a4431f00ade020accbf47" exitCode=0 Mar 20 09:07:06 crc kubenswrapper[4971]: I0320 09:07:06.159120 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" event={"ID":"004105ec-05ab-464c-ac28-5a3f5b8ef32c","Type":"ContainerDied","Data":"cd6d56ac15c16e99bd5a47b5ff9b2418b0471e9a898a4431f00ade020accbf47"} Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.055412 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-9a0e-account-create-update-sm9hz"] Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.068617 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-wwtwh"] Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.077931 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/heat-db-create-wwtwh"] Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.086904 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-9a0e-account-create-update-sm9hz"] Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.639323 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.662023 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-inventory\") pod \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.662087 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-ssh-key-openstack-networker\") pod \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.662186 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87wzd\" (UniqueName: \"kubernetes.io/projected/004105ec-05ab-464c-ac28-5a3f5b8ef32c-kube-api-access-87wzd\") pod \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.662211 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-tripleo-cleanup-combined-ca-bundle\") pod \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\" (UID: \"004105ec-05ab-464c-ac28-5a3f5b8ef32c\") " Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.692847 4971 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "004105ec-05ab-464c-ac28-5a3f5b8ef32c" (UID: "004105ec-05ab-464c-ac28-5a3f5b8ef32c"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.693531 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/004105ec-05ab-464c-ac28-5a3f5b8ef32c-kube-api-access-87wzd" (OuterVolumeSpecName: "kube-api-access-87wzd") pod "004105ec-05ab-464c-ac28-5a3f5b8ef32c" (UID: "004105ec-05ab-464c-ac28-5a3f5b8ef32c"). InnerVolumeSpecName "kube-api-access-87wzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.703819 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "004105ec-05ab-464c-ac28-5a3f5b8ef32c" (UID: "004105ec-05ab-464c-ac28-5a3f5b8ef32c"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.720189 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-inventory" (OuterVolumeSpecName: "inventory") pod "004105ec-05ab-464c-ac28-5a3f5b8ef32c" (UID: "004105ec-05ab-464c-ac28-5a3f5b8ef32c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.766166 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87wzd\" (UniqueName: \"kubernetes.io/projected/004105ec-05ab-464c-ac28-5a3f5b8ef32c-kube-api-access-87wzd\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.766742 4971 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.766772 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:07 crc kubenswrapper[4971]: I0320 09:07:07.766785 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/004105ec-05ab-464c-ac28-5a3f5b8ef32c-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:08 crc kubenswrapper[4971]: I0320 09:07:08.181018 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" event={"ID":"004105ec-05ab-464c-ac28-5a3f5b8ef32c","Type":"ContainerDied","Data":"b08f8eb75f988c8819930533921daca7b58fe312e79fab4ea153f9c5308ee5eb"} Mar 20 09:07:08 crc kubenswrapper[4971]: I0320 09:07:08.181273 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b08f8eb75f988c8819930533921daca7b58fe312e79fab4ea153f9c5308ee5eb" Mar 20 09:07:08 crc kubenswrapper[4971]: I0320 09:07:08.181106 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd" Mar 20 09:07:08 crc kubenswrapper[4971]: I0320 09:07:08.744462 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079d372c-65a0-4c34-b239-e49413cac654" path="/var/lib/kubelet/pods/079d372c-65a0-4c34-b239-e49413cac654/volumes" Mar 20 09:07:08 crc kubenswrapper[4971]: I0320 09:07:08.745033 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402da04-6fc4-4546-b7ac-5af524c11e9a" path="/var/lib/kubelet/pods/6402da04-6fc4-4546-b7ac-5af524c11e9a/volumes" Mar 20 09:07:12 crc kubenswrapper[4971]: I0320 09:07:12.643220 4971 scope.go:117] "RemoveContainer" containerID="f77e1b52f123ea9084bed679abde9bc2db5e642cbe533ea56cc25d73c52f410a" Mar 20 09:07:12 crc kubenswrapper[4971]: I0320 09:07:12.670184 4971 scope.go:117] "RemoveContainer" containerID="d62f7fb69148cfbd8323889418817c278a443fdad66d53f15d25cb3930c81d8f" Mar 20 09:07:12 crc kubenswrapper[4971]: I0320 09:07:12.732291 4971 scope.go:117] "RemoveContainer" containerID="8ec3e621d3d4e1df49ce16aa037fa1a96f48d7b5e10caf31f75af4a8bfc58fdc" Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.032102 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-k87v7"] Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.040581 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-k87v7"] Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.904766 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dnbxf"] Mar 20 09:07:23 crc kubenswrapper[4971]: E0320 09:07:23.905685 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="004105ec-05ab-464c-ac28-5a3f5b8ef32c" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.905711 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="004105ec-05ab-464c-ac28-5a3f5b8ef32c" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Mar 20 09:07:23 crc kubenswrapper[4971]: E0320 09:07:23.905723 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5204b4e3-1a71-4ebb-9f95-9ca4edc4728a" containerName="oc" Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.905732 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5204b4e3-1a71-4ebb-9f95-9ca4edc4728a" containerName="oc" Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.905985 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5204b4e3-1a71-4ebb-9f95-9ca4edc4728a" containerName="oc" Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.906011 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="004105ec-05ab-464c-ac28-5a3f5b8ef32c" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.907915 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:23 crc kubenswrapper[4971]: I0320 09:07:23.929485 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnbxf"] Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.027652 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-catalog-content\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.027984 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-utilities\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.028379 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz7m4\" (UniqueName: \"kubernetes.io/projected/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-kube-api-access-rz7m4\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.130865 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz7m4\" (UniqueName: \"kubernetes.io/projected/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-kube-api-access-rz7m4\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.130950 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-catalog-content\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.131053 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-utilities\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.131598 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-utilities\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.132107 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-catalog-content\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.151922 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz7m4\" (UniqueName: \"kubernetes.io/projected/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-kube-api-access-rz7m4\") pod \"redhat-operators-dnbxf\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.276235 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.745108 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d" path="/var/lib/kubelet/pods/2694c8fe-bdc7-48ad-b4aa-ae199a9cc90d/volumes" Mar 20 09:07:24 crc kubenswrapper[4971]: I0320 09:07:24.768775 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnbxf"] Mar 20 09:07:25 crc kubenswrapper[4971]: I0320 09:07:25.349854 4971 generic.go:334] "Generic (PLEG): container finished" podID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerID="15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714" exitCode=0 Mar 20 09:07:25 crc kubenswrapper[4971]: I0320 09:07:25.349905 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnbxf" event={"ID":"1b25ce08-d6d8-44ac-b85b-10b83ad410d0","Type":"ContainerDied","Data":"15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714"} Mar 20 09:07:25 crc kubenswrapper[4971]: I0320 09:07:25.349936 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnbxf" event={"ID":"1b25ce08-d6d8-44ac-b85b-10b83ad410d0","Type":"ContainerStarted","Data":"7f4c6622d82a83058132ab8502e5e566dcc9d0c759867d3d7bde0ea5afe4a564"} Mar 20 09:07:25 crc kubenswrapper[4971]: I0320 09:07:25.352059 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:07:26 crc kubenswrapper[4971]: I0320 09:07:26.361024 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnbxf" event={"ID":"1b25ce08-d6d8-44ac-b85b-10b83ad410d0","Type":"ContainerStarted","Data":"f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0"} Mar 20 09:07:31 crc kubenswrapper[4971]: I0320 09:07:31.403141 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerID="f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0" exitCode=0 Mar 20 09:07:31 crc kubenswrapper[4971]: I0320 09:07:31.403246 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnbxf" event={"ID":"1b25ce08-d6d8-44ac-b85b-10b83ad410d0","Type":"ContainerDied","Data":"f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0"} Mar 20 09:07:32 crc kubenswrapper[4971]: I0320 09:07:32.418677 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnbxf" event={"ID":"1b25ce08-d6d8-44ac-b85b-10b83ad410d0","Type":"ContainerStarted","Data":"ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49"} Mar 20 09:07:32 crc kubenswrapper[4971]: I0320 09:07:32.441456 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dnbxf" podStartSLOduration=2.658579417 podStartE2EDuration="9.441436242s" podCreationTimestamp="2026-03-20 09:07:23 +0000 UTC" firstStartedPulling="2026-03-20 09:07:25.351777568 +0000 UTC m=+8267.331651706" lastFinishedPulling="2026-03-20 09:07:32.134634403 +0000 UTC m=+8274.114508531" observedRunningTime="2026-03-20 09:07:32.435661951 +0000 UTC m=+8274.415536089" watchObservedRunningTime="2026-03-20 09:07:32.441436242 +0000 UTC m=+8274.421310380" Mar 20 09:07:34 crc kubenswrapper[4971]: I0320 09:07:34.277202 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:34 crc kubenswrapper[4971]: I0320 09:07:34.277537 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:35 crc kubenswrapper[4971]: I0320 09:07:35.334163 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnbxf" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" 
containerName="registry-server" probeResult="failure" output=< Mar 20 09:07:35 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 09:07:35 crc kubenswrapper[4971]: > Mar 20 09:07:44 crc kubenswrapper[4971]: I0320 09:07:44.357221 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:44 crc kubenswrapper[4971]: I0320 09:07:44.520714 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:44 crc kubenswrapper[4971]: I0320 09:07:44.628656 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnbxf"] Mar 20 09:07:45 crc kubenswrapper[4971]: I0320 09:07:45.536732 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dnbxf" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerName="registry-server" containerID="cri-o://ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49" gracePeriod=2 Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.067582 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.141862 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-utilities\") pod \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.142722 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-utilities" (OuterVolumeSpecName: "utilities") pod "1b25ce08-d6d8-44ac-b85b-10b83ad410d0" (UID: "1b25ce08-d6d8-44ac-b85b-10b83ad410d0"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.243234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz7m4\" (UniqueName: \"kubernetes.io/projected/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-kube-api-access-rz7m4\") pod \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.243296 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-catalog-content\") pod \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\" (UID: \"1b25ce08-d6d8-44ac-b85b-10b83ad410d0\") " Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.243891 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.249383 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-kube-api-access-rz7m4" (OuterVolumeSpecName: "kube-api-access-rz7m4") pod "1b25ce08-d6d8-44ac-b85b-10b83ad410d0" (UID: "1b25ce08-d6d8-44ac-b85b-10b83ad410d0"). InnerVolumeSpecName "kube-api-access-rz7m4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.345340 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz7m4\" (UniqueName: \"kubernetes.io/projected/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-kube-api-access-rz7m4\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.379193 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b25ce08-d6d8-44ac-b85b-10b83ad410d0" (UID: "1b25ce08-d6d8-44ac-b85b-10b83ad410d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.447857 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b25ce08-d6d8-44ac-b85b-10b83ad410d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.548439 4971 generic.go:334] "Generic (PLEG): container finished" podID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerID="ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49" exitCode=0 Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.548525 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnbxf" event={"ID":"1b25ce08-d6d8-44ac-b85b-10b83ad410d0","Type":"ContainerDied","Data":"ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49"} Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.548577 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnbxf" event={"ID":"1b25ce08-d6d8-44ac-b85b-10b83ad410d0","Type":"ContainerDied","Data":"7f4c6622d82a83058132ab8502e5e566dcc9d0c759867d3d7bde0ea5afe4a564"} Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 
09:07:46.548649 4971 scope.go:117] "RemoveContainer" containerID="ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.548756 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnbxf" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.589231 4971 scope.go:117] "RemoveContainer" containerID="f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.596055 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnbxf"] Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.609680 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dnbxf"] Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.621032 4971 scope.go:117] "RemoveContainer" containerID="15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.656488 4971 scope.go:117] "RemoveContainer" containerID="ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49" Mar 20 09:07:46 crc kubenswrapper[4971]: E0320 09:07:46.657118 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49\": container with ID starting with ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49 not found: ID does not exist" containerID="ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.657147 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49"} err="failed to get container status 
\"ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49\": rpc error: code = NotFound desc = could not find container \"ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49\": container with ID starting with ebd7ea94b199f4d25baeda4eac53e4478c324c3ff0e95bcaedf60b3d84d3fa49 not found: ID does not exist" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.657167 4971 scope.go:117] "RemoveContainer" containerID="f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0" Mar 20 09:07:46 crc kubenswrapper[4971]: E0320 09:07:46.657730 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0\": container with ID starting with f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0 not found: ID does not exist" containerID="f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.657759 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0"} err="failed to get container status \"f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0\": rpc error: code = NotFound desc = could not find container \"f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0\": container with ID starting with f788e0d45a5b5345c170fdadd11dd0a09266ac9248809b2f2cfc96698e58bfd0 not found: ID does not exist" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.657776 4971 scope.go:117] "RemoveContainer" containerID="15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714" Mar 20 09:07:46 crc kubenswrapper[4971]: E0320 09:07:46.657978 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714\": container with ID starting with 15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714 not found: ID does not exist" containerID="15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.658002 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714"} err="failed to get container status \"15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714\": rpc error: code = NotFound desc = could not find container \"15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714\": container with ID starting with 15f9dc44bbd43a9288718d45418c22f2c67231f996df793faa2e57933fc52714 not found: ID does not exist" Mar 20 09:07:46 crc kubenswrapper[4971]: I0320 09:07:46.747539 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" path="/var/lib/kubelet/pods/1b25ce08-d6d8-44ac-b85b-10b83ad410d0/volumes" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.142627 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566628-8scct"] Mar 20 09:08:00 crc kubenswrapper[4971]: E0320 09:08:00.143711 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerName="registry-server" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.143727 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerName="registry-server" Mar 20 09:08:00 crc kubenswrapper[4971]: E0320 09:08:00.143740 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerName="extract-utilities" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.143746 4971 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerName="extract-utilities" Mar 20 09:08:00 crc kubenswrapper[4971]: E0320 09:08:00.143761 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerName="extract-content" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.143770 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerName="extract-content" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.143946 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b25ce08-d6d8-44ac-b85b-10b83ad410d0" containerName="registry-server" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.145426 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-8scct" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.149869 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.150432 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.153492 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-8scct"] Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.155148 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.217505 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6qrc\" (UniqueName: \"kubernetes.io/projected/6be3ae71-a5ff-448d-a31e-57c789745969-kube-api-access-h6qrc\") pod \"auto-csr-approver-29566628-8scct\" (UID: \"6be3ae71-a5ff-448d-a31e-57c789745969\") " 
pod="openshift-infra/auto-csr-approver-29566628-8scct" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.320950 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6qrc\" (UniqueName: \"kubernetes.io/projected/6be3ae71-a5ff-448d-a31e-57c789745969-kube-api-access-h6qrc\") pod \"auto-csr-approver-29566628-8scct\" (UID: \"6be3ae71-a5ff-448d-a31e-57c789745969\") " pod="openshift-infra/auto-csr-approver-29566628-8scct" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.365756 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6qrc\" (UniqueName: \"kubernetes.io/projected/6be3ae71-a5ff-448d-a31e-57c789745969-kube-api-access-h6qrc\") pod \"auto-csr-approver-29566628-8scct\" (UID: \"6be3ae71-a5ff-448d-a31e-57c789745969\") " pod="openshift-infra/auto-csr-approver-29566628-8scct" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.468421 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-8scct" Mar 20 09:08:00 crc kubenswrapper[4971]: I0320 09:08:00.800731 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-8scct"] Mar 20 09:08:01 crc kubenswrapper[4971]: I0320 09:08:01.701732 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-8scct" event={"ID":"6be3ae71-a5ff-448d-a31e-57c789745969","Type":"ContainerStarted","Data":"6f818f68bc7a1988cf9c7030977eba498c32cd0978590f3c9eb4c34c08625d34"} Mar 20 09:08:02 crc kubenswrapper[4971]: I0320 09:08:02.710640 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-8scct" event={"ID":"6be3ae71-a5ff-448d-a31e-57c789745969","Type":"ContainerStarted","Data":"e23416f69c1402dfa4e4f9525aab042eed031edf635cdc84c0f7b6f698915200"} Mar 20 09:08:03 crc kubenswrapper[4971]: I0320 09:08:03.721628 4971 generic.go:334] "Generic (PLEG): 
container finished" podID="6be3ae71-a5ff-448d-a31e-57c789745969" containerID="e23416f69c1402dfa4e4f9525aab042eed031edf635cdc84c0f7b6f698915200" exitCode=0 Mar 20 09:08:03 crc kubenswrapper[4971]: I0320 09:08:03.721676 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-8scct" event={"ID":"6be3ae71-a5ff-448d-a31e-57c789745969","Type":"ContainerDied","Data":"e23416f69c1402dfa4e4f9525aab042eed031edf635cdc84c0f7b6f698915200"} Mar 20 09:08:04 crc kubenswrapper[4971]: I0320 09:08:04.102892 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-8scct" Mar 20 09:08:04 crc kubenswrapper[4971]: I0320 09:08:04.203559 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6qrc\" (UniqueName: \"kubernetes.io/projected/6be3ae71-a5ff-448d-a31e-57c789745969-kube-api-access-h6qrc\") pod \"6be3ae71-a5ff-448d-a31e-57c789745969\" (UID: \"6be3ae71-a5ff-448d-a31e-57c789745969\") " Mar 20 09:08:04 crc kubenswrapper[4971]: I0320 09:08:04.210969 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be3ae71-a5ff-448d-a31e-57c789745969-kube-api-access-h6qrc" (OuterVolumeSpecName: "kube-api-access-h6qrc") pod "6be3ae71-a5ff-448d-a31e-57c789745969" (UID: "6be3ae71-a5ff-448d-a31e-57c789745969"). InnerVolumeSpecName "kube-api-access-h6qrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:04 crc kubenswrapper[4971]: I0320 09:08:04.306768 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6qrc\" (UniqueName: \"kubernetes.io/projected/6be3ae71-a5ff-448d-a31e-57c789745969-kube-api-access-h6qrc\") on node \"crc\" DevicePath \"\"" Mar 20 09:08:04 crc kubenswrapper[4971]: I0320 09:08:04.743327 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-8scct" Mar 20 09:08:04 crc kubenswrapper[4971]: I0320 09:08:04.747680 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-8scct" event={"ID":"6be3ae71-a5ff-448d-a31e-57c789745969","Type":"ContainerDied","Data":"6f818f68bc7a1988cf9c7030977eba498c32cd0978590f3c9eb4c34c08625d34"} Mar 20 09:08:04 crc kubenswrapper[4971]: I0320 09:08:04.747725 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f818f68bc7a1988cf9c7030977eba498c32cd0978590f3c9eb4c34c08625d34" Mar 20 09:08:05 crc kubenswrapper[4971]: I0320 09:08:05.179215 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-6qfhw"] Mar 20 09:08:05 crc kubenswrapper[4971]: I0320 09:08:05.188963 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-6qfhw"] Mar 20 09:08:06 crc kubenswrapper[4971]: I0320 09:08:06.751624 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b15227-d749-42df-ba0b-acad0bc05557" path="/var/lib/kubelet/pods/12b15227-d749-42df-ba0b-acad0bc05557/volumes" Mar 20 09:08:12 crc kubenswrapper[4971]: I0320 09:08:12.811722 4971 scope.go:117] "RemoveContainer" containerID="1033e4cd2cbf5412dc93fdb7b7eb501d71c34eeeae84f23fce2832c2cf9ac8fc" Mar 20 09:08:12 crc kubenswrapper[4971]: I0320 09:08:12.874787 4971 scope.go:117] "RemoveContainer" containerID="ba1507967aa4533d9a912d97883c9af0760d929a7ccc35c3a15e7dd1fb8b3ad7" Mar 20 09:08:50 crc kubenswrapper[4971]: I0320 09:08:50.162278 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:08:50 crc kubenswrapper[4971]: I0320 09:08:50.162992 4971 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.640421 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xv497"] Mar 20 09:08:52 crc kubenswrapper[4971]: E0320 09:08:52.644347 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be3ae71-a5ff-448d-a31e-57c789745969" containerName="oc" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.644480 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be3ae71-a5ff-448d-a31e-57c789745969" containerName="oc" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.644899 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be3ae71-a5ff-448d-a31e-57c789745969" containerName="oc" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.671240 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.677061 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xv497"] Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.783317 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-catalog-content\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.783380 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztcqq\" (UniqueName: \"kubernetes.io/projected/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-kube-api-access-ztcqq\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.783566 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-utilities\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.885509 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-catalog-content\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.886011 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-catalog-content\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.886106 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztcqq\" (UniqueName: \"kubernetes.io/projected/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-kube-api-access-ztcqq\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.886547 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-utilities\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.886870 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-utilities\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:52 crc kubenswrapper[4971]: I0320 09:08:52.905560 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztcqq\" (UniqueName: \"kubernetes.io/projected/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-kube-api-access-ztcqq\") pod \"redhat-marketplace-xv497\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") " pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:53 crc kubenswrapper[4971]: I0320 09:08:53.020792 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xv497" Mar 20 09:08:53 crc kubenswrapper[4971]: I0320 09:08:53.454372 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xv497"] Mar 20 09:08:54 crc kubenswrapper[4971]: I0320 09:08:54.248051 4971 generic.go:334] "Generic (PLEG): container finished" podID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerID="2861fd1bfccbfd2a0507f550e7f07257d9c5e1f0c010ae2e498b64e5329e7d78" exitCode=0 Mar 20 09:08:54 crc kubenswrapper[4971]: I0320 09:08:54.248136 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xv497" event={"ID":"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958","Type":"ContainerDied","Data":"2861fd1bfccbfd2a0507f550e7f07257d9c5e1f0c010ae2e498b64e5329e7d78"} Mar 20 09:08:54 crc kubenswrapper[4971]: I0320 09:08:54.248402 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xv497" event={"ID":"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958","Type":"ContainerStarted","Data":"07cdf92f2ed1cc8ef440aa552011688585ff7f5b124b36737f326a49ae821d07"} Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.037791 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lntm5"] Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.040389 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.049901 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lntm5"] Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.233585 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-catalog-content\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.233660 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-utilities\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.233769 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgz88\" (UniqueName: \"kubernetes.io/projected/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-kube-api-access-hgz88\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.257761 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xv497" event={"ID":"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958","Type":"ContainerStarted","Data":"2b66db6ec08ba5f2745e98120b7e5f96937b9bf898a004a671efd6593931db7c"} Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.335919 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgz88\" 
(UniqueName: \"kubernetes.io/projected/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-kube-api-access-hgz88\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.336384 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-catalog-content\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.336434 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-utilities\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.352536 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-catalog-content\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.352656 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-utilities\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.369513 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgz88\" (UniqueName: 
\"kubernetes.io/projected/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-kube-api-access-hgz88\") pod \"community-operators-lntm5\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") " pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.377173 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lntm5" Mar 20 09:08:55 crc kubenswrapper[4971]: W0320 09:08:55.853941 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b61d2c6_8beb_44d6_bcb3_336ff7f7cbff.slice/crio-494cd4d643ab2886c253158a6a72ac92e276937410b50673571611493dd11838 WatchSource:0}: Error finding container 494cd4d643ab2886c253158a6a72ac92e276937410b50673571611493dd11838: Status 404 returned error can't find the container with id 494cd4d643ab2886c253158a6a72ac92e276937410b50673571611493dd11838 Mar 20 09:08:55 crc kubenswrapper[4971]: I0320 09:08:55.854658 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lntm5"] Mar 20 09:08:56 crc kubenswrapper[4971]: I0320 09:08:56.274771 4971 generic.go:334] "Generic (PLEG): container finished" podID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerID="2b66db6ec08ba5f2745e98120b7e5f96937b9bf898a004a671efd6593931db7c" exitCode=0 Mar 20 09:08:56 crc kubenswrapper[4971]: I0320 09:08:56.274865 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xv497" event={"ID":"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958","Type":"ContainerDied","Data":"2b66db6ec08ba5f2745e98120b7e5f96937b9bf898a004a671efd6593931db7c"} Mar 20 09:08:56 crc kubenswrapper[4971]: I0320 09:08:56.289389 4971 generic.go:334] "Generic (PLEG): container finished" podID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerID="381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67" exitCode=0 Mar 20 09:08:56 crc 
kubenswrapper[4971]: I0320 09:08:56.289466 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lntm5" event={"ID":"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff","Type":"ContainerDied","Data":"381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67"}
Mar 20 09:08:56 crc kubenswrapper[4971]: I0320 09:08:56.289500 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lntm5" event={"ID":"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff","Type":"ContainerStarted","Data":"494cd4d643ab2886c253158a6a72ac92e276937410b50673571611493dd11838"}
Mar 20 09:08:57 crc kubenswrapper[4971]: I0320 09:08:57.302027 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xv497" event={"ID":"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958","Type":"ContainerStarted","Data":"778bdde1d399f6cafa3ef45515d5e1c8aeb58537c52155e52eba73b6d6a1968b"}
Mar 20 09:08:57 crc kubenswrapper[4971]: I0320 09:08:57.325269 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xv497" podStartSLOduration=2.725250059 podStartE2EDuration="5.325208177s" podCreationTimestamp="2026-03-20 09:08:52 +0000 UTC" firstStartedPulling="2026-03-20 09:08:54.251073589 +0000 UTC m=+8356.230947717" lastFinishedPulling="2026-03-20 09:08:56.851031697 +0000 UTC m=+8358.830905835" observedRunningTime="2026-03-20 09:08:57.321327176 +0000 UTC m=+8359.301201334" watchObservedRunningTime="2026-03-20 09:08:57.325208177 +0000 UTC m=+8359.305082315"
Mar 20 09:08:58 crc kubenswrapper[4971]: I0320 09:08:58.311786 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lntm5" event={"ID":"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff","Type":"ContainerStarted","Data":"d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae"}
Mar 20 09:09:00 crc kubenswrapper[4971]: I0320 09:09:00.335132 4971 generic.go:334] "Generic (PLEG): container finished" podID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerID="d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae" exitCode=0
Mar 20 09:09:00 crc kubenswrapper[4971]: I0320 09:09:00.335225 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lntm5" event={"ID":"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff","Type":"ContainerDied","Data":"d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae"}
Mar 20 09:09:02 crc kubenswrapper[4971]: I0320 09:09:02.358697 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lntm5" event={"ID":"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff","Type":"ContainerStarted","Data":"dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f"}
Mar 20 09:09:02 crc kubenswrapper[4971]: I0320 09:09:02.383969 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lntm5" podStartSLOduration=2.478783496 podStartE2EDuration="7.383948087s" podCreationTimestamp="2026-03-20 09:08:55 +0000 UTC" firstStartedPulling="2026-03-20 09:08:56.294107791 +0000 UTC m=+8358.273981929" lastFinishedPulling="2026-03-20 09:09:01.199272382 +0000 UTC m=+8363.179146520" observedRunningTime="2026-03-20 09:09:02.377684313 +0000 UTC m=+8364.357558461" watchObservedRunningTime="2026-03-20 09:09:02.383948087 +0000 UTC m=+8364.363822225"
Mar 20 09:09:03 crc kubenswrapper[4971]: I0320 09:09:03.021682 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xv497"
Mar 20 09:09:03 crc kubenswrapper[4971]: I0320 09:09:03.022019 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xv497"
Mar 20 09:09:03 crc kubenswrapper[4971]: I0320 09:09:03.074909 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xv497"
Mar 20 09:09:03 crc kubenswrapper[4971]: I0320 09:09:03.436716 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xv497"
Mar 20 09:09:05 crc kubenswrapper[4971]: I0320 09:09:05.377910 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lntm5"
Mar 20 09:09:05 crc kubenswrapper[4971]: I0320 09:09:05.377972 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lntm5"
Mar 20 09:09:05 crc kubenswrapper[4971]: I0320 09:09:05.464738 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lntm5"
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.234790 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xv497"]
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.235151 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xv497" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerName="registry-server" containerID="cri-o://778bdde1d399f6cafa3ef45515d5e1c8aeb58537c52155e52eba73b6d6a1968b" gracePeriod=2
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.403506 4971 generic.go:334] "Generic (PLEG): container finished" podID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerID="778bdde1d399f6cafa3ef45515d5e1c8aeb58537c52155e52eba73b6d6a1968b" exitCode=0
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.403947 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xv497" event={"ID":"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958","Type":"ContainerDied","Data":"778bdde1d399f6cafa3ef45515d5e1c8aeb58537c52155e52eba73b6d6a1968b"}
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.487434 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lntm5"
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.763547 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xv497"
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.959348 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-catalog-content\") pod \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") "
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.959718 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztcqq\" (UniqueName: \"kubernetes.io/projected/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-kube-api-access-ztcqq\") pod \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") "
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.959812 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-utilities\") pod \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\" (UID: \"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958\") "
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.961085 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-utilities" (OuterVolumeSpecName: "utilities") pod "9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" (UID: "9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.966899 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-kube-api-access-ztcqq" (OuterVolumeSpecName: "kube-api-access-ztcqq") pod "9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" (UID: "9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958"). InnerVolumeSpecName "kube-api-access-ztcqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:09:06 crc kubenswrapper[4971]: I0320 09:09:06.984301 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" (UID: "9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.062776 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztcqq\" (UniqueName: \"kubernetes.io/projected/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-kube-api-access-ztcqq\") on node \"crc\" DevicePath \"\""
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.062813 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.062824 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.231521 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lntm5"]
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.418137 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xv497"
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.418132 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xv497" event={"ID":"9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958","Type":"ContainerDied","Data":"07cdf92f2ed1cc8ef440aa552011688585ff7f5b124b36737f326a49ae821d07"}
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.418375 4971 scope.go:117] "RemoveContainer" containerID="778bdde1d399f6cafa3ef45515d5e1c8aeb58537c52155e52eba73b6d6a1968b"
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.449923 4971 scope.go:117] "RemoveContainer" containerID="2b66db6ec08ba5f2745e98120b7e5f96937b9bf898a004a671efd6593931db7c"
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.479846 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xv497"]
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.489046 4971 scope.go:117] "RemoveContainer" containerID="2861fd1bfccbfd2a0507f550e7f07257d9c5e1f0c010ae2e498b64e5329e7d78"
Mar 20 09:09:07 crc kubenswrapper[4971]: I0320 09:09:07.491948 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xv497"]
Mar 20 09:09:08 crc kubenswrapper[4971]: I0320 09:09:08.430444 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lntm5" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerName="registry-server" containerID="cri-o://dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f" gracePeriod=2
Mar 20 09:09:08 crc kubenswrapper[4971]: I0320 09:09:08.762860 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" path="/var/lib/kubelet/pods/9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958/volumes"
Mar 20 09:09:08 crc kubenswrapper[4971]: I0320 09:09:08.885762 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lntm5"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.004247 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-catalog-content\") pod \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") "
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.004294 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-utilities\") pod \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") "
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.004500 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgz88\" (UniqueName: \"kubernetes.io/projected/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-kube-api-access-hgz88\") pod \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\" (UID: \"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff\") "
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.005012 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-utilities" (OuterVolumeSpecName: "utilities") pod "3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" (UID: "3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.010194 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-kube-api-access-hgz88" (OuterVolumeSpecName: "kube-api-access-hgz88") pod "3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" (UID: "3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff"). InnerVolumeSpecName "kube-api-access-hgz88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.050124 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" (UID: "3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.106804 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.106847 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.106861 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgz88\" (UniqueName: \"kubernetes.io/projected/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff-kube-api-access-hgz88\") on node \"crc\" DevicePath \"\""
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.445167 4971 generic.go:334] "Generic (PLEG): container finished" podID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerID="dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f" exitCode=0
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.445222 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lntm5" event={"ID":"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff","Type":"ContainerDied","Data":"dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f"}
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.445253 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lntm5" event={"ID":"3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff","Type":"ContainerDied","Data":"494cd4d643ab2886c253158a6a72ac92e276937410b50673571611493dd11838"}
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.445275 4971 scope.go:117] "RemoveContainer" containerID="dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.445321 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lntm5"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.468289 4971 scope.go:117] "RemoveContainer" containerID="d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.495241 4971 scope.go:117] "RemoveContainer" containerID="381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.497732 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lntm5"]
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.507940 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lntm5"]
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.543996 4971 scope.go:117] "RemoveContainer" containerID="dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f"
Mar 20 09:09:09 crc kubenswrapper[4971]: E0320 09:09:09.544662 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f\": container with ID starting with dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f not found: ID does not exist" containerID="dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.544729 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f"} err="failed to get container status \"dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f\": rpc error: code = NotFound desc = could not find container \"dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f\": container with ID starting with dbe2fb1a0d820550368aa7cc6fcea12251a3015788940f3b26c5847596ad080f not found: ID does not exist"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.544772 4971 scope.go:117] "RemoveContainer" containerID="d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae"
Mar 20 09:09:09 crc kubenswrapper[4971]: E0320 09:09:09.545313 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae\": container with ID starting with d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae not found: ID does not exist" containerID="d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.545344 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae"} err="failed to get container status \"d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae\": rpc error: code = NotFound desc = could not find container \"d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae\": container with ID starting with d0d2afc1d96b1b1c528af5e49632a70b2e202c3f5e5b2b451fcdfb839a03c8ae not found: ID does not exist"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.545360 4971 scope.go:117] "RemoveContainer" containerID="381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67"
Mar 20 09:09:09 crc kubenswrapper[4971]: E0320 09:09:09.545788 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67\": container with ID starting with 381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67 not found: ID does not exist" containerID="381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67"
Mar 20 09:09:09 crc kubenswrapper[4971]: I0320 09:09:09.545818 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67"} err="failed to get container status \"381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67\": rpc error: code = NotFound desc = could not find container \"381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67\": container with ID starting with 381ae75fd3e430b1710126371a62e17ed0693958abd6260d1373c3c96c75bc67 not found: ID does not exist"
Mar 20 09:09:10 crc kubenswrapper[4971]: I0320 09:09:10.756069 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" path="/var/lib/kubelet/pods/3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff/volumes"
Mar 20 09:09:20 crc kubenswrapper[4971]: I0320 09:09:20.162948 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:09:20 crc kubenswrapper[4971]: I0320 09:09:20.163569 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:09:33 crc kubenswrapper[4971]: I0320 09:09:33.047192 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-z54jl"]
Mar 20 09:09:33 crc kubenswrapper[4971]: I0320 09:09:33.058784 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-z54jl"]
Mar 20 09:09:34 crc kubenswrapper[4971]: I0320 09:09:34.036680 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-8e23-account-create-update-kw8vm"]
Mar 20 09:09:34 crc kubenswrapper[4971]: I0320 09:09:34.048291 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-8e23-account-create-update-kw8vm"]
Mar 20 09:09:34 crc kubenswrapper[4971]: I0320 09:09:34.746703 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="006fbf05-668e-4eef-97a3-d24b1ca7dbd0" path="/var/lib/kubelet/pods/006fbf05-668e-4eef-97a3-d24b1ca7dbd0/volumes"
Mar 20 09:09:34 crc kubenswrapper[4971]: I0320 09:09:34.747502 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e91c8be-1a90-4da8-a072-a4aaf36b2a89" path="/var/lib/kubelet/pods/2e91c8be-1a90-4da8-a072-a4aaf36b2a89/volumes"
Mar 20 09:09:45 crc kubenswrapper[4971]: I0320 09:09:45.034741 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-zxlrh"]
Mar 20 09:09:45 crc kubenswrapper[4971]: I0320 09:09:45.043043 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-zxlrh"]
Mar 20 09:09:46 crc kubenswrapper[4971]: I0320 09:09:46.761537 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76e60cb-33d8-4c46-9fec-c6e306e2aa0e" path="/var/lib/kubelet/pods/f76e60cb-33d8-4c46-9fec-c6e306e2aa0e/volumes"
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.162302 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.162942 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.162995 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl"
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.164117 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.164189 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" gracePeriod=600
Mar 20 09:09:50 crc kubenswrapper[4971]: E0320 09:09:50.339866 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.895526 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" exitCode=0
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.895654 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0"}
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.895914 4971 scope.go:117] "RemoveContainer" containerID="ba1ddb4578d4cb1dc77ee5b85f04ad39f4f2172118cb49b973658b5d06d4d7e6"
Mar 20 09:09:50 crc kubenswrapper[4971]: I0320 09:09:50.896959 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0"
Mar 20 09:09:50 crc kubenswrapper[4971]: E0320 09:09:50.897736 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.158853 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566630-2mnrg"]
Mar 20 09:10:00 crc kubenswrapper[4971]: E0320 09:10:00.159987 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerName="registry-server"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.160006 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerName="registry-server"
Mar 20 09:10:00 crc kubenswrapper[4971]: E0320 09:10:00.160028 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerName="extract-content"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.160037 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerName="extract-content"
Mar 20 09:10:00 crc kubenswrapper[4971]: E0320 09:10:00.160054 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerName="extract-utilities"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.160064 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerName="extract-utilities"
Mar 20 09:10:00 crc kubenswrapper[4971]: E0320 09:10:00.160080 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerName="extract-utilities"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.160089 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerName="extract-utilities"
Mar 20 09:10:00 crc kubenswrapper[4971]: E0320 09:10:00.160112 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerName="registry-server"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.160120 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerName="registry-server"
Mar 20 09:10:00 crc kubenswrapper[4971]: E0320 09:10:00.160157 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerName="extract-content"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.160165 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerName="extract-content"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.160505 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b61d2c6-8beb-44d6-bcb3-336ff7f7cbff" containerName="registry-server"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.160555 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc1fc15-a1cf-4f9d-9c99-b2cab24cc958" containerName="registry-server"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.162033 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-2mnrg"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.165189 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.165365 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.165559 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.170591 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-2mnrg"]
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.316703 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxmhn\" (UniqueName: \"kubernetes.io/projected/516be8fc-f457-4f59-ba34-6eccaf7e8ff3-kube-api-access-bxmhn\") pod \"auto-csr-approver-29566630-2mnrg\" (UID: \"516be8fc-f457-4f59-ba34-6eccaf7e8ff3\") " pod="openshift-infra/auto-csr-approver-29566630-2mnrg"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.419396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxmhn\" (UniqueName: \"kubernetes.io/projected/516be8fc-f457-4f59-ba34-6eccaf7e8ff3-kube-api-access-bxmhn\") pod \"auto-csr-approver-29566630-2mnrg\" (UID: \"516be8fc-f457-4f59-ba34-6eccaf7e8ff3\") " pod="openshift-infra/auto-csr-approver-29566630-2mnrg"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.440371 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmhn\" (UniqueName: \"kubernetes.io/projected/516be8fc-f457-4f59-ba34-6eccaf7e8ff3-kube-api-access-bxmhn\") pod \"auto-csr-approver-29566630-2mnrg\" (UID: \"516be8fc-f457-4f59-ba34-6eccaf7e8ff3\") " pod="openshift-infra/auto-csr-approver-29566630-2mnrg"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.485030 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-2mnrg"
Mar 20 09:10:00 crc kubenswrapper[4971]: I0320 09:10:00.941858 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-2mnrg"]
Mar 20 09:10:01 crc kubenswrapper[4971]: I0320 09:10:01.007477 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-2mnrg" event={"ID":"516be8fc-f457-4f59-ba34-6eccaf7e8ff3","Type":"ContainerStarted","Data":"0a1afe9e1b9bf9a3a87686f3f2050166f1225ad758369d4fd416ad79f8aca77b"}
Mar 20 09:10:02 crc kubenswrapper[4971]: I0320 09:10:02.732828 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0"
Mar 20 09:10:02 crc kubenswrapper[4971]: E0320 09:10:02.733306 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 09:10:03 crc kubenswrapper[4971]: I0320 09:10:03.031306 4971 generic.go:334] "Generic (PLEG): container finished" podID="516be8fc-f457-4f59-ba34-6eccaf7e8ff3" containerID="78199fefffed876b749f7f23e987cd04442bc91d340191b35a280ad11a390601" exitCode=0
Mar 20 09:10:03 crc kubenswrapper[4971]: I0320 09:10:03.031564 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-2mnrg" event={"ID":"516be8fc-f457-4f59-ba34-6eccaf7e8ff3","Type":"ContainerDied","Data":"78199fefffed876b749f7f23e987cd04442bc91d340191b35a280ad11a390601"}
Mar 20 09:10:04 crc kubenswrapper[4971]: I0320 09:10:04.359378 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-2mnrg"
Mar 20 09:10:04 crc kubenswrapper[4971]: I0320 09:10:04.527520 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxmhn\" (UniqueName: \"kubernetes.io/projected/516be8fc-f457-4f59-ba34-6eccaf7e8ff3-kube-api-access-bxmhn\") pod \"516be8fc-f457-4f59-ba34-6eccaf7e8ff3\" (UID: \"516be8fc-f457-4f59-ba34-6eccaf7e8ff3\") "
Mar 20 09:10:04 crc kubenswrapper[4971]: I0320 09:10:04.533784 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/516be8fc-f457-4f59-ba34-6eccaf7e8ff3-kube-api-access-bxmhn" (OuterVolumeSpecName: "kube-api-access-bxmhn") pod "516be8fc-f457-4f59-ba34-6eccaf7e8ff3" (UID: "516be8fc-f457-4f59-ba34-6eccaf7e8ff3"). InnerVolumeSpecName "kube-api-access-bxmhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:10:04 crc kubenswrapper[4971]: I0320 09:10:04.630633 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxmhn\" (UniqueName: \"kubernetes.io/projected/516be8fc-f457-4f59-ba34-6eccaf7e8ff3-kube-api-access-bxmhn\") on node \"crc\" DevicePath \"\""
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.052949 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-2mnrg" event={"ID":"516be8fc-f457-4f59-ba34-6eccaf7e8ff3","Type":"ContainerDied","Data":"0a1afe9e1b9bf9a3a87686f3f2050166f1225ad758369d4fd416ad79f8aca77b"}
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.052987 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a1afe9e1b9bf9a3a87686f3f2050166f1225ad758369d4fd416ad79f8aca77b"
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.053007 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-2mnrg"
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.064776 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-fskqk"]
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.081107 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-fskqk"]
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.088952 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-1bf7-account-create-update-mtp5w"]
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.097375 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-1bf7-account-create-update-mtp5w"]
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.430155 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-l8469"]
Mar 20 09:10:05 crc kubenswrapper[4971]: I0320 09:10:05.443454 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-l8469"]
Mar 20 09:10:06 crc kubenswrapper[4971]: I0320 09:10:06.745334 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e" path="/var/lib/kubelet/pods/1943c6e8-ca3b-4c1a-aa8c-4d3576b9eb5e/volumes"
Mar 20 09:10:06 crc kubenswrapper[4971]: I0320 09:10:06.746139 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ddcc15d-2a56-46c2-9606-3ba1643f15b2" path="/var/lib/kubelet/pods/9ddcc15d-2a56-46c2-9606-3ba1643f15b2/volumes"
Mar 20 09:10:06 crc kubenswrapper[4971]: I0320 09:10:06.746625 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2cbaa6d-6665-4d54-a3e9-8356327e8814" path="/var/lib/kubelet/pods/f2cbaa6d-6665-4d54-a3e9-8356327e8814/volumes"
Mar 20 09:10:13 crc kubenswrapper[4971]: I0320 09:10:13.070826 4971 scope.go:117] "RemoveContainer" containerID="43740f103d019ea35f569908920f5230c2fef27d15a89152e4a753894f99fda1"
Mar 20 09:10:13 crc kubenswrapper[4971]: I0320 09:10:13.112740 4971 scope.go:117] "RemoveContainer" containerID="bc9baf15530f7cde8989555ae150708f2f6c2ebe8bc282d59f0f77a6cc5845bd"
Mar 20 09:10:13 crc kubenswrapper[4971]: I0320 09:10:13.139094 4971 scope.go:117] "RemoveContainer" containerID="b318a46490086f365352264c975e137f0bd04500a5e27c5284457adc42224eea"
Mar 20 09:10:13 crc kubenswrapper[4971]: I0320 09:10:13.201129 4971 scope.go:117] "RemoveContainer" containerID="d800c2b63fef95189adca9cf7bf09d72aa824438506d2bbf8595742cfe0c3fc5"
Mar 20 09:10:13 crc kubenswrapper[4971]: I0320 09:10:13.261837 4971 scope.go:117] "RemoveContainer" containerID="e3a20d8cf980212f0b78cfb3a017b5516fe1824301f185a0bcb4b588448c42b9"
Mar 20 09:10:13 crc kubenswrapper[4971]: I0320 09:10:13.283530 4971 scope.go:117] "RemoveContainer" containerID="8ea134c20974b4052f9194d6e61c23af291cf3d95c59ba0c9174f1c9ffde82e3"
Mar 20 09:10:14 crc kubenswrapper[4971]: I0320 09:10:14.732213 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0"
Mar 20 09:10:14 crc kubenswrapper[4971]: E0320 09:10:14.732791 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 09:10:22 crc kubenswrapper[4971]: I0320 09:10:22.034282 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-8qtsq"]
Mar 20 09:10:22 crc kubenswrapper[4971]: I0320 09:10:22.044018 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-8qtsq"]
Mar 20 09:10:22 crc kubenswrapper[4971]: I0320 09:10:22.750175 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165bdcc8-662d-4d59-aebb-90e1bc533f38" path="/var/lib/kubelet/pods/165bdcc8-662d-4d59-aebb-90e1bc533f38/volumes"
Mar 20 09:10:27 crc kubenswrapper[4971]: I0320 09:10:27.732884 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0"
Mar 20 09:10:27 crc kubenswrapper[4971]: E0320 09:10:27.733411 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 09:10:42 crc kubenswrapper[4971]: I0320 09:10:42.732630 4971 scope.go:117] "RemoveContainer"
containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:10:42 crc kubenswrapper[4971]: E0320 09:10:42.733399 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:10:53 crc kubenswrapper[4971]: I0320 09:10:53.732509 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:10:53 crc kubenswrapper[4971]: E0320 09:10:53.733148 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:10:59 crc kubenswrapper[4971]: I0320 09:10:59.694487 4971 generic.go:334] "Generic (PLEG): container finished" podID="3f14a6df-ffd2-484e-9097-4e30054d8f42" containerID="37bd3c4080897c9ef7fd21fdab6f33c8be131dba1c497294b6cf001d3212cca1" exitCode=0 Mar 20 09:10:59 crc kubenswrapper[4971]: I0320 09:10:59.694670 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" event={"ID":"3f14a6df-ffd2-484e-9097-4e30054d8f42","Type":"ContainerDied","Data":"37bd3c4080897c9ef7fd21fdab6f33c8be131dba1c497294b6cf001d3212cca1"} Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.155864 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.290018 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ssh-key-openstack-cell1\") pod \"3f14a6df-ffd2-484e-9097-4e30054d8f42\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.290174 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ceph\") pod \"3f14a6df-ffd2-484e-9097-4e30054d8f42\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.290292 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-tripleo-cleanup-combined-ca-bundle\") pod \"3f14a6df-ffd2-484e-9097-4e30054d8f42\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.290333 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5blpg\" (UniqueName: \"kubernetes.io/projected/3f14a6df-ffd2-484e-9097-4e30054d8f42-kube-api-access-5blpg\") pod \"3f14a6df-ffd2-484e-9097-4e30054d8f42\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.290357 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-inventory\") pod \"3f14a6df-ffd2-484e-9097-4e30054d8f42\" (UID: \"3f14a6df-ffd2-484e-9097-4e30054d8f42\") " Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.295327 4971 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f14a6df-ffd2-484e-9097-4e30054d8f42-kube-api-access-5blpg" (OuterVolumeSpecName: "kube-api-access-5blpg") pod "3f14a6df-ffd2-484e-9097-4e30054d8f42" (UID: "3f14a6df-ffd2-484e-9097-4e30054d8f42"). InnerVolumeSpecName "kube-api-access-5blpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.295672 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "3f14a6df-ffd2-484e-9097-4e30054d8f42" (UID: "3f14a6df-ffd2-484e-9097-4e30054d8f42"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.305119 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ceph" (OuterVolumeSpecName: "ceph") pod "3f14a6df-ffd2-484e-9097-4e30054d8f42" (UID: "3f14a6df-ffd2-484e-9097-4e30054d8f42"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.326224 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3f14a6df-ffd2-484e-9097-4e30054d8f42" (UID: "3f14a6df-ffd2-484e-9097-4e30054d8f42"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.327162 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-inventory" (OuterVolumeSpecName: "inventory") pod "3f14a6df-ffd2-484e-9097-4e30054d8f42" (UID: "3f14a6df-ffd2-484e-9097-4e30054d8f42"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.392711 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.392747 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.392759 4971 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.392772 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5blpg\" (UniqueName: \"kubernetes.io/projected/3f14a6df-ffd2-484e-9097-4e30054d8f42-kube-api-access-5blpg\") on node \"crc\" DevicePath \"\"" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.392783 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f14a6df-ffd2-484e-9097-4e30054d8f42-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.713098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" event={"ID":"3f14a6df-ffd2-484e-9097-4e30054d8f42","Type":"ContainerDied","Data":"df34ad1fefe48460efc93bda8c8862101cb66fd09e3ebfa9c2c33cb05be71dde"} Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.713142 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df34ad1fefe48460efc93bda8c8862101cb66fd09e3ebfa9c2c33cb05be71dde" Mar 20 09:11:01 crc kubenswrapper[4971]: I0320 09:11:01.713177 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb" Mar 20 09:11:02 crc kubenswrapper[4971]: I0320 09:11:02.994852 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-vmk68"] Mar 20 09:11:02 crc kubenswrapper[4971]: E0320 09:11:02.995681 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="516be8fc-f457-4f59-ba34-6eccaf7e8ff3" containerName="oc" Mar 20 09:11:02 crc kubenswrapper[4971]: I0320 09:11:02.995904 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="516be8fc-f457-4f59-ba34-6eccaf7e8ff3" containerName="oc" Mar 20 09:11:02 crc kubenswrapper[4971]: E0320 09:11:02.995957 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f14a6df-ffd2-484e-9097-4e30054d8f42" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 20 09:11:02 crc kubenswrapper[4971]: I0320 09:11:02.995967 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f14a6df-ffd2-484e-9097-4e30054d8f42" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 20 09:11:02 crc kubenswrapper[4971]: I0320 09:11:02.996208 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="516be8fc-f457-4f59-ba34-6eccaf7e8ff3" containerName="oc" Mar 20 09:11:02 crc kubenswrapper[4971]: I0320 09:11:02.996250 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3f14a6df-ffd2-484e-9097-4e30054d8f42" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 20 09:11:02 crc kubenswrapper[4971]: I0320 09:11:02.997239 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.003180 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.003315 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.003378 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.003511 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.008906 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-vmk68"] Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.027028 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-wpznj"] Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.028647 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.030622 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.031122 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.063252 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-wpznj"] Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.131632 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5dzv\" (UniqueName: \"kubernetes.io/projected/477b98cd-b9c9-4411-8a82-73a1153e0cc2-kube-api-access-b5dzv\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.131680 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhg5\" (UniqueName: \"kubernetes.io/projected/6631a763-6625-4978-a9dd-5ed4c4421ecc-kube-api-access-dkhg5\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.131746 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" 
Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.131820 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.131843 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-inventory\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.131880 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.131921 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ceph\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.132136 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-inventory\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.132251 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.234286 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-inventory\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.234353 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.234418 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5dzv\" (UniqueName: \"kubernetes.io/projected/477b98cd-b9c9-4411-8a82-73a1153e0cc2-kube-api-access-b5dzv\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc 
kubenswrapper[4971]: I0320 09:11:03.234439 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkhg5\" (UniqueName: \"kubernetes.io/projected/6631a763-6625-4978-a9dd-5ed4c4421ecc-kube-api-access-dkhg5\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.234485 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.234516 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.234538 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-inventory\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.234569 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ssh-key-openstack-cell1\") pod 
\"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.234595 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ceph\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.240086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.240383 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.240659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.240659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-inventory\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.244176 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ceph\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.247169 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-inventory\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.247374 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.259502 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5dzv\" (UniqueName: \"kubernetes.io/projected/477b98cd-b9c9-4411-8a82-73a1153e0cc2-kube-api-access-b5dzv\") pod \"bootstrap-openstack-openstack-cell1-vmk68\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.267923 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dkhg5\" (UniqueName: \"kubernetes.io/projected/6631a763-6625-4978-a9dd-5ed4c4421ecc-kube-api-access-dkhg5\") pod \"bootstrap-openstack-openstack-networker-wpznj\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.358455 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.369271 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:11:03 crc kubenswrapper[4971]: I0320 09:11:03.935423 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-wpznj"] Mar 20 09:11:04 crc kubenswrapper[4971]: I0320 09:11:04.029810 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-vmk68"] Mar 20 09:11:04 crc kubenswrapper[4971]: I0320 09:11:04.769377 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" event={"ID":"477b98cd-b9c9-4411-8a82-73a1153e0cc2","Type":"ContainerStarted","Data":"ff3ad0eff3966ea8f4b53dba888e63b60ff9cad743ef5962aced6c4db9d91b06"} Mar 20 09:11:04 crc kubenswrapper[4971]: I0320 09:11:04.769786 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" event={"ID":"6631a763-6625-4978-a9dd-5ed4c4421ecc","Type":"ContainerStarted","Data":"4f692b66015eefc0403f081af51d468faa960fc1811bacd8e3acce964c3437d8"} Mar 20 09:11:05 crc kubenswrapper[4971]: I0320 09:11:05.732669 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:11:05 crc kubenswrapper[4971]: E0320 09:11:05.733185 4971 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:11:05 crc kubenswrapper[4971]: I0320 09:11:05.779536 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" event={"ID":"477b98cd-b9c9-4411-8a82-73a1153e0cc2","Type":"ContainerStarted","Data":"aaf955d5dd4560ab9af95916890e4d7c705ece4f5e18c88d2af45806f58a3cf8"} Mar 20 09:11:05 crc kubenswrapper[4971]: I0320 09:11:05.800174 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" podStartSLOduration=3.0042210049999998 podStartE2EDuration="3.800154152s" podCreationTimestamp="2026-03-20 09:11:02 +0000 UTC" firstStartedPulling="2026-03-20 09:11:04.034856414 +0000 UTC m=+8486.014730542" lastFinishedPulling="2026-03-20 09:11:04.830789561 +0000 UTC m=+8486.810663689" observedRunningTime="2026-03-20 09:11:05.796997808 +0000 UTC m=+8487.776871966" watchObservedRunningTime="2026-03-20 09:11:05.800154152 +0000 UTC m=+8487.780028280" Mar 20 09:11:06 crc kubenswrapper[4971]: I0320 09:11:06.789016 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" event={"ID":"6631a763-6625-4978-a9dd-5ed4c4421ecc","Type":"ContainerStarted","Data":"a5ff9d81e4c72d8303de277ccb0c783563f9b1de477abdcfd0c45af733a3d1f5"} Mar 20 09:11:06 crc kubenswrapper[4971]: I0320 09:11:06.808061 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" podStartSLOduration=2.774699449 podStartE2EDuration="4.808040931s" 
podCreationTimestamp="2026-03-20 09:11:02 +0000 UTC" firstStartedPulling="2026-03-20 09:11:03.947857685 +0000 UTC m=+8485.927731823" lastFinishedPulling="2026-03-20 09:11:05.981199167 +0000 UTC m=+8487.961073305" observedRunningTime="2026-03-20 09:11:06.804528178 +0000 UTC m=+8488.784402316" watchObservedRunningTime="2026-03-20 09:11:06.808040931 +0000 UTC m=+8488.787915079" Mar 20 09:11:13 crc kubenswrapper[4971]: I0320 09:11:13.429677 4971 scope.go:117] "RemoveContainer" containerID="3d031600480bdb47b1aa93f05731505e830647afe1f385755a2fad41d9631320" Mar 20 09:11:20 crc kubenswrapper[4971]: I0320 09:11:20.733510 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:11:20 crc kubenswrapper[4971]: E0320 09:11:20.734885 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:11:33 crc kubenswrapper[4971]: I0320 09:11:33.733312 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:11:33 crc kubenswrapper[4971]: E0320 09:11:33.735151 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:11:45 crc kubenswrapper[4971]: I0320 09:11:45.732419 4971 scope.go:117] 
"RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:11:45 crc kubenswrapper[4971]: E0320 09:11:45.733283 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:11:59 crc kubenswrapper[4971]: I0320 09:11:59.733091 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:11:59 crc kubenswrapper[4971]: E0320 09:11:59.733855 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.152824 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566632-z9665"] Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.154274 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-z9665" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.157205 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.157484 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.157502 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.183911 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-z9665"] Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.211022 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2857\" (UniqueName: \"kubernetes.io/projected/dfb2d23a-e02e-44aa-957e-774d1432e1db-kube-api-access-n2857\") pod \"auto-csr-approver-29566632-z9665\" (UID: \"dfb2d23a-e02e-44aa-957e-774d1432e1db\") " pod="openshift-infra/auto-csr-approver-29566632-z9665" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.313346 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2857\" (UniqueName: \"kubernetes.io/projected/dfb2d23a-e02e-44aa-957e-774d1432e1db-kube-api-access-n2857\") pod \"auto-csr-approver-29566632-z9665\" (UID: \"dfb2d23a-e02e-44aa-957e-774d1432e1db\") " pod="openshift-infra/auto-csr-approver-29566632-z9665" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.332378 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2857\" (UniqueName: \"kubernetes.io/projected/dfb2d23a-e02e-44aa-957e-774d1432e1db-kube-api-access-n2857\") pod \"auto-csr-approver-29566632-z9665\" (UID: \"dfb2d23a-e02e-44aa-957e-774d1432e1db\") " 
pod="openshift-infra/auto-csr-approver-29566632-z9665" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.482751 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-z9665" Mar 20 09:12:00 crc kubenswrapper[4971]: I0320 09:12:00.839917 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-z9665"] Mar 20 09:12:01 crc kubenswrapper[4971]: I0320 09:12:01.292048 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-z9665" event={"ID":"dfb2d23a-e02e-44aa-957e-774d1432e1db","Type":"ContainerStarted","Data":"86184f10f5884c4a5e1e793efcb0ddad9afbdd44cc295bd5105083a8022e8f10"} Mar 20 09:12:02 crc kubenswrapper[4971]: I0320 09:12:02.303581 4971 generic.go:334] "Generic (PLEG): container finished" podID="dfb2d23a-e02e-44aa-957e-774d1432e1db" containerID="6377d4e05eaeed951e21ff272f86446a4323fc7819cb90830444cab192f72a94" exitCode=0 Mar 20 09:12:02 crc kubenswrapper[4971]: I0320 09:12:02.303949 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-z9665" event={"ID":"dfb2d23a-e02e-44aa-957e-774d1432e1db","Type":"ContainerDied","Data":"6377d4e05eaeed951e21ff272f86446a4323fc7819cb90830444cab192f72a94"} Mar 20 09:12:03 crc kubenswrapper[4971]: I0320 09:12:03.702934 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-z9665" Mar 20 09:12:03 crc kubenswrapper[4971]: I0320 09:12:03.787950 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2857\" (UniqueName: \"kubernetes.io/projected/dfb2d23a-e02e-44aa-957e-774d1432e1db-kube-api-access-n2857\") pod \"dfb2d23a-e02e-44aa-957e-774d1432e1db\" (UID: \"dfb2d23a-e02e-44aa-957e-774d1432e1db\") " Mar 20 09:12:03 crc kubenswrapper[4971]: I0320 09:12:03.792949 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb2d23a-e02e-44aa-957e-774d1432e1db-kube-api-access-n2857" (OuterVolumeSpecName: "kube-api-access-n2857") pod "dfb2d23a-e02e-44aa-957e-774d1432e1db" (UID: "dfb2d23a-e02e-44aa-957e-774d1432e1db"). InnerVolumeSpecName "kube-api-access-n2857". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:12:03 crc kubenswrapper[4971]: I0320 09:12:03.890572 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2857\" (UniqueName: \"kubernetes.io/projected/dfb2d23a-e02e-44aa-957e-774d1432e1db-kube-api-access-n2857\") on node \"crc\" DevicePath \"\"" Mar 20 09:12:04 crc kubenswrapper[4971]: I0320 09:12:04.330913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-z9665" event={"ID":"dfb2d23a-e02e-44aa-957e-774d1432e1db","Type":"ContainerDied","Data":"86184f10f5884c4a5e1e793efcb0ddad9afbdd44cc295bd5105083a8022e8f10"} Mar 20 09:12:04 crc kubenswrapper[4971]: I0320 09:12:04.331397 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86184f10f5884c4a5e1e793efcb0ddad9afbdd44cc295bd5105083a8022e8f10" Mar 20 09:12:04 crc kubenswrapper[4971]: I0320 09:12:04.331132 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-z9665" Mar 20 09:12:04 crc kubenswrapper[4971]: I0320 09:12:04.790083 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-n2qgp"] Mar 20 09:12:04 crc kubenswrapper[4971]: I0320 09:12:04.798055 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-n2qgp"] Mar 20 09:12:06 crc kubenswrapper[4971]: I0320 09:12:06.747077 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5204b4e3-1a71-4ebb-9f95-9ca4edc4728a" path="/var/lib/kubelet/pods/5204b4e3-1a71-4ebb-9f95-9ca4edc4728a/volumes" Mar 20 09:12:13 crc kubenswrapper[4971]: I0320 09:12:13.512846 4971 scope.go:117] "RemoveContainer" containerID="51474ca5240bdd3ffa85632a8b4d1b939f069832334df155523cb989c24e72ea" Mar 20 09:12:14 crc kubenswrapper[4971]: I0320 09:12:14.732767 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:12:14 crc kubenswrapper[4971]: E0320 09:12:14.733218 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:12:27 crc kubenswrapper[4971]: I0320 09:12:27.732272 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:12:27 crc kubenswrapper[4971]: E0320 09:12:27.733212 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:12:41 crc kubenswrapper[4971]: I0320 09:12:41.732216 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:12:41 crc kubenswrapper[4971]: E0320 09:12:41.733055 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:12:53 crc kubenswrapper[4971]: I0320 09:12:53.733269 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:12:53 crc kubenswrapper[4971]: E0320 09:12:53.734368 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:13:08 crc kubenswrapper[4971]: I0320 09:13:08.748372 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:13:08 crc kubenswrapper[4971]: E0320 09:13:08.749471 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:13:23 crc kubenswrapper[4971]: I0320 09:13:23.733083 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:13:23 crc kubenswrapper[4971]: E0320 09:13:23.735120 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:13:37 crc kubenswrapper[4971]: I0320 09:13:37.732743 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:13:37 crc kubenswrapper[4971]: E0320 09:13:37.733751 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:13:52 crc kubenswrapper[4971]: I0320 09:13:52.732985 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:13:52 crc kubenswrapper[4971]: E0320 09:13:52.733836 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.152889 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566634-5gs9t"] Mar 20 09:14:00 crc kubenswrapper[4971]: E0320 09:14:00.154342 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb2d23a-e02e-44aa-957e-774d1432e1db" containerName="oc" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.154366 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb2d23a-e02e-44aa-957e-774d1432e1db" containerName="oc" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.154806 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb2d23a-e02e-44aa-957e-774d1432e1db" containerName="oc" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.156010 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-5gs9t" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.159420 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.159638 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.161487 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.168300 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-5gs9t"] Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.278908 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9clsh\" (UniqueName: \"kubernetes.io/projected/1543875e-c018-446f-bbdd-067ec555d144-kube-api-access-9clsh\") pod \"auto-csr-approver-29566634-5gs9t\" (UID: \"1543875e-c018-446f-bbdd-067ec555d144\") " pod="openshift-infra/auto-csr-approver-29566634-5gs9t" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.380834 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9clsh\" (UniqueName: \"kubernetes.io/projected/1543875e-c018-446f-bbdd-067ec555d144-kube-api-access-9clsh\") pod \"auto-csr-approver-29566634-5gs9t\" (UID: \"1543875e-c018-446f-bbdd-067ec555d144\") " pod="openshift-infra/auto-csr-approver-29566634-5gs9t" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.397684 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9clsh\" (UniqueName: \"kubernetes.io/projected/1543875e-c018-446f-bbdd-067ec555d144-kube-api-access-9clsh\") pod \"auto-csr-approver-29566634-5gs9t\" (UID: \"1543875e-c018-446f-bbdd-067ec555d144\") " 
pod="openshift-infra/auto-csr-approver-29566634-5gs9t" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.486654 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-5gs9t" Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.950560 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-5gs9t"] Mar 20 09:14:00 crc kubenswrapper[4971]: I0320 09:14:00.961101 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:14:01 crc kubenswrapper[4971]: I0320 09:14:01.424087 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-5gs9t" event={"ID":"1543875e-c018-446f-bbdd-067ec555d144","Type":"ContainerStarted","Data":"33b2cef400b0323c277c61d838e572b0561241bceda3041faafe023785a134d4"} Mar 20 09:14:03 crc kubenswrapper[4971]: I0320 09:14:03.466588 4971 generic.go:334] "Generic (PLEG): container finished" podID="1543875e-c018-446f-bbdd-067ec555d144" containerID="dd87cb23849d7bd07ba489e0d994b98af694f8491bd979806609dfc7b2b17e48" exitCode=0 Mar 20 09:14:03 crc kubenswrapper[4971]: I0320 09:14:03.466705 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-5gs9t" event={"ID":"1543875e-c018-446f-bbdd-067ec555d144","Type":"ContainerDied","Data":"dd87cb23849d7bd07ba489e0d994b98af694f8491bd979806609dfc7b2b17e48"} Mar 20 09:14:04 crc kubenswrapper[4971]: I0320 09:14:04.808618 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-5gs9t" Mar 20 09:14:04 crc kubenswrapper[4971]: I0320 09:14:04.866088 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9clsh\" (UniqueName: \"kubernetes.io/projected/1543875e-c018-446f-bbdd-067ec555d144-kube-api-access-9clsh\") pod \"1543875e-c018-446f-bbdd-067ec555d144\" (UID: \"1543875e-c018-446f-bbdd-067ec555d144\") " Mar 20 09:14:04 crc kubenswrapper[4971]: I0320 09:14:04.876328 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1543875e-c018-446f-bbdd-067ec555d144-kube-api-access-9clsh" (OuterVolumeSpecName: "kube-api-access-9clsh") pod "1543875e-c018-446f-bbdd-067ec555d144" (UID: "1543875e-c018-446f-bbdd-067ec555d144"). InnerVolumeSpecName "kube-api-access-9clsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:04 crc kubenswrapper[4971]: I0320 09:14:04.968753 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9clsh\" (UniqueName: \"kubernetes.io/projected/1543875e-c018-446f-bbdd-067ec555d144-kube-api-access-9clsh\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:05 crc kubenswrapper[4971]: I0320 09:14:05.487836 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-5gs9t" event={"ID":"1543875e-c018-446f-bbdd-067ec555d144","Type":"ContainerDied","Data":"33b2cef400b0323c277c61d838e572b0561241bceda3041faafe023785a134d4"} Mar 20 09:14:05 crc kubenswrapper[4971]: I0320 09:14:05.487867 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-5gs9t" Mar 20 09:14:05 crc kubenswrapper[4971]: I0320 09:14:05.487878 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b2cef400b0323c277c61d838e572b0561241bceda3041faafe023785a134d4" Mar 20 09:14:05 crc kubenswrapper[4971]: I0320 09:14:05.879264 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-8scct"] Mar 20 09:14:05 crc kubenswrapper[4971]: I0320 09:14:05.887203 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-8scct"] Mar 20 09:14:06 crc kubenswrapper[4971]: I0320 09:14:06.732243 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:14:06 crc kubenswrapper[4971]: E0320 09:14:06.732484 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:14:06 crc kubenswrapper[4971]: I0320 09:14:06.745422 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be3ae71-a5ff-448d-a31e-57c789745969" path="/var/lib/kubelet/pods/6be3ae71-a5ff-448d-a31e-57c789745969/volumes" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.545719 4971 generic.go:334] "Generic (PLEG): container finished" podID="477b98cd-b9c9-4411-8a82-73a1153e0cc2" containerID="aaf955d5dd4560ab9af95916890e4d7c705ece4f5e18c88d2af45806f58a3cf8" exitCode=0 Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.545743 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" 
event={"ID":"477b98cd-b9c9-4411-8a82-73a1153e0cc2","Type":"ContainerDied","Data":"aaf955d5dd4560ab9af95916890e4d7c705ece4f5e18c88d2af45806f58a3cf8"} Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.638904 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-28hwc"] Mar 20 09:14:10 crc kubenswrapper[4971]: E0320 09:14:10.639689 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1543875e-c018-446f-bbdd-067ec555d144" containerName="oc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.639710 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1543875e-c018-446f-bbdd-067ec555d144" containerName="oc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.639957 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1543875e-c018-446f-bbdd-067ec555d144" containerName="oc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.641753 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.650395 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28hwc"] Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.779371 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-catalog-content\") pod \"certified-operators-28hwc\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.779511 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-utilities\") pod \"certified-operators-28hwc\" (UID: 
\"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.779543 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfkm\" (UniqueName: \"kubernetes.io/projected/231cfb67-39b4-4ebe-9bb7-6440eba49651-kube-api-access-ngfkm\") pod \"certified-operators-28hwc\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.881109 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-catalog-content\") pod \"certified-operators-28hwc\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.881296 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-utilities\") pod \"certified-operators-28hwc\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.881341 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfkm\" (UniqueName: \"kubernetes.io/projected/231cfb67-39b4-4ebe-9bb7-6440eba49651-kube-api-access-ngfkm\") pod \"certified-operators-28hwc\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.881746 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-catalog-content\") pod 
\"certified-operators-28hwc\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.881869 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-utilities\") pod \"certified-operators-28hwc\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.901457 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfkm\" (UniqueName: \"kubernetes.io/projected/231cfb67-39b4-4ebe-9bb7-6440eba49651-kube-api-access-ngfkm\") pod \"certified-operators-28hwc\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:10 crc kubenswrapper[4971]: I0320 09:14:10.967172 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:11 crc kubenswrapper[4971]: I0320 09:14:11.508468 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28hwc"] Mar 20 09:14:11 crc kubenswrapper[4971]: I0320 09:14:11.555316 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28hwc" event={"ID":"231cfb67-39b4-4ebe-9bb7-6440eba49651","Type":"ContainerStarted","Data":"57f668151716959e36638e862cf939580c34dc149113dd4604fff736b5c97840"} Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.050164 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.108806 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-inventory\") pod \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.108889 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-bootstrap-combined-ca-bundle\") pod \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.108946 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ceph\") pod \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.109028 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5dzv\" (UniqueName: \"kubernetes.io/projected/477b98cd-b9c9-4411-8a82-73a1153e0cc2-kube-api-access-b5dzv\") pod \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.109258 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ssh-key-openstack-cell1\") pod \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\" (UID: \"477b98cd-b9c9-4411-8a82-73a1153e0cc2\") " Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.115043 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/477b98cd-b9c9-4411-8a82-73a1153e0cc2-kube-api-access-b5dzv" (OuterVolumeSpecName: "kube-api-access-b5dzv") pod "477b98cd-b9c9-4411-8a82-73a1153e0cc2" (UID: "477b98cd-b9c9-4411-8a82-73a1153e0cc2"). InnerVolumeSpecName "kube-api-access-b5dzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.115210 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ceph" (OuterVolumeSpecName: "ceph") pod "477b98cd-b9c9-4411-8a82-73a1153e0cc2" (UID: "477b98cd-b9c9-4411-8a82-73a1153e0cc2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.115473 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "477b98cd-b9c9-4411-8a82-73a1153e0cc2" (UID: "477b98cd-b9c9-4411-8a82-73a1153e0cc2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.135831 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-inventory" (OuterVolumeSpecName: "inventory") pod "477b98cd-b9c9-4411-8a82-73a1153e0cc2" (UID: "477b98cd-b9c9-4411-8a82-73a1153e0cc2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.142518 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "477b98cd-b9c9-4411-8a82-73a1153e0cc2" (UID: "477b98cd-b9c9-4411-8a82-73a1153e0cc2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.211189 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.211437 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5dzv\" (UniqueName: \"kubernetes.io/projected/477b98cd-b9c9-4411-8a82-73a1153e0cc2-kube-api-access-b5dzv\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.211498 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.211554 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.211645 4971 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477b98cd-b9c9-4411-8a82-73a1153e0cc2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.566741 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="6631a763-6625-4978-a9dd-5ed4c4421ecc" containerID="a5ff9d81e4c72d8303de277ccb0c783563f9b1de477abdcfd0c45af733a3d1f5" exitCode=0 Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.566950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" event={"ID":"6631a763-6625-4978-a9dd-5ed4c4421ecc","Type":"ContainerDied","Data":"a5ff9d81e4c72d8303de277ccb0c783563f9b1de477abdcfd0c45af733a3d1f5"} Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.570149 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" event={"ID":"477b98cd-b9c9-4411-8a82-73a1153e0cc2","Type":"ContainerDied","Data":"ff3ad0eff3966ea8f4b53dba888e63b60ff9cad743ef5962aced6c4db9d91b06"} Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.570264 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff3ad0eff3966ea8f4b53dba888e63b60ff9cad743ef5962aced6c4db9d91b06" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.570188 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-vmk68" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.572127 4971 generic.go:334] "Generic (PLEG): container finished" podID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerID="fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5" exitCode=0 Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.572180 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28hwc" event={"ID":"231cfb67-39b4-4ebe-9bb7-6440eba49651","Type":"ContainerDied","Data":"fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5"} Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.672994 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-grhlv"] Mar 20 09:14:12 crc kubenswrapper[4971]: E0320 09:14:12.673483 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477b98cd-b9c9-4411-8a82-73a1153e0cc2" containerName="bootstrap-openstack-openstack-cell1" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.673504 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="477b98cd-b9c9-4411-8a82-73a1153e0cc2" containerName="bootstrap-openstack-openstack-cell1" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.673754 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="477b98cd-b9c9-4411-8a82-73a1153e0cc2" containerName="bootstrap-openstack-openstack-cell1" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.679352 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.682205 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.682724 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.686917 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-grhlv"] Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.722691 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ceph\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.722757 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qhsf\" (UniqueName: \"kubernetes.io/projected/efb74ee9-ca4d-4087-a070-4b3f17683800-kube-api-access-5qhsf\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.722827 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-inventory\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.722877 
4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.825286 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qhsf\" (UniqueName: \"kubernetes.io/projected/efb74ee9-ca4d-4087-a070-4b3f17683800-kube-api-access-5qhsf\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.825509 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-inventory\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.825685 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.825808 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ceph\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: 
\"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.830423 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-inventory\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.830444 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ceph\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.831058 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:12 crc kubenswrapper[4971]: I0320 09:14:12.843507 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qhsf\" (UniqueName: \"kubernetes.io/projected/efb74ee9-ca4d-4087-a070-4b3f17683800-kube-api-access-5qhsf\") pod \"download-cache-openstack-openstack-cell1-grhlv\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.001650 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.542482 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-grhlv"] Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.581155 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28hwc" event={"ID":"231cfb67-39b4-4ebe-9bb7-6440eba49651","Type":"ContainerStarted","Data":"9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299"} Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.584040 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" event={"ID":"efb74ee9-ca4d-4087-a070-4b3f17683800","Type":"ContainerStarted","Data":"24a80bf056a5fe4c9a307736e2b4fd65b597f35b2140b7a3f8b103ae82016531"} Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.598092 4971 scope.go:117] "RemoveContainer" containerID="e23416f69c1402dfa4e4f9525aab042eed031edf635cdc84c0f7b6f698915200" Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.914924 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.956527 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkhg5\" (UniqueName: \"kubernetes.io/projected/6631a763-6625-4978-a9dd-5ed4c4421ecc-kube-api-access-dkhg5\") pod \"6631a763-6625-4978-a9dd-5ed4c4421ecc\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.956672 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-inventory\") pod \"6631a763-6625-4978-a9dd-5ed4c4421ecc\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.956800 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-ssh-key-openstack-networker\") pod \"6631a763-6625-4978-a9dd-5ed4c4421ecc\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.956852 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-bootstrap-combined-ca-bundle\") pod \"6631a763-6625-4978-a9dd-5ed4c4421ecc\" (UID: \"6631a763-6625-4978-a9dd-5ed4c4421ecc\") " Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.971846 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6631a763-6625-4978-a9dd-5ed4c4421ecc-kube-api-access-dkhg5" (OuterVolumeSpecName: "kube-api-access-dkhg5") pod "6631a763-6625-4978-a9dd-5ed4c4421ecc" (UID: "6631a763-6625-4978-a9dd-5ed4c4421ecc"). InnerVolumeSpecName "kube-api-access-dkhg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.971944 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6631a763-6625-4978-a9dd-5ed4c4421ecc" (UID: "6631a763-6625-4978-a9dd-5ed4c4421ecc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.985671 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "6631a763-6625-4978-a9dd-5ed4c4421ecc" (UID: "6631a763-6625-4978-a9dd-5ed4c4421ecc"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:14:13 crc kubenswrapper[4971]: I0320 09:14:13.992194 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-inventory" (OuterVolumeSpecName: "inventory") pod "6631a763-6625-4978-a9dd-5ed4c4421ecc" (UID: "6631a763-6625-4978-a9dd-5ed4c4421ecc"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.059773 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.059801 4971 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.059813 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkhg5\" (UniqueName: \"kubernetes.io/projected/6631a763-6625-4978-a9dd-5ed4c4421ecc-kube-api-access-dkhg5\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.059823 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6631a763-6625-4978-a9dd-5ed4c4421ecc-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.592773 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" event={"ID":"6631a763-6625-4978-a9dd-5ed4c4421ecc","Type":"ContainerDied","Data":"4f692b66015eefc0403f081af51d468faa960fc1811bacd8e3acce964c3437d8"} Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.594026 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f692b66015eefc0403f081af51d468faa960fc1811bacd8e3acce964c3437d8" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.594142 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-wpznj" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.607530 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" event={"ID":"efb74ee9-ca4d-4087-a070-4b3f17683800","Type":"ContainerStarted","Data":"8eb819a61371fb67c66e88d082d9e0db898f11cdc958c21efa24257c2d96f0f1"} Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.662364 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" podStartSLOduration=1.9655246769999999 podStartE2EDuration="2.662338203s" podCreationTimestamp="2026-03-20 09:14:12 +0000 UTC" firstStartedPulling="2026-03-20 09:14:13.54947541 +0000 UTC m=+8675.529349548" lastFinishedPulling="2026-03-20 09:14:14.246288936 +0000 UTC m=+8676.226163074" observedRunningTime="2026-03-20 09:14:14.63535507 +0000 UTC m=+8676.615229208" watchObservedRunningTime="2026-03-20 09:14:14.662338203 +0000 UTC m=+8676.642212351" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.694100 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-networker-jcgvt"] Mar 20 09:14:14 crc kubenswrapper[4971]: E0320 09:14:14.695352 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6631a763-6625-4978-a9dd-5ed4c4421ecc" containerName="bootstrap-openstack-openstack-networker" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.695435 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6631a763-6625-4978-a9dd-5ed4c4421ecc" containerName="bootstrap-openstack-openstack-networker" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.695921 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6631a763-6625-4978-a9dd-5ed4c4421ecc" containerName="bootstrap-openstack-openstack-networker" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.697113 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.709228 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-jcgvt"] Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.735345 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.736010 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.786676 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws954\" (UniqueName: \"kubernetes.io/projected/b6855ca9-d129-4c73-a2ed-cc7059287a06-kube-api-access-ws954\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.786805 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.786880 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-inventory\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " 
pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.888469 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.888838 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-inventory\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.888995 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws954\" (UniqueName: \"kubernetes.io/projected/b6855ca9-d129-4c73-a2ed-cc7059287a06-kube-api-access-ws954\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.892950 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-inventory\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.893991 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: 
\"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:14 crc kubenswrapper[4971]: I0320 09:14:14.908471 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws954\" (UniqueName: \"kubernetes.io/projected/b6855ca9-d129-4c73-a2ed-cc7059287a06-kube-api-access-ws954\") pod \"download-cache-openstack-openstack-networker-jcgvt\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:15 crc kubenswrapper[4971]: I0320 09:14:15.066877 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:14:15 crc kubenswrapper[4971]: I0320 09:14:15.620129 4971 generic.go:334] "Generic (PLEG): container finished" podID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerID="9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299" exitCode=0 Mar 20 09:14:15 crc kubenswrapper[4971]: I0320 09:14:15.620175 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28hwc" event={"ID":"231cfb67-39b4-4ebe-9bb7-6440eba49651","Type":"ContainerDied","Data":"9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299"} Mar 20 09:14:15 crc kubenswrapper[4971]: I0320 09:14:15.630474 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-jcgvt"] Mar 20 09:14:15 crc kubenswrapper[4971]: W0320 09:14:15.635811 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6855ca9_d129_4c73_a2ed_cc7059287a06.slice/crio-53aaac2dc2b60060f7b48c45ab2474dac6e3400ffa92d3fe75a6ac855f109804 
WatchSource:0}: Error finding container 53aaac2dc2b60060f7b48c45ab2474dac6e3400ffa92d3fe75a6ac855f109804: Status 404 returned error can't find the container with id 53aaac2dc2b60060f7b48c45ab2474dac6e3400ffa92d3fe75a6ac855f109804 Mar 20 09:14:16 crc kubenswrapper[4971]: I0320 09:14:16.634167 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28hwc" event={"ID":"231cfb67-39b4-4ebe-9bb7-6440eba49651","Type":"ContainerStarted","Data":"17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8"} Mar 20 09:14:16 crc kubenswrapper[4971]: I0320 09:14:16.636674 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-jcgvt" event={"ID":"b6855ca9-d129-4c73-a2ed-cc7059287a06","Type":"ContainerStarted","Data":"46cee39ffb3b30ee48df0908009caa37b6d73cc395482902abc895fdc8dcd7c0"} Mar 20 09:14:16 crc kubenswrapper[4971]: I0320 09:14:16.636885 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-jcgvt" event={"ID":"b6855ca9-d129-4c73-a2ed-cc7059287a06","Type":"ContainerStarted","Data":"53aaac2dc2b60060f7b48c45ab2474dac6e3400ffa92d3fe75a6ac855f109804"} Mar 20 09:14:16 crc kubenswrapper[4971]: I0320 09:14:16.659898 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-28hwc" podStartSLOduration=3.232519433 podStartE2EDuration="6.65987793s" podCreationTimestamp="2026-03-20 09:14:10 +0000 UTC" firstStartedPulling="2026-03-20 09:14:12.574070079 +0000 UTC m=+8674.553944227" lastFinishedPulling="2026-03-20 09:14:16.001428576 +0000 UTC m=+8677.981302724" observedRunningTime="2026-03-20 09:14:16.656278805 +0000 UTC m=+8678.636152963" watchObservedRunningTime="2026-03-20 09:14:16.65987793 +0000 UTC m=+8678.639752068" Mar 20 09:14:16 crc kubenswrapper[4971]: I0320 09:14:16.684789 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/download-cache-openstack-openstack-networker-jcgvt" podStartSLOduration=2.035349302 podStartE2EDuration="2.684763947s" podCreationTimestamp="2026-03-20 09:14:14 +0000 UTC" firstStartedPulling="2026-03-20 09:14:15.638869113 +0000 UTC m=+8677.618743261" lastFinishedPulling="2026-03-20 09:14:16.288283758 +0000 UTC m=+8678.268157906" observedRunningTime="2026-03-20 09:14:16.674559868 +0000 UTC m=+8678.654434006" watchObservedRunningTime="2026-03-20 09:14:16.684763947 +0000 UTC m=+8678.664638095" Mar 20 09:14:19 crc kubenswrapper[4971]: I0320 09:14:19.732927 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:14:19 crc kubenswrapper[4971]: E0320 09:14:19.733827 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:14:20 crc kubenswrapper[4971]: I0320 09:14:20.967519 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:20 crc kubenswrapper[4971]: I0320 09:14:20.967897 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:21 crc kubenswrapper[4971]: I0320 09:14:21.018026 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:21 crc kubenswrapper[4971]: I0320 09:14:21.749956 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:21 crc kubenswrapper[4971]: I0320 
09:14:21.811311 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28hwc"] Mar 20 09:14:23 crc kubenswrapper[4971]: I0320 09:14:23.704826 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-28hwc" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerName="registry-server" containerID="cri-o://17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8" gracePeriod=2 Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.210690 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.291719 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngfkm\" (UniqueName: \"kubernetes.io/projected/231cfb67-39b4-4ebe-9bb7-6440eba49651-kube-api-access-ngfkm\") pod \"231cfb67-39b4-4ebe-9bb7-6440eba49651\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.291965 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-utilities\") pod \"231cfb67-39b4-4ebe-9bb7-6440eba49651\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.292781 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-utilities" (OuterVolumeSpecName: "utilities") pod "231cfb67-39b4-4ebe-9bb7-6440eba49651" (UID: "231cfb67-39b4-4ebe-9bb7-6440eba49651"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.292863 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-catalog-content\") pod \"231cfb67-39b4-4ebe-9bb7-6440eba49651\" (UID: \"231cfb67-39b4-4ebe-9bb7-6440eba49651\") " Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.306499 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231cfb67-39b4-4ebe-9bb7-6440eba49651-kube-api-access-ngfkm" (OuterVolumeSpecName: "kube-api-access-ngfkm") pod "231cfb67-39b4-4ebe-9bb7-6440eba49651" (UID: "231cfb67-39b4-4ebe-9bb7-6440eba49651"). InnerVolumeSpecName "kube-api-access-ngfkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.306746 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.341963 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "231cfb67-39b4-4ebe-9bb7-6440eba49651" (UID: "231cfb67-39b4-4ebe-9bb7-6440eba49651"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.408727 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/231cfb67-39b4-4ebe-9bb7-6440eba49651-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.408764 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngfkm\" (UniqueName: \"kubernetes.io/projected/231cfb67-39b4-4ebe-9bb7-6440eba49651-kube-api-access-ngfkm\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.716792 4971 generic.go:334] "Generic (PLEG): container finished" podID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerID="17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8" exitCode=0 Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.716843 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28hwc" event={"ID":"231cfb67-39b4-4ebe-9bb7-6440eba49651","Type":"ContainerDied","Data":"17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8"} Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.716874 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28hwc" event={"ID":"231cfb67-39b4-4ebe-9bb7-6440eba49651","Type":"ContainerDied","Data":"57f668151716959e36638e862cf939580c34dc149113dd4604fff736b5c97840"} Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.716867 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28hwc" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.716902 4971 scope.go:117] "RemoveContainer" containerID="17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.736725 4971 scope.go:117] "RemoveContainer" containerID="9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.757515 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28hwc"] Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.768894 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-28hwc"] Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.769998 4971 scope.go:117] "RemoveContainer" containerID="fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.806930 4971 scope.go:117] "RemoveContainer" containerID="17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8" Mar 20 09:14:24 crc kubenswrapper[4971]: E0320 09:14:24.807496 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8\": container with ID starting with 17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8 not found: ID does not exist" containerID="17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.807543 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8"} err="failed to get container status \"17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8\": rpc error: code = NotFound desc = could not find 
container \"17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8\": container with ID starting with 17ab991ed43785bc95cb9f82b5c32c6e8d5a0c6269020ea98beca14fbe5c8ec8 not found: ID does not exist" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.807576 4971 scope.go:117] "RemoveContainer" containerID="9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299" Mar 20 09:14:24 crc kubenswrapper[4971]: E0320 09:14:24.808005 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299\": container with ID starting with 9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299 not found: ID does not exist" containerID="9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.808039 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299"} err="failed to get container status \"9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299\": rpc error: code = NotFound desc = could not find container \"9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299\": container with ID starting with 9d1d62587425ed24dabcb642ab49b3e02c59bc791a16f0a074b873fcaf975299 not found: ID does not exist" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.808064 4971 scope.go:117] "RemoveContainer" containerID="fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5" Mar 20 09:14:24 crc kubenswrapper[4971]: E0320 09:14:24.808334 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5\": container with ID starting with fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5 not found: ID does 
not exist" containerID="fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5" Mar 20 09:14:24 crc kubenswrapper[4971]: I0320 09:14:24.808356 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5"} err="failed to get container status \"fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5\": rpc error: code = NotFound desc = could not find container \"fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5\": container with ID starting with fbada4f1fc26576a76f754103fc9baeca30b58a78b0852b81caf37ac49ef8aa5 not found: ID does not exist" Mar 20 09:14:26 crc kubenswrapper[4971]: I0320 09:14:26.756444 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" path="/var/lib/kubelet/pods/231cfb67-39b4-4ebe-9bb7-6440eba49651/volumes" Mar 20 09:14:30 crc kubenswrapper[4971]: I0320 09:14:30.732305 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:14:30 crc kubenswrapper[4971]: E0320 09:14:30.733085 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:14:45 crc kubenswrapper[4971]: I0320 09:14:45.732899 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:14:45 crc kubenswrapper[4971]: E0320 09:14:45.733880 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.152292 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m"] Mar 20 09:15:00 crc kubenswrapper[4971]: E0320 09:15:00.153354 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerName="extract-utilities" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.153371 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerName="extract-utilities" Mar 20 09:15:00 crc kubenswrapper[4971]: E0320 09:15:00.153392 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerName="registry-server" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.153399 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerName="registry-server" Mar 20 09:15:00 crc kubenswrapper[4971]: E0320 09:15:00.153420 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerName="extract-content" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.153428 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerName="extract-content" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.153635 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="231cfb67-39b4-4ebe-9bb7-6440eba49651" containerName="registry-server" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.154876 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.159214 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.164914 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.166237 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m"] Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.307712 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vc4\" (UniqueName: \"kubernetes.io/projected/e1718809-0da5-4da9-b763-b0b76232586f-kube-api-access-68vc4\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.307806 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1718809-0da5-4da9-b763-b0b76232586f-config-volume\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.307983 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1718809-0da5-4da9-b763-b0b76232586f-secret-volume\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.409972 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vc4\" (UniqueName: \"kubernetes.io/projected/e1718809-0da5-4da9-b763-b0b76232586f-kube-api-access-68vc4\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.410064 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1718809-0da5-4da9-b763-b0b76232586f-config-volume\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.410274 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1718809-0da5-4da9-b763-b0b76232586f-secret-volume\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.411413 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1718809-0da5-4da9-b763-b0b76232586f-config-volume\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.418764 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e1718809-0da5-4da9-b763-b0b76232586f-secret-volume\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.442916 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vc4\" (UniqueName: \"kubernetes.io/projected/e1718809-0da5-4da9-b763-b0b76232586f-kube-api-access-68vc4\") pod \"collect-profiles-29566635-tsd8m\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.483985 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.732883 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:15:00 crc kubenswrapper[4971]: I0320 09:15:00.938529 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m"] Mar 20 09:15:00 crc kubenswrapper[4971]: W0320 09:15:00.947975 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1718809_0da5_4da9_b763_b0b76232586f.slice/crio-92a1bd6be5ac2aeb8bb0682f80769ed9271017e0f465a6856790262b7b2fcd6d WatchSource:0}: Error finding container 92a1bd6be5ac2aeb8bb0682f80769ed9271017e0f465a6856790262b7b2fcd6d: Status 404 returned error can't find the container with id 92a1bd6be5ac2aeb8bb0682f80769ed9271017e0f465a6856790262b7b2fcd6d Mar 20 09:15:01 crc kubenswrapper[4971]: I0320 09:15:01.112190 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"733a160bea3a7b4da07aecd1f08779761a18f5919207a3afdf06511a9685618b"} Mar 20 09:15:01 crc kubenswrapper[4971]: I0320 09:15:01.115874 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" event={"ID":"e1718809-0da5-4da9-b763-b0b76232586f","Type":"ContainerStarted","Data":"8f45571c28b598a72dcb4feb7b476d79ed5a69076ec6df4a8f4edba338b5ae63"} Mar 20 09:15:01 crc kubenswrapper[4971]: I0320 09:15:01.115956 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" event={"ID":"e1718809-0da5-4da9-b763-b0b76232586f","Type":"ContainerStarted","Data":"92a1bd6be5ac2aeb8bb0682f80769ed9271017e0f465a6856790262b7b2fcd6d"} Mar 20 09:15:01 crc kubenswrapper[4971]: I0320 09:15:01.155119 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" podStartSLOduration=1.155100625 podStartE2EDuration="1.155100625s" podCreationTimestamp="2026-03-20 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:15:01.151427458 +0000 UTC m=+8723.131301596" watchObservedRunningTime="2026-03-20 09:15:01.155100625 +0000 UTC m=+8723.134974763" Mar 20 09:15:02 crc kubenswrapper[4971]: I0320 09:15:02.126884 4971 generic.go:334] "Generic (PLEG): container finished" podID="e1718809-0da5-4da9-b763-b0b76232586f" containerID="8f45571c28b598a72dcb4feb7b476d79ed5a69076ec6df4a8f4edba338b5ae63" exitCode=0 Mar 20 09:15:02 crc kubenswrapper[4971]: I0320 09:15:02.126940 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" 
event={"ID":"e1718809-0da5-4da9-b763-b0b76232586f","Type":"ContainerDied","Data":"8f45571c28b598a72dcb4feb7b476d79ed5a69076ec6df4a8f4edba338b5ae63"} Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.486701 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.593153 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1718809-0da5-4da9-b763-b0b76232586f-config-volume\") pod \"e1718809-0da5-4da9-b763-b0b76232586f\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.593825 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1718809-0da5-4da9-b763-b0b76232586f-secret-volume\") pod \"e1718809-0da5-4da9-b763-b0b76232586f\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.593982 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vc4\" (UniqueName: \"kubernetes.io/projected/e1718809-0da5-4da9-b763-b0b76232586f-kube-api-access-68vc4\") pod \"e1718809-0da5-4da9-b763-b0b76232586f\" (UID: \"e1718809-0da5-4da9-b763-b0b76232586f\") " Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.594160 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1718809-0da5-4da9-b763-b0b76232586f-config-volume" (OuterVolumeSpecName: "config-volume") pod "e1718809-0da5-4da9-b763-b0b76232586f" (UID: "e1718809-0da5-4da9-b763-b0b76232586f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.594577 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1718809-0da5-4da9-b763-b0b76232586f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.599766 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1718809-0da5-4da9-b763-b0b76232586f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e1718809-0da5-4da9-b763-b0b76232586f" (UID: "e1718809-0da5-4da9-b763-b0b76232586f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.600256 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1718809-0da5-4da9-b763-b0b76232586f-kube-api-access-68vc4" (OuterVolumeSpecName: "kube-api-access-68vc4") pod "e1718809-0da5-4da9-b763-b0b76232586f" (UID: "e1718809-0da5-4da9-b763-b0b76232586f"). InnerVolumeSpecName "kube-api-access-68vc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.696165 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1718809-0da5-4da9-b763-b0b76232586f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:03 crc kubenswrapper[4971]: I0320 09:15:03.696197 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vc4\" (UniqueName: \"kubernetes.io/projected/e1718809-0da5-4da9-b763-b0b76232586f-kube-api-access-68vc4\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:04 crc kubenswrapper[4971]: I0320 09:15:04.148586 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" event={"ID":"e1718809-0da5-4da9-b763-b0b76232586f","Type":"ContainerDied","Data":"92a1bd6be5ac2aeb8bb0682f80769ed9271017e0f465a6856790262b7b2fcd6d"} Mar 20 09:15:04 crc kubenswrapper[4971]: I0320 09:15:04.148660 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92a1bd6be5ac2aeb8bb0682f80769ed9271017e0f465a6856790262b7b2fcd6d" Mar 20 09:15:04 crc kubenswrapper[4971]: I0320 09:15:04.148730 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m" Mar 20 09:15:04 crc kubenswrapper[4971]: I0320 09:15:04.588736 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"] Mar 20 09:15:04 crc kubenswrapper[4971]: I0320 09:15:04.596680 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-59zps"] Mar 20 09:15:04 crc kubenswrapper[4971]: I0320 09:15:04.749234 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f04acd8-fa60-4df0-9149-a2875b62ff82" path="/var/lib/kubelet/pods/8f04acd8-fa60-4df0-9149-a2875b62ff82/volumes" Mar 20 09:15:13 crc kubenswrapper[4971]: I0320 09:15:13.683957 4971 scope.go:117] "RemoveContainer" containerID="89c65477f3ea5c66679f59924eb0a20b8141b950fcdd80e357b69d48d20cbf74" Mar 20 09:15:24 crc kubenswrapper[4971]: I0320 09:15:24.372504 4971 generic.go:334] "Generic (PLEG): container finished" podID="b6855ca9-d129-4c73-a2ed-cc7059287a06" containerID="46cee39ffb3b30ee48df0908009caa37b6d73cc395482902abc895fdc8dcd7c0" exitCode=0 Mar 20 09:15:24 crc kubenswrapper[4971]: I0320 09:15:24.372670 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-jcgvt" event={"ID":"b6855ca9-d129-4c73-a2ed-cc7059287a06","Type":"ContainerDied","Data":"46cee39ffb3b30ee48df0908009caa37b6d73cc395482902abc895fdc8dcd7c0"} Mar 20 09:15:25 crc kubenswrapper[4971]: I0320 09:15:25.796954 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:15:25 crc kubenswrapper[4971]: I0320 09:15:25.921077 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-inventory\") pod \"b6855ca9-d129-4c73-a2ed-cc7059287a06\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " Mar 20 09:15:25 crc kubenswrapper[4971]: I0320 09:15:25.921582 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws954\" (UniqueName: \"kubernetes.io/projected/b6855ca9-d129-4c73-a2ed-cc7059287a06-kube-api-access-ws954\") pod \"b6855ca9-d129-4c73-a2ed-cc7059287a06\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " Mar 20 09:15:25 crc kubenswrapper[4971]: I0320 09:15:25.921630 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-ssh-key-openstack-networker\") pod \"b6855ca9-d129-4c73-a2ed-cc7059287a06\" (UID: \"b6855ca9-d129-4c73-a2ed-cc7059287a06\") " Mar 20 09:15:25 crc kubenswrapper[4971]: I0320 09:15:25.933562 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6855ca9-d129-4c73-a2ed-cc7059287a06-kube-api-access-ws954" (OuterVolumeSpecName: "kube-api-access-ws954") pod "b6855ca9-d129-4c73-a2ed-cc7059287a06" (UID: "b6855ca9-d129-4c73-a2ed-cc7059287a06"). InnerVolumeSpecName "kube-api-access-ws954". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:15:25 crc kubenswrapper[4971]: I0320 09:15:25.958274 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "b6855ca9-d129-4c73-a2ed-cc7059287a06" (UID: "b6855ca9-d129-4c73-a2ed-cc7059287a06"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:15:25 crc kubenswrapper[4971]: I0320 09:15:25.965512 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-inventory" (OuterVolumeSpecName: "inventory") pod "b6855ca9-d129-4c73-a2ed-cc7059287a06" (UID: "b6855ca9-d129-4c73-a2ed-cc7059287a06"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.023549 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws954\" (UniqueName: \"kubernetes.io/projected/b6855ca9-d129-4c73-a2ed-cc7059287a06-kube-api-access-ws954\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.023580 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.023591 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6855ca9-d129-4c73-a2ed-cc7059287a06-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.392929 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-jcgvt" 
event={"ID":"b6855ca9-d129-4c73-a2ed-cc7059287a06","Type":"ContainerDied","Data":"53aaac2dc2b60060f7b48c45ab2474dac6e3400ffa92d3fe75a6ac855f109804"} Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.392968 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53aaac2dc2b60060f7b48c45ab2474dac6e3400ffa92d3fe75a6ac855f109804" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.392998 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-jcgvt" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.486280 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-networker-lgllz"] Mar 20 09:15:26 crc kubenswrapper[4971]: E0320 09:15:26.486834 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1718809-0da5-4da9-b763-b0b76232586f" containerName="collect-profiles" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.486859 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1718809-0da5-4da9-b763-b0b76232586f" containerName="collect-profiles" Mar 20 09:15:26 crc kubenswrapper[4971]: E0320 09:15:26.486886 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6855ca9-d129-4c73-a2ed-cc7059287a06" containerName="download-cache-openstack-openstack-networker" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.486896 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6855ca9-d129-4c73-a2ed-cc7059287a06" containerName="download-cache-openstack-openstack-networker" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.487133 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6855ca9-d129-4c73-a2ed-cc7059287a06" containerName="download-cache-openstack-openstack-networker" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.487159 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e1718809-0da5-4da9-b763-b0b76232586f" containerName="collect-profiles" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.488100 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.490701 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.491557 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.508725 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-lgllz"] Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.634291 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-inventory\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.635171 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.635213 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2sm\" (UniqueName: 
\"kubernetes.io/projected/243abc57-20e2-431d-8714-e3ffd3a27ea6-kube-api-access-zq2sm\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.737142 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.737517 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2sm\" (UniqueName: \"kubernetes.io/projected/243abc57-20e2-431d-8714-e3ffd3a27ea6-kube-api-access-zq2sm\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.737879 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-inventory\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.745809 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-inventory\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " 
pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.753302 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.781700 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2sm\" (UniqueName: \"kubernetes.io/projected/243abc57-20e2-431d-8714-e3ffd3a27ea6-kube-api-access-zq2sm\") pod \"configure-network-openstack-openstack-networker-lgllz\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:26 crc kubenswrapper[4971]: I0320 09:15:26.826006 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:15:27 crc kubenswrapper[4971]: I0320 09:15:27.406733 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-lgllz"] Mar 20 09:15:28 crc kubenswrapper[4971]: I0320 09:15:28.415036 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-lgllz" event={"ID":"243abc57-20e2-431d-8714-e3ffd3a27ea6","Type":"ContainerStarted","Data":"d46e23f67b0d8b2cf5483272da753834b6af1e4b560dc0bb9d6d7d34fe06e97b"} Mar 20 09:15:28 crc kubenswrapper[4971]: I0320 09:15:28.415676 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-lgllz" event={"ID":"243abc57-20e2-431d-8714-e3ffd3a27ea6","Type":"ContainerStarted","Data":"b25e558aa8a6ff81b38a806190ddfa6beb15dbdb8690abf24fd3db5937a63112"} Mar 20 09:15:28 crc kubenswrapper[4971]: I0320 09:15:28.438631 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-networker-lgllz" podStartSLOduration=1.946741813 podStartE2EDuration="2.438583983s" podCreationTimestamp="2026-03-20 09:15:26 +0000 UTC" firstStartedPulling="2026-03-20 09:15:27.415934183 +0000 UTC m=+8749.395808311" lastFinishedPulling="2026-03-20 09:15:27.907776343 +0000 UTC m=+8749.887650481" observedRunningTime="2026-03-20 09:15:28.429652807 +0000 UTC m=+8750.409526955" watchObservedRunningTime="2026-03-20 09:15:28.438583983 +0000 UTC m=+8750.418458121" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.156079 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566636-g9l8t"] Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.158251 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-g9l8t" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.162206 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.162413 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.164361 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-g9l8t"] Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.166999 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.224229 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cq4\" (UniqueName: \"kubernetes.io/projected/7f126602-4ade-4262-b6f8-66432bca8316-kube-api-access-z7cq4\") pod \"auto-csr-approver-29566636-g9l8t\" (UID: \"7f126602-4ade-4262-b6f8-66432bca8316\") " pod="openshift-infra/auto-csr-approver-29566636-g9l8t" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.326385 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cq4\" (UniqueName: \"kubernetes.io/projected/7f126602-4ade-4262-b6f8-66432bca8316-kube-api-access-z7cq4\") pod \"auto-csr-approver-29566636-g9l8t\" (UID: \"7f126602-4ade-4262-b6f8-66432bca8316\") " pod="openshift-infra/auto-csr-approver-29566636-g9l8t" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.349252 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cq4\" (UniqueName: \"kubernetes.io/projected/7f126602-4ade-4262-b6f8-66432bca8316-kube-api-access-z7cq4\") pod \"auto-csr-approver-29566636-g9l8t\" (UID: \"7f126602-4ade-4262-b6f8-66432bca8316\") " 
pod="openshift-infra/auto-csr-approver-29566636-g9l8t" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.479928 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-g9l8t" Mar 20 09:16:00 crc kubenswrapper[4971]: I0320 09:16:00.952824 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-g9l8t"] Mar 20 09:16:01 crc kubenswrapper[4971]: I0320 09:16:01.752509 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-g9l8t" event={"ID":"7f126602-4ade-4262-b6f8-66432bca8316","Type":"ContainerStarted","Data":"b170f14f93a7d1e57fee8ae16a5ce1cefec486119a81a4bad47bd922e015c13c"} Mar 20 09:16:02 crc kubenswrapper[4971]: I0320 09:16:02.767794 4971 generic.go:334] "Generic (PLEG): container finished" podID="7f126602-4ade-4262-b6f8-66432bca8316" containerID="7a61712b25ae4d3ab4e74825cf2ce20f492c921590cb44127d530b2feabaf6ca" exitCode=0 Mar 20 09:16:02 crc kubenswrapper[4971]: I0320 09:16:02.767845 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-g9l8t" event={"ID":"7f126602-4ade-4262-b6f8-66432bca8316","Type":"ContainerDied","Data":"7a61712b25ae4d3ab4e74825cf2ce20f492c921590cb44127d530b2feabaf6ca"} Mar 20 09:16:04 crc kubenswrapper[4971]: I0320 09:16:04.244254 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-g9l8t" Mar 20 09:16:04 crc kubenswrapper[4971]: I0320 09:16:04.306769 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7cq4\" (UniqueName: \"kubernetes.io/projected/7f126602-4ade-4262-b6f8-66432bca8316-kube-api-access-z7cq4\") pod \"7f126602-4ade-4262-b6f8-66432bca8316\" (UID: \"7f126602-4ade-4262-b6f8-66432bca8316\") " Mar 20 09:16:04 crc kubenswrapper[4971]: I0320 09:16:04.337323 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f126602-4ade-4262-b6f8-66432bca8316-kube-api-access-z7cq4" (OuterVolumeSpecName: "kube-api-access-z7cq4") pod "7f126602-4ade-4262-b6f8-66432bca8316" (UID: "7f126602-4ade-4262-b6f8-66432bca8316"). InnerVolumeSpecName "kube-api-access-z7cq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:16:04 crc kubenswrapper[4971]: I0320 09:16:04.409317 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7cq4\" (UniqueName: \"kubernetes.io/projected/7f126602-4ade-4262-b6f8-66432bca8316-kube-api-access-z7cq4\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:04 crc kubenswrapper[4971]: I0320 09:16:04.792009 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-g9l8t" event={"ID":"7f126602-4ade-4262-b6f8-66432bca8316","Type":"ContainerDied","Data":"b170f14f93a7d1e57fee8ae16a5ce1cefec486119a81a4bad47bd922e015c13c"} Mar 20 09:16:04 crc kubenswrapper[4971]: I0320 09:16:04.792307 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b170f14f93a7d1e57fee8ae16a5ce1cefec486119a81a4bad47bd922e015c13c" Mar 20 09:16:04 crc kubenswrapper[4971]: I0320 09:16:04.792259 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-g9l8t" Mar 20 09:16:05 crc kubenswrapper[4971]: I0320 09:16:05.312535 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-2mnrg"] Mar 20 09:16:05 crc kubenswrapper[4971]: I0320 09:16:05.321165 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-2mnrg"] Mar 20 09:16:06 crc kubenswrapper[4971]: I0320 09:16:06.746214 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="516be8fc-f457-4f59-ba34-6eccaf7e8ff3" path="/var/lib/kubelet/pods/516be8fc-f457-4f59-ba34-6eccaf7e8ff3/volumes" Mar 20 09:16:09 crc kubenswrapper[4971]: I0320 09:16:09.837426 4971 generic.go:334] "Generic (PLEG): container finished" podID="efb74ee9-ca4d-4087-a070-4b3f17683800" containerID="8eb819a61371fb67c66e88d082d9e0db898f11cdc958c21efa24257c2d96f0f1" exitCode=0 Mar 20 09:16:09 crc kubenswrapper[4971]: I0320 09:16:09.837543 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" event={"ID":"efb74ee9-ca4d-4087-a070-4b3f17683800","Type":"ContainerDied","Data":"8eb819a61371fb67c66e88d082d9e0db898f11cdc958c21efa24257c2d96f0f1"} Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.354405 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.498921 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qhsf\" (UniqueName: \"kubernetes.io/projected/efb74ee9-ca4d-4087-a070-4b3f17683800-kube-api-access-5qhsf\") pod \"efb74ee9-ca4d-4087-a070-4b3f17683800\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.498972 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ssh-key-openstack-cell1\") pod \"efb74ee9-ca4d-4087-a070-4b3f17683800\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.499000 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ceph\") pod \"efb74ee9-ca4d-4087-a070-4b3f17683800\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.499123 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-inventory\") pod \"efb74ee9-ca4d-4087-a070-4b3f17683800\" (UID: \"efb74ee9-ca4d-4087-a070-4b3f17683800\") " Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.504991 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb74ee9-ca4d-4087-a070-4b3f17683800-kube-api-access-5qhsf" (OuterVolumeSpecName: "kube-api-access-5qhsf") pod "efb74ee9-ca4d-4087-a070-4b3f17683800" (UID: "efb74ee9-ca4d-4087-a070-4b3f17683800"). InnerVolumeSpecName "kube-api-access-5qhsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.505685 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ceph" (OuterVolumeSpecName: "ceph") pod "efb74ee9-ca4d-4087-a070-4b3f17683800" (UID: "efb74ee9-ca4d-4087-a070-4b3f17683800"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.530174 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-inventory" (OuterVolumeSpecName: "inventory") pod "efb74ee9-ca4d-4087-a070-4b3f17683800" (UID: "efb74ee9-ca4d-4087-a070-4b3f17683800"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.535227 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "efb74ee9-ca4d-4087-a070-4b3f17683800" (UID: "efb74ee9-ca4d-4087-a070-4b3f17683800"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.601694 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.601733 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qhsf\" (UniqueName: \"kubernetes.io/projected/efb74ee9-ca4d-4087-a070-4b3f17683800-kube-api-access-5qhsf\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.601749 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.601762 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/efb74ee9-ca4d-4087-a070-4b3f17683800-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.862176 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" event={"ID":"efb74ee9-ca4d-4087-a070-4b3f17683800","Type":"ContainerDied","Data":"24a80bf056a5fe4c9a307736e2b4fd65b597f35b2140b7a3f8b103ae82016531"} Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.862230 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24a80bf056a5fe4c9a307736e2b4fd65b597f35b2140b7a3f8b103ae82016531" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.862230 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-grhlv" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.970124 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-x6q7m"] Mar 20 09:16:11 crc kubenswrapper[4971]: E0320 09:16:11.970808 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb74ee9-ca4d-4087-a070-4b3f17683800" containerName="download-cache-openstack-openstack-cell1" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.970842 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb74ee9-ca4d-4087-a070-4b3f17683800" containerName="download-cache-openstack-openstack-cell1" Mar 20 09:16:11 crc kubenswrapper[4971]: E0320 09:16:11.970867 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f126602-4ade-4262-b6f8-66432bca8316" containerName="oc" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.970880 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f126602-4ade-4262-b6f8-66432bca8316" containerName="oc" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.971220 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f126602-4ade-4262-b6f8-66432bca8316" containerName="oc" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.971262 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb74ee9-ca4d-4087-a070-4b3f17683800" containerName="download-cache-openstack-openstack-cell1" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.972518 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.975770 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.975922 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:16:11 crc kubenswrapper[4971]: I0320 09:16:11.990537 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-x6q7m"] Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.111871 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-inventory\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.112215 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.112277 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vv8x\" (UniqueName: \"kubernetes.io/projected/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-kube-api-access-6vv8x\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 
20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.112345 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ceph\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.214329 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-inventory\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.214424 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.214551 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vv8x\" (UniqueName: \"kubernetes.io/projected/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-kube-api-access-6vv8x\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.214662 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ceph\") pod 
\"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.220147 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-inventory\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.220338 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.221166 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ceph\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.231180 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vv8x\" (UniqueName: \"kubernetes.io/projected/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-kube-api-access-6vv8x\") pod \"configure-network-openstack-openstack-cell1-x6q7m\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.306340 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:16:12 crc kubenswrapper[4971]: I0320 09:16:12.898796 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-x6q7m"] Mar 20 09:16:13 crc kubenswrapper[4971]: I0320 09:16:13.786425 4971 scope.go:117] "RemoveContainer" containerID="78199fefffed876b749f7f23e987cd04442bc91d340191b35a280ad11a390601" Mar 20 09:16:13 crc kubenswrapper[4971]: I0320 09:16:13.885976 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" event={"ID":"1af0481a-a5e1-42e4-aa6e-ac3d63407b66","Type":"ContainerStarted","Data":"17ec3809f0329421b89d745ab591f510d077fc41101d383a0b8c9e0a6f5d420b"} Mar 20 09:16:13 crc kubenswrapper[4971]: I0320 09:16:13.886023 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" event={"ID":"1af0481a-a5e1-42e4-aa6e-ac3d63407b66","Type":"ContainerStarted","Data":"fd4adbee05444f33d2fd77605caaf20f154615b5ef0aac929b8efcbedf89e665"} Mar 20 09:16:13 crc kubenswrapper[4971]: I0320 09:16:13.910479 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" podStartSLOduration=2.450915977 podStartE2EDuration="2.910375921s" podCreationTimestamp="2026-03-20 09:16:11 +0000 UTC" firstStartedPulling="2026-03-20 09:16:12.913297617 +0000 UTC m=+8794.893171755" lastFinishedPulling="2026-03-20 09:16:13.372757561 +0000 UTC m=+8795.352631699" observedRunningTime="2026-03-20 09:16:13.905532383 +0000 UTC m=+8795.885406531" watchObservedRunningTime="2026-03-20 09:16:13.910375921 +0000 UTC m=+8795.890250069" Mar 20 09:16:28 crc kubenswrapper[4971]: I0320 09:16:28.018869 4971 generic.go:334] "Generic (PLEG): container finished" podID="243abc57-20e2-431d-8714-e3ffd3a27ea6" 
containerID="d46e23f67b0d8b2cf5483272da753834b6af1e4b560dc0bb9d6d7d34fe06e97b" exitCode=0 Mar 20 09:16:28 crc kubenswrapper[4971]: I0320 09:16:28.018965 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-lgllz" event={"ID":"243abc57-20e2-431d-8714-e3ffd3a27ea6","Type":"ContainerDied","Data":"d46e23f67b0d8b2cf5483272da753834b6af1e4b560dc0bb9d6d7d34fe06e97b"} Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.557814 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.692653 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-inventory\") pod \"243abc57-20e2-431d-8714-e3ffd3a27ea6\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.693751 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-ssh-key-openstack-networker\") pod \"243abc57-20e2-431d-8714-e3ffd3a27ea6\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.693837 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq2sm\" (UniqueName: \"kubernetes.io/projected/243abc57-20e2-431d-8714-e3ffd3a27ea6-kube-api-access-zq2sm\") pod \"243abc57-20e2-431d-8714-e3ffd3a27ea6\" (UID: \"243abc57-20e2-431d-8714-e3ffd3a27ea6\") " Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.698113 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243abc57-20e2-431d-8714-e3ffd3a27ea6-kube-api-access-zq2sm" (OuterVolumeSpecName: "kube-api-access-zq2sm") 
pod "243abc57-20e2-431d-8714-e3ffd3a27ea6" (UID: "243abc57-20e2-431d-8714-e3ffd3a27ea6"). InnerVolumeSpecName "kube-api-access-zq2sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.721108 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "243abc57-20e2-431d-8714-e3ffd3a27ea6" (UID: "243abc57-20e2-431d-8714-e3ffd3a27ea6"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.721849 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-inventory" (OuterVolumeSpecName: "inventory") pod "243abc57-20e2-431d-8714-e3ffd3a27ea6" (UID: "243abc57-20e2-431d-8714-e3ffd3a27ea6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.796166 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq2sm\" (UniqueName: \"kubernetes.io/projected/243abc57-20e2-431d-8714-e3ffd3a27ea6-kube-api-access-zq2sm\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.796192 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:29 crc kubenswrapper[4971]: I0320 09:16:29.796202 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/243abc57-20e2-431d-8714-e3ffd3a27ea6-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.034815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-lgllz" event={"ID":"243abc57-20e2-431d-8714-e3ffd3a27ea6","Type":"ContainerDied","Data":"b25e558aa8a6ff81b38a806190ddfa6beb15dbdb8690abf24fd3db5937a63112"} Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.035098 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b25e558aa8a6ff81b38a806190ddfa6beb15dbdb8690abf24fd3db5937a63112" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.034901 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-lgllz" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.108435 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-networker-255gb"] Mar 20 09:16:30 crc kubenswrapper[4971]: E0320 09:16:30.109129 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243abc57-20e2-431d-8714-e3ffd3a27ea6" containerName="configure-network-openstack-openstack-networker" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.109152 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="243abc57-20e2-431d-8714-e3ffd3a27ea6" containerName="configure-network-openstack-openstack-networker" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.109341 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="243abc57-20e2-431d-8714-e3ffd3a27ea6" containerName="configure-network-openstack-openstack-networker" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.110090 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.113404 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.113553 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.128723 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-255gb"] Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.205330 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.205821 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pwkn\" (UniqueName: \"kubernetes.io/projected/d17f937d-4808-496f-94ba-f85e03c371ff-kube-api-access-9pwkn\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.205969 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-inventory\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " 
pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.308526 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pwkn\" (UniqueName: \"kubernetes.io/projected/d17f937d-4808-496f-94ba-f85e03c371ff-kube-api-access-9pwkn\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.308632 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-inventory\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.308746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.313226 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.314296 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-inventory\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.328126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pwkn\" (UniqueName: \"kubernetes.io/projected/d17f937d-4808-496f-94ba-f85e03c371ff-kube-api-access-9pwkn\") pod \"validate-network-openstack-openstack-networker-255gb\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.427589 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:30 crc kubenswrapper[4971]: I0320 09:16:30.998129 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-255gb"] Mar 20 09:16:31 crc kubenswrapper[4971]: I0320 09:16:31.051261 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-255gb" event={"ID":"d17f937d-4808-496f-94ba-f85e03c371ff","Type":"ContainerStarted","Data":"5540c80f858d9d321b5eaad39dba1b1439741f4d23207a6ad699375b991a8d3b"} Mar 20 09:16:32 crc kubenswrapper[4971]: I0320 09:16:32.060534 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-255gb" event={"ID":"d17f937d-4808-496f-94ba-f85e03c371ff","Type":"ContainerStarted","Data":"6bd334a136604fe2f9e143e967fd8fb5663f1fcd52910be7d684185cc104eaa5"} Mar 20 09:16:32 crc kubenswrapper[4971]: I0320 09:16:32.078349 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-openstack-openstack-networker-255gb" podStartSLOduration=1.579005175 podStartE2EDuration="2.078334143s" podCreationTimestamp="2026-03-20 09:16:30 +0000 UTC" firstStartedPulling="2026-03-20 09:16:31.000841074 +0000 UTC m=+8812.980715212" lastFinishedPulling="2026-03-20 09:16:31.500170022 +0000 UTC m=+8813.480044180" observedRunningTime="2026-03-20 09:16:32.076239238 +0000 UTC m=+8814.056113376" watchObservedRunningTime="2026-03-20 09:16:32.078334143 +0000 UTC m=+8814.058208281" Mar 20 09:16:37 crc kubenswrapper[4971]: I0320 09:16:37.102791 4971 generic.go:334] "Generic (PLEG): container finished" podID="d17f937d-4808-496f-94ba-f85e03c371ff" containerID="6bd334a136604fe2f9e143e967fd8fb5663f1fcd52910be7d684185cc104eaa5" exitCode=0 Mar 20 09:16:37 crc kubenswrapper[4971]: I0320 09:16:37.102880 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-255gb" event={"ID":"d17f937d-4808-496f-94ba-f85e03c371ff","Type":"ContainerDied","Data":"6bd334a136604fe2f9e143e967fd8fb5663f1fcd52910be7d684185cc104eaa5"} Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.592674 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.680283 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-inventory\") pod \"d17f937d-4808-496f-94ba-f85e03c371ff\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.680461 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pwkn\" (UniqueName: \"kubernetes.io/projected/d17f937d-4808-496f-94ba-f85e03c371ff-kube-api-access-9pwkn\") pod \"d17f937d-4808-496f-94ba-f85e03c371ff\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.680559 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-ssh-key-openstack-networker\") pod \"d17f937d-4808-496f-94ba-f85e03c371ff\" (UID: \"d17f937d-4808-496f-94ba-f85e03c371ff\") " Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.685941 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17f937d-4808-496f-94ba-f85e03c371ff-kube-api-access-9pwkn" (OuterVolumeSpecName: "kube-api-access-9pwkn") pod "d17f937d-4808-496f-94ba-f85e03c371ff" (UID: "d17f937d-4808-496f-94ba-f85e03c371ff"). InnerVolumeSpecName "kube-api-access-9pwkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.706588 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "d17f937d-4808-496f-94ba-f85e03c371ff" (UID: "d17f937d-4808-496f-94ba-f85e03c371ff"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.708259 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-inventory" (OuterVolumeSpecName: "inventory") pod "d17f937d-4808-496f-94ba-f85e03c371ff" (UID: "d17f937d-4808-496f-94ba-f85e03c371ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.782491 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.782522 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pwkn\" (UniqueName: \"kubernetes.io/projected/d17f937d-4808-496f-94ba-f85e03c371ff-kube-api-access-9pwkn\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:38 crc kubenswrapper[4971]: I0320 09:16:38.782533 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d17f937d-4808-496f-94ba-f85e03c371ff-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.120950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-255gb" 
event={"ID":"d17f937d-4808-496f-94ba-f85e03c371ff","Type":"ContainerDied","Data":"5540c80f858d9d321b5eaad39dba1b1439741f4d23207a6ad699375b991a8d3b"} Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.121020 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-255gb" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.121038 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5540c80f858d9d321b5eaad39dba1b1439741f4d23207a6ad699375b991a8d3b" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.210154 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-networker-n7gwd"] Mar 20 09:16:39 crc kubenswrapper[4971]: E0320 09:16:39.210601 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17f937d-4808-496f-94ba-f85e03c371ff" containerName="validate-network-openstack-openstack-networker" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.210644 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17f937d-4808-496f-94ba-f85e03c371ff" containerName="validate-network-openstack-openstack-networker" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.210861 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17f937d-4808-496f-94ba-f85e03c371ff" containerName="validate-network-openstack-openstack-networker" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.211595 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.214350 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.217324 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.224679 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-n7gwd"] Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.292845 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.292935 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65zb4\" (UniqueName: \"kubernetes.io/projected/e8634c18-a188-491c-96ee-8c23c7ec8a67-kube-api-access-65zb4\") pod \"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.292958 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-inventory\") pod \"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 
09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.394560 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65zb4\" (UniqueName: \"kubernetes.io/projected/e8634c18-a188-491c-96ee-8c23c7ec8a67-kube-api-access-65zb4\") pod \"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.394646 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-inventory\") pod \"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.394815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.399042 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.399230 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-inventory\") pod 
\"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.415886 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65zb4\" (UniqueName: \"kubernetes.io/projected/e8634c18-a188-491c-96ee-8c23c7ec8a67-kube-api-access-65zb4\") pod \"install-os-openstack-openstack-networker-n7gwd\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") " pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:39 crc kubenswrapper[4971]: I0320 09:16:39.529642 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-n7gwd" Mar 20 09:16:40 crc kubenswrapper[4971]: I0320 09:16:40.074717 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-n7gwd"] Mar 20 09:16:40 crc kubenswrapper[4971]: I0320 09:16:40.129270 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-n7gwd" event={"ID":"e8634c18-a188-491c-96ee-8c23c7ec8a67","Type":"ContainerStarted","Data":"810538d32fcc2890eaaccd7a8f79a3a637a1ab8dc665be5ab6cc3263c5179948"} Mar 20 09:16:41 crc kubenswrapper[4971]: I0320 09:16:41.150036 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-n7gwd" event={"ID":"e8634c18-a188-491c-96ee-8c23c7ec8a67","Type":"ContainerStarted","Data":"9f0826a1d170b847b81b74cbc39c20b29c531e44e1035d2e9158ab9a5390d2e2"} Mar 20 09:16:41 crc kubenswrapper[4971]: I0320 09:16:41.170196 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-networker-n7gwd" podStartSLOduration=1.745393008 podStartE2EDuration="2.170173985s" podCreationTimestamp="2026-03-20 09:16:39 +0000 UTC" firstStartedPulling="2026-03-20 
09:16:40.073721446 +0000 UTC m=+8822.053595584" lastFinishedPulling="2026-03-20 09:16:40.498502423 +0000 UTC m=+8822.478376561" observedRunningTime="2026-03-20 09:16:41.167093564 +0000 UTC m=+8823.146967712" watchObservedRunningTime="2026-03-20 09:16:41.170173985 +0000 UTC m=+8823.150048123" Mar 20 09:17:12 crc kubenswrapper[4971]: I0320 09:17:12.469721 4971 generic.go:334] "Generic (PLEG): container finished" podID="1af0481a-a5e1-42e4-aa6e-ac3d63407b66" containerID="17ec3809f0329421b89d745ab591f510d077fc41101d383a0b8c9e0a6f5d420b" exitCode=0 Mar 20 09:17:12 crc kubenswrapper[4971]: I0320 09:17:12.469825 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" event={"ID":"1af0481a-a5e1-42e4-aa6e-ac3d63407b66","Type":"ContainerDied","Data":"17ec3809f0329421b89d745ab591f510d077fc41101d383a0b8c9e0a6f5d420b"} Mar 20 09:17:13 crc kubenswrapper[4971]: I0320 09:17:13.953407 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.115189 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ceph\") pod \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.115319 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-inventory\") pod \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.115480 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ssh-key-openstack-cell1\") pod \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.115645 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vv8x\" (UniqueName: \"kubernetes.io/projected/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-kube-api-access-6vv8x\") pod \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\" (UID: \"1af0481a-a5e1-42e4-aa6e-ac3d63407b66\") " Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.121138 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ceph" (OuterVolumeSpecName: "ceph") pod "1af0481a-a5e1-42e4-aa6e-ac3d63407b66" (UID: "1af0481a-a5e1-42e4-aa6e-ac3d63407b66"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.121383 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-kube-api-access-6vv8x" (OuterVolumeSpecName: "kube-api-access-6vv8x") pod "1af0481a-a5e1-42e4-aa6e-ac3d63407b66" (UID: "1af0481a-a5e1-42e4-aa6e-ac3d63407b66"). InnerVolumeSpecName "kube-api-access-6vv8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.142085 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "1af0481a-a5e1-42e4-aa6e-ac3d63407b66" (UID: "1af0481a-a5e1-42e4-aa6e-ac3d63407b66"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.152103 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-inventory" (OuterVolumeSpecName: "inventory") pod "1af0481a-a5e1-42e4-aa6e-ac3d63407b66" (UID: "1af0481a-a5e1-42e4-aa6e-ac3d63407b66"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.218054 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.218106 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.218127 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.218142 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vv8x\" (UniqueName: \"kubernetes.io/projected/1af0481a-a5e1-42e4-aa6e-ac3d63407b66-kube-api-access-6vv8x\") on node \"crc\" DevicePath \"\"" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.515307 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" event={"ID":"1af0481a-a5e1-42e4-aa6e-ac3d63407b66","Type":"ContainerDied","Data":"fd4adbee05444f33d2fd77605caaf20f154615b5ef0aac929b8efcbedf89e665"} Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.516280 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd4adbee05444f33d2fd77605caaf20f154615b5ef0aac929b8efcbedf89e665" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.515931 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-x6q7m" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.584573 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zl54b"] Mar 20 09:17:14 crc kubenswrapper[4971]: E0320 09:17:14.585154 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af0481a-a5e1-42e4-aa6e-ac3d63407b66" containerName="configure-network-openstack-openstack-cell1" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.585181 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af0481a-a5e1-42e4-aa6e-ac3d63407b66" containerName="configure-network-openstack-openstack-cell1" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.585436 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af0481a-a5e1-42e4-aa6e-ac3d63407b66" containerName="configure-network-openstack-openstack-cell1" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.586379 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.588815 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.588974 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.596185 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zl54b"] Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.726132 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rbk\" (UniqueName: \"kubernetes.io/projected/5992b06a-eb27-4e13-9086-b1cc8a80cca7-kube-api-access-44rbk\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.726377 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.726591 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ceph\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc 
kubenswrapper[4971]: I0320 09:17:14.726652 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-inventory\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.828933 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.829117 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ceph\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.829155 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-inventory\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.829222 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rbk\" (UniqueName: \"kubernetes.io/projected/5992b06a-eb27-4e13-9086-b1cc8a80cca7-kube-api-access-44rbk\") pod 
\"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.835088 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-inventory\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.835334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.840163 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ceph\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.847119 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rbk\" (UniqueName: \"kubernetes.io/projected/5992b06a-eb27-4e13-9086-b1cc8a80cca7-kube-api-access-44rbk\") pod \"validate-network-openstack-openstack-cell1-zl54b\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") " pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:14 crc kubenswrapper[4971]: I0320 09:17:14.938987 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zl54b" Mar 20 09:17:15 crc kubenswrapper[4971]: W0320 09:17:15.453241 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5992b06a_eb27_4e13_9086_b1cc8a80cca7.slice/crio-12ff43a56e4fefe090be8eb00cdaba4ee17fbc76dadcd672941f5b9bc5637a4c WatchSource:0}: Error finding container 12ff43a56e4fefe090be8eb00cdaba4ee17fbc76dadcd672941f5b9bc5637a4c: Status 404 returned error can't find the container with id 12ff43a56e4fefe090be8eb00cdaba4ee17fbc76dadcd672941f5b9bc5637a4c Mar 20 09:17:15 crc kubenswrapper[4971]: I0320 09:17:15.454130 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zl54b"] Mar 20 09:17:15 crc kubenswrapper[4971]: I0320 09:17:15.525348 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zl54b" event={"ID":"5992b06a-eb27-4e13-9086-b1cc8a80cca7","Type":"ContainerStarted","Data":"12ff43a56e4fefe090be8eb00cdaba4ee17fbc76dadcd672941f5b9bc5637a4c"} Mar 20 09:17:17 crc kubenswrapper[4971]: I0320 09:17:17.544106 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zl54b" event={"ID":"5992b06a-eb27-4e13-9086-b1cc8a80cca7","Type":"ContainerStarted","Data":"e07bc045f09f96052a2cdee854337fb6d8a2638fda4574ffc353f9b2144aea7f"} Mar 20 09:17:17 crc kubenswrapper[4971]: I0320 09:17:17.568053 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-zl54b" podStartSLOduration=2.375105685 podStartE2EDuration="3.568020273s" podCreationTimestamp="2026-03-20 09:17:14 +0000 UTC" firstStartedPulling="2026-03-20 09:17:15.45553188 +0000 UTC m=+8857.435406018" lastFinishedPulling="2026-03-20 09:17:16.648446468 +0000 UTC m=+8858.628320606" observedRunningTime="2026-03-20 
09:17:17.564226273 +0000 UTC m=+8859.544100431" watchObservedRunningTime="2026-03-20 09:17:17.568020273 +0000 UTC m=+8859.547894451" Mar 20 09:17:20 crc kubenswrapper[4971]: I0320 09:17:20.162519 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:17:20 crc kubenswrapper[4971]: I0320 09:17:20.162847 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:17:22 crc kubenswrapper[4971]: I0320 09:17:22.590224 4971 generic.go:334] "Generic (PLEG): container finished" podID="5992b06a-eb27-4e13-9086-b1cc8a80cca7" containerID="e07bc045f09f96052a2cdee854337fb6d8a2638fda4574ffc353f9b2144aea7f" exitCode=0 Mar 20 09:17:22 crc kubenswrapper[4971]: I0320 09:17:22.590285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zl54b" event={"ID":"5992b06a-eb27-4e13-9086-b1cc8a80cca7","Type":"ContainerDied","Data":"e07bc045f09f96052a2cdee854337fb6d8a2638fda4574ffc353f9b2144aea7f"} Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.121310 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zl54b"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.228531 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ssh-key-openstack-cell1\") pod \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") "
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.228631 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-inventory\") pod \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") "
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.228672 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44rbk\" (UniqueName: \"kubernetes.io/projected/5992b06a-eb27-4e13-9086-b1cc8a80cca7-kube-api-access-44rbk\") pod \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") "
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.228799 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ceph\") pod \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\" (UID: \"5992b06a-eb27-4e13-9086-b1cc8a80cca7\") "
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.237774 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5992b06a-eb27-4e13-9086-b1cc8a80cca7-kube-api-access-44rbk" (OuterVolumeSpecName: "kube-api-access-44rbk") pod "5992b06a-eb27-4e13-9086-b1cc8a80cca7" (UID: "5992b06a-eb27-4e13-9086-b1cc8a80cca7"). InnerVolumeSpecName "kube-api-access-44rbk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.237803 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ceph" (OuterVolumeSpecName: "ceph") pod "5992b06a-eb27-4e13-9086-b1cc8a80cca7" (UID: "5992b06a-eb27-4e13-9086-b1cc8a80cca7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.269456 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-inventory" (OuterVolumeSpecName: "inventory") pod "5992b06a-eb27-4e13-9086-b1cc8a80cca7" (UID: "5992b06a-eb27-4e13-9086-b1cc8a80cca7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.281139 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5992b06a-eb27-4e13-9086-b1cc8a80cca7" (UID: "5992b06a-eb27-4e13-9086-b1cc8a80cca7"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.331233 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.331276 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.331288 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44rbk\" (UniqueName: \"kubernetes.io/projected/5992b06a-eb27-4e13-9086-b1cc8a80cca7-kube-api-access-44rbk\") on node \"crc\" DevicePath \"\""
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.331298 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5992b06a-eb27-4e13-9086-b1cc8a80cca7-ceph\") on node \"crc\" DevicePath \"\""
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.610024 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zl54b"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.609998 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zl54b" event={"ID":"5992b06a-eb27-4e13-9086-b1cc8a80cca7","Type":"ContainerDied","Data":"12ff43a56e4fefe090be8eb00cdaba4ee17fbc76dadcd672941f5b9bc5637a4c"}
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.610074 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12ff43a56e4fefe090be8eb00cdaba4ee17fbc76dadcd672941f5b9bc5637a4c"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.750141 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8z2kz"]
Mar 20 09:17:24 crc kubenswrapper[4971]: E0320 09:17:24.750459 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5992b06a-eb27-4e13-9086-b1cc8a80cca7" containerName="validate-network-openstack-openstack-cell1"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.750475 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5992b06a-eb27-4e13-9086-b1cc8a80cca7" containerName="validate-network-openstack-openstack-cell1"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.750717 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5992b06a-eb27-4e13-9086-b1cc8a80cca7" containerName="validate-network-openstack-openstack-cell1"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.751411 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.758472 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8z2kz"]
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.759974 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.760193 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.842256 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.843755 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-inventory\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.843837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ceph\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.843959 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6nwg\" (UniqueName: \"kubernetes.io/projected/024091ee-f909-41ef-8ca6-0894aed949cd-kube-api-access-q6nwg\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.951826 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.951958 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-inventory\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.951987 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ceph\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.952018 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6nwg\" (UniqueName: \"kubernetes.io/projected/024091ee-f909-41ef-8ca6-0894aed949cd-kube-api-access-q6nwg\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.960803 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.962261 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ceph\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.968277 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-inventory\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:24 crc kubenswrapper[4971]: I0320 09:17:24.993569 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nwg\" (UniqueName: \"kubernetes.io/projected/024091ee-f909-41ef-8ca6-0894aed949cd-kube-api-access-q6nwg\") pod \"install-os-openstack-openstack-cell1-8z2kz\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:25 crc kubenswrapper[4971]: I0320 09:17:25.107066 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8z2kz"
Mar 20 09:17:25 crc kubenswrapper[4971]: I0320 09:17:25.761146 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8z2kz"]
Mar 20 09:17:26 crc kubenswrapper[4971]: I0320 09:17:26.627388 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8z2kz" event={"ID":"024091ee-f909-41ef-8ca6-0894aed949cd","Type":"ContainerStarted","Data":"2cabfded9b5b5f3d62a1e5aa0bbe78466d112645d32b641aed78483bdf5f2381"}
Mar 20 09:17:27 crc kubenswrapper[4971]: I0320 09:17:27.639171 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8z2kz" event={"ID":"024091ee-f909-41ef-8ca6-0894aed949cd","Type":"ContainerStarted","Data":"3412f83177581723412f31e171647ae0df093f7b45356ea99b76f6945ff4c38b"}
Mar 20 09:17:27 crc kubenswrapper[4971]: I0320 09:17:27.661288 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-8z2kz" podStartSLOduration=2.6261972460000003 podStartE2EDuration="3.661267854s" podCreationTimestamp="2026-03-20 09:17:24 +0000 UTC" firstStartedPulling="2026-03-20 09:17:25.756087659 +0000 UTC m=+8867.735961797" lastFinishedPulling="2026-03-20 09:17:26.791158267 +0000 UTC m=+8868.771032405" observedRunningTime="2026-03-20 09:17:27.657564276 +0000 UTC m=+8869.637438424" watchObservedRunningTime="2026-03-20 09:17:27.661267854 +0000 UTC m=+8869.641141992"
Mar 20 09:17:33 crc kubenswrapper[4971]: I0320 09:17:33.687972 4971 generic.go:334] "Generic (PLEG): container finished" podID="e8634c18-a188-491c-96ee-8c23c7ec8a67" containerID="9f0826a1d170b847b81b74cbc39c20b29c531e44e1035d2e9158ab9a5390d2e2" exitCode=0
Mar 20 09:17:33 crc kubenswrapper[4971]: I0320 09:17:33.688068 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-n7gwd" event={"ID":"e8634c18-a188-491c-96ee-8c23c7ec8a67","Type":"ContainerDied","Data":"9f0826a1d170b847b81b74cbc39c20b29c531e44e1035d2e9158ab9a5390d2e2"}
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.196669 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-n7gwd"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.266775 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65zb4\" (UniqueName: \"kubernetes.io/projected/e8634c18-a188-491c-96ee-8c23c7ec8a67-kube-api-access-65zb4\") pod \"e8634c18-a188-491c-96ee-8c23c7ec8a67\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") "
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.266924 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-inventory\") pod \"e8634c18-a188-491c-96ee-8c23c7ec8a67\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") "
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.267013 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-ssh-key-openstack-networker\") pod \"e8634c18-a188-491c-96ee-8c23c7ec8a67\" (UID: \"e8634c18-a188-491c-96ee-8c23c7ec8a67\") "
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.272935 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8634c18-a188-491c-96ee-8c23c7ec8a67-kube-api-access-65zb4" (OuterVolumeSpecName: "kube-api-access-65zb4") pod "e8634c18-a188-491c-96ee-8c23c7ec8a67" (UID: "e8634c18-a188-491c-96ee-8c23c7ec8a67"). InnerVolumeSpecName "kube-api-access-65zb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.296690 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-inventory" (OuterVolumeSpecName: "inventory") pod "e8634c18-a188-491c-96ee-8c23c7ec8a67" (UID: "e8634c18-a188-491c-96ee-8c23c7ec8a67"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.302709 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "e8634c18-a188-491c-96ee-8c23c7ec8a67" (UID: "e8634c18-a188-491c-96ee-8c23c7ec8a67"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.370056 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65zb4\" (UniqueName: \"kubernetes.io/projected/e8634c18-a188-491c-96ee-8c23c7ec8a67-kube-api-access-65zb4\") on node \"crc\" DevicePath \"\""
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.370092 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.370107 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8634c18-a188-491c-96ee-8c23c7ec8a67-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.708574 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-n7gwd" event={"ID":"e8634c18-a188-491c-96ee-8c23c7ec8a67","Type":"ContainerDied","Data":"810538d32fcc2890eaaccd7a8f79a3a637a1ab8dc665be5ab6cc3263c5179948"}
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.708627 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="810538d32fcc2890eaaccd7a8f79a3a637a1ab8dc665be5ab6cc3263c5179948"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.708657 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-n7gwd"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.814168 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-networker-9vwnm"]
Mar 20 09:17:35 crc kubenswrapper[4971]: E0320 09:17:35.814735 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8634c18-a188-491c-96ee-8c23c7ec8a67" containerName="install-os-openstack-openstack-networker"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.814757 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8634c18-a188-491c-96ee-8c23c7ec8a67" containerName="install-os-openstack-openstack-networker"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.814993 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8634c18-a188-491c-96ee-8c23c7ec8a67" containerName="install-os-openstack-openstack-networker"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.816030 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.818890 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.819423 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.840047 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-9vwnm"]
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.879591 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.879810 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhbmm\" (UniqueName: \"kubernetes.io/projected/10e60012-e649-4b87-bf60-98f44a54ef4c-kube-api-access-nhbmm\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.880019 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-inventory\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.982078 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhbmm\" (UniqueName: \"kubernetes.io/projected/10e60012-e649-4b87-bf60-98f44a54ef4c-kube-api-access-nhbmm\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.982186 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-inventory\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.982327 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.985582 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-inventory\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:35 crc kubenswrapper[4971]: I0320 09:17:35.985792 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:36 crc kubenswrapper[4971]: I0320 09:17:36.001379 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhbmm\" (UniqueName: \"kubernetes.io/projected/10e60012-e649-4b87-bf60-98f44a54ef4c-kube-api-access-nhbmm\") pod \"configure-os-openstack-openstack-networker-9vwnm\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:36 crc kubenswrapper[4971]: I0320 09:17:36.170747 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-9vwnm"
Mar 20 09:17:36 crc kubenswrapper[4971]: I0320 09:17:36.745747 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-9vwnm"]
Mar 20 09:17:37 crc kubenswrapper[4971]: I0320 09:17:37.727759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-9vwnm" event={"ID":"10e60012-e649-4b87-bf60-98f44a54ef4c","Type":"ContainerStarted","Data":"2be1cf7b741eb11b13c00bb6422997a2ad08fdf8196de4865a842da68d534570"}
Mar 20 09:17:37 crc kubenswrapper[4971]: I0320 09:17:37.728122 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-9vwnm" event={"ID":"10e60012-e649-4b87-bf60-98f44a54ef4c","Type":"ContainerStarted","Data":"4680ce1e2d20d089f1773cac09043871a64a597e5d6a7e333739fb393e3d0650"}
Mar 20 09:17:50 crc kubenswrapper[4971]: I0320 09:17:50.162783 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:17:50 crc kubenswrapper[4971]: I0320 09:17:50.163374 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.141286 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-networker-9vwnm" podStartSLOduration=24.647748138 podStartE2EDuration="25.141257462s" podCreationTimestamp="2026-03-20 09:17:35 +0000 UTC" firstStartedPulling="2026-03-20 09:17:36.750783737 +0000 UTC m=+8878.730657885" lastFinishedPulling="2026-03-20 09:17:37.244293071 +0000 UTC m=+8879.224167209" observedRunningTime="2026-03-20 09:17:37.755919693 +0000 UTC m=+8879.735793831" watchObservedRunningTime="2026-03-20 09:18:00.141257462 +0000 UTC m=+8902.121131640"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.142096 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566638-l97wp"]
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.143719 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-l97wp"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.146514 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.146681 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.146818 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.154088 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566638-l97wp"]
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.249399 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dps\" (UniqueName: \"kubernetes.io/projected/ed292ff4-8486-4848-b9f2-0b868efba1aa-kube-api-access-g6dps\") pod \"auto-csr-approver-29566638-l97wp\" (UID: \"ed292ff4-8486-4848-b9f2-0b868efba1aa\") " pod="openshift-infra/auto-csr-approver-29566638-l97wp"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.351040 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dps\" (UniqueName: \"kubernetes.io/projected/ed292ff4-8486-4848-b9f2-0b868efba1aa-kube-api-access-g6dps\") pod \"auto-csr-approver-29566638-l97wp\" (UID: \"ed292ff4-8486-4848-b9f2-0b868efba1aa\") " pod="openshift-infra/auto-csr-approver-29566638-l97wp"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.373627 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dps\" (UniqueName: \"kubernetes.io/projected/ed292ff4-8486-4848-b9f2-0b868efba1aa-kube-api-access-g6dps\") pod \"auto-csr-approver-29566638-l97wp\" (UID: \"ed292ff4-8486-4848-b9f2-0b868efba1aa\") " pod="openshift-infra/auto-csr-approver-29566638-l97wp"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.466927 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-l97wp"
Mar 20 09:18:00 crc kubenswrapper[4971]: I0320 09:18:00.945316 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566638-l97wp"]
Mar 20 09:18:01 crc kubenswrapper[4971]: I0320 09:18:01.096800 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566638-l97wp" event={"ID":"ed292ff4-8486-4848-b9f2-0b868efba1aa","Type":"ContainerStarted","Data":"e1ec1aa1cf7f821c639b7afa543f9f997e68fa6f5f9f45c221d0180e60e32534"}
Mar 20 09:18:02 crc kubenswrapper[4971]: I0320 09:18:02.105679 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566638-l97wp" event={"ID":"ed292ff4-8486-4848-b9f2-0b868efba1aa","Type":"ContainerStarted","Data":"fbd61022ca77f0079c6dbe2ec1205c8969b2da0c37553c8c697178afb543789f"}
Mar 20 09:18:02 crc kubenswrapper[4971]: I0320 09:18:02.125034 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566638-l97wp" podStartSLOduration=1.3591123010000001 podStartE2EDuration="2.125018825s" podCreationTimestamp="2026-03-20 09:18:00 +0000 UTC" firstStartedPulling="2026-03-20 09:18:00.971919097 +0000 UTC m=+8902.951793245" lastFinishedPulling="2026-03-20 09:18:01.737825631 +0000 UTC m=+8903.717699769" observedRunningTime="2026-03-20 09:18:02.122270942 +0000 UTC m=+8904.102145100" watchObservedRunningTime="2026-03-20 09:18:02.125018825 +0000 UTC m=+8904.104892963"
Mar 20 09:18:03 crc kubenswrapper[4971]: I0320 09:18:03.116213 4971 generic.go:334] "Generic (PLEG): container finished" podID="ed292ff4-8486-4848-b9f2-0b868efba1aa" containerID="fbd61022ca77f0079c6dbe2ec1205c8969b2da0c37553c8c697178afb543789f" exitCode=0
Mar 20 09:18:03 crc kubenswrapper[4971]: I0320 09:18:03.116294 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566638-l97wp" event={"ID":"ed292ff4-8486-4848-b9f2-0b868efba1aa","Type":"ContainerDied","Data":"fbd61022ca77f0079c6dbe2ec1205c8969b2da0c37553c8c697178afb543789f"}
Mar 20 09:18:04 crc kubenswrapper[4971]: I0320 09:18:04.632486 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-l97wp"
Mar 20 09:18:04 crc kubenswrapper[4971]: I0320 09:18:04.651558 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6dps\" (UniqueName: \"kubernetes.io/projected/ed292ff4-8486-4848-b9f2-0b868efba1aa-kube-api-access-g6dps\") pod \"ed292ff4-8486-4848-b9f2-0b868efba1aa\" (UID: \"ed292ff4-8486-4848-b9f2-0b868efba1aa\") "
Mar 20 09:18:04 crc kubenswrapper[4971]: I0320 09:18:04.659070 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed292ff4-8486-4848-b9f2-0b868efba1aa-kube-api-access-g6dps" (OuterVolumeSpecName: "kube-api-access-g6dps") pod "ed292ff4-8486-4848-b9f2-0b868efba1aa" (UID: "ed292ff4-8486-4848-b9f2-0b868efba1aa"). InnerVolumeSpecName "kube-api-access-g6dps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:18:04 crc kubenswrapper[4971]: I0320 09:18:04.754482 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6dps\" (UniqueName: \"kubernetes.io/projected/ed292ff4-8486-4848-b9f2-0b868efba1aa-kube-api-access-g6dps\") on node \"crc\" DevicePath \"\""
Mar 20 09:18:05 crc kubenswrapper[4971]: I0320 09:18:05.149720 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566638-l97wp" event={"ID":"ed292ff4-8486-4848-b9f2-0b868efba1aa","Type":"ContainerDied","Data":"e1ec1aa1cf7f821c639b7afa543f9f997e68fa6f5f9f45c221d0180e60e32534"}
Mar 20 09:18:05 crc kubenswrapper[4971]: I0320 09:18:05.149766 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ec1aa1cf7f821c639b7afa543f9f997e68fa6f5f9f45c221d0180e60e32534"
Mar 20 09:18:05 crc kubenswrapper[4971]: I0320 09:18:05.149830 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-l97wp"
Mar 20 09:18:05 crc kubenswrapper[4971]: I0320 09:18:05.208324 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-z9665"]
Mar 20 09:18:05 crc kubenswrapper[4971]: I0320 09:18:05.220457 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-z9665"]
Mar 20 09:18:06 crc kubenswrapper[4971]: I0320 09:18:06.752884 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb2d23a-e02e-44aa-957e-774d1432e1db" path="/var/lib/kubelet/pods/dfb2d23a-e02e-44aa-957e-774d1432e1db/volumes"
Mar 20 09:18:13 crc kubenswrapper[4971]: I0320 09:18:13.875262 4971 scope.go:117] "RemoveContainer" containerID="6377d4e05eaeed951e21ff272f86446a4323fc7819cb90830444cab192f72a94"
Mar 20 09:18:18 crc kubenswrapper[4971]: I0320 09:18:18.824017 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bqnb5"]
Mar 20 09:18:18 crc kubenswrapper[4971]: E0320 09:18:18.829196 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed292ff4-8486-4848-b9f2-0b868efba1aa" containerName="oc"
Mar 20 09:18:18 crc kubenswrapper[4971]: I0320 09:18:18.829242 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed292ff4-8486-4848-b9f2-0b868efba1aa" containerName="oc"
Mar 20 09:18:18 crc kubenswrapper[4971]: I0320 09:18:18.829522 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed292ff4-8486-4848-b9f2-0b868efba1aa" containerName="oc"
Mar 20 09:18:18 crc kubenswrapper[4971]: I0320 09:18:18.831341 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:18 crc kubenswrapper[4971]: I0320 09:18:18.858267 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bqnb5"]
Mar 20 09:18:18 crc kubenswrapper[4971]: I0320 09:18:18.952750 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-catalog-content\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:18 crc kubenswrapper[4971]: I0320 09:18:18.952807 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7j2v\" (UniqueName: \"kubernetes.io/projected/6df92fbb-0ab7-43e7-9661-a2fc17c01759-kube-api-access-l7j2v\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:18 crc kubenswrapper[4971]: I0320 09:18:18.952933 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-utilities\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.054927 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-catalog-content\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.054985 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7j2v\" (UniqueName: \"kubernetes.io/projected/6df92fbb-0ab7-43e7-9661-a2fc17c01759-kube-api-access-l7j2v\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.055053 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-utilities\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.055506 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-catalog-content\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.055551 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-utilities\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.074988 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7j2v\" (UniqueName: \"kubernetes.io/projected/6df92fbb-0ab7-43e7-9661-a2fc17c01759-kube-api-access-l7j2v\") pod \"redhat-operators-bqnb5\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.156964 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bqnb5"
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.323364 4971 generic.go:334] "Generic (PLEG): container finished" podID="024091ee-f909-41ef-8ca6-0894aed949cd" containerID="3412f83177581723412f31e171647ae0df093f7b45356ea99b76f6945ff4c38b" exitCode=0
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.323657 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8z2kz" event={"ID":"024091ee-f909-41ef-8ca6-0894aed949cd","Type":"ContainerDied","Data":"3412f83177581723412f31e171647ae0df093f7b45356ea99b76f6945ff4c38b"}
Mar 20 09:18:19 crc kubenswrapper[4971]: I0320 09:18:19.689875 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bqnb5"]
Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.162280 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.162349 4971 prober.go:107] "Probe failed"
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.162398 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.163218 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"733a160bea3a7b4da07aecd1f08779761a18f5919207a3afdf06511a9685618b"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.163282 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://733a160bea3a7b4da07aecd1f08779761a18f5919207a3afdf06511a9685618b" gracePeriod=600 Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.369514 4971 generic.go:334] "Generic (PLEG): container finished" podID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerID="2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8" exitCode=0 Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.369881 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqnb5" event={"ID":"6df92fbb-0ab7-43e7-9661-a2fc17c01759","Type":"ContainerDied","Data":"2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8"} Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.369910 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqnb5" event={"ID":"6df92fbb-0ab7-43e7-9661-a2fc17c01759","Type":"ContainerStarted","Data":"953f86ee1119ea22c4b20c396b29b1d33bc70812c01d96e3a733026f3036476a"} Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.375277 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="733a160bea3a7b4da07aecd1f08779761a18f5919207a3afdf06511a9685618b" exitCode=0 Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.375379 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"733a160bea3a7b4da07aecd1f08779761a18f5919207a3afdf06511a9685618b"} Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.375437 4971 scope.go:117] "RemoveContainer" containerID="da88d257cb097a1180faf648e8cb9e53e92d6d11e4160393bcb42462b61cacc0" Mar 20 09:18:20 crc kubenswrapper[4971]: I0320 09:18:20.874151 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8z2kz" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.009757 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ceph\") pod \"024091ee-f909-41ef-8ca6-0894aed949cd\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.010061 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-inventory\") pod \"024091ee-f909-41ef-8ca6-0894aed949cd\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.010232 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6nwg\" (UniqueName: \"kubernetes.io/projected/024091ee-f909-41ef-8ca6-0894aed949cd-kube-api-access-q6nwg\") pod \"024091ee-f909-41ef-8ca6-0894aed949cd\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.010256 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ssh-key-openstack-cell1\") pod \"024091ee-f909-41ef-8ca6-0894aed949cd\" (UID: \"024091ee-f909-41ef-8ca6-0894aed949cd\") " Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.019025 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024091ee-f909-41ef-8ca6-0894aed949cd-kube-api-access-q6nwg" (OuterVolumeSpecName: "kube-api-access-q6nwg") pod "024091ee-f909-41ef-8ca6-0894aed949cd" (UID: "024091ee-f909-41ef-8ca6-0894aed949cd"). InnerVolumeSpecName "kube-api-access-q6nwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.026904 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ceph" (OuterVolumeSpecName: "ceph") pod "024091ee-f909-41ef-8ca6-0894aed949cd" (UID: "024091ee-f909-41ef-8ca6-0894aed949cd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.043767 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "024091ee-f909-41ef-8ca6-0894aed949cd" (UID: "024091ee-f909-41ef-8ca6-0894aed949cd"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.046682 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-inventory" (OuterVolumeSpecName: "inventory") pod "024091ee-f909-41ef-8ca6-0894aed949cd" (UID: "024091ee-f909-41ef-8ca6-0894aed949cd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.111656 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.111684 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.111696 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6nwg\" (UniqueName: \"kubernetes.io/projected/024091ee-f909-41ef-8ca6-0894aed949cd-kube-api-access-q6nwg\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.111705 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/024091ee-f909-41ef-8ca6-0894aed949cd-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.389593 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff"} Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.391623 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8z2kz" event={"ID":"024091ee-f909-41ef-8ca6-0894aed949cd","Type":"ContainerDied","Data":"2cabfded9b5b5f3d62a1e5aa0bbe78466d112645d32b641aed78483bdf5f2381"} Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.391676 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cabfded9b5b5f3d62a1e5aa0bbe78466d112645d32b641aed78483bdf5f2381" 
Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.391740 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8z2kz" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.399215 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqnb5" event={"ID":"6df92fbb-0ab7-43e7-9661-a2fc17c01759","Type":"ContainerStarted","Data":"e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31"} Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.460235 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-qhvnn"] Mar 20 09:18:21 crc kubenswrapper[4971]: E0320 09:18:21.460926 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024091ee-f909-41ef-8ca6-0894aed949cd" containerName="install-os-openstack-openstack-cell1" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.460949 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="024091ee-f909-41ef-8ca6-0894aed949cd" containerName="install-os-openstack-openstack-cell1" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.461241 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="024091ee-f909-41ef-8ca6-0894aed949cd" containerName="install-os-openstack-openstack-cell1" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.462218 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.464983 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.465181 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.475227 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-qhvnn"] Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.622936 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.622978 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-inventory\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.622999 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ceph\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.623209 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ztkp\" (UniqueName: \"kubernetes.io/projected/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-kube-api-access-6ztkp\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.725789 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ceph\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.725850 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.725867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-inventory\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.725924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ztkp\" (UniqueName: \"kubernetes.io/projected/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-kube-api-access-6ztkp\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") 
" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.740593 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-inventory\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.740808 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ceph\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.740599 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.744066 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ztkp\" (UniqueName: \"kubernetes.io/projected/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-kube-api-access-6ztkp\") pod \"configure-os-openstack-openstack-cell1-qhvnn\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:21 crc kubenswrapper[4971]: I0320 09:18:21.783119 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:18:22 crc kubenswrapper[4971]: I0320 09:18:22.318414 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-qhvnn"] Mar 20 09:18:22 crc kubenswrapper[4971]: I0320 09:18:22.409585 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" event={"ID":"5e4faacf-b8fc-428e-8e94-cd873f7bbea8","Type":"ContainerStarted","Data":"ac2d9e68eab7088f46d2022ee7a98d8fd16ebe70c2de820849b27646f12ab96b"} Mar 20 09:18:23 crc kubenswrapper[4971]: I0320 09:18:23.420506 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" event={"ID":"5e4faacf-b8fc-428e-8e94-cd873f7bbea8","Type":"ContainerStarted","Data":"f4825e29684e709708b395a0108091cca38544c77860ca3ec856cb09fea4c917"} Mar 20 09:18:23 crc kubenswrapper[4971]: I0320 09:18:23.438070 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" podStartSLOduration=1.904817669 podStartE2EDuration="2.438049443s" podCreationTimestamp="2026-03-20 09:18:21 +0000 UTC" firstStartedPulling="2026-03-20 09:18:22.335050249 +0000 UTC m=+8924.314924387" lastFinishedPulling="2026-03-20 09:18:22.868282013 +0000 UTC m=+8924.848156161" observedRunningTime="2026-03-20 09:18:23.433913023 +0000 UTC m=+8925.413787161" watchObservedRunningTime="2026-03-20 09:18:23.438049443 +0000 UTC m=+8925.417923581" Mar 20 09:18:27 crc kubenswrapper[4971]: I0320 09:18:27.461696 4971 generic.go:334] "Generic (PLEG): container finished" podID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerID="e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31" exitCode=0 Mar 20 09:18:27 crc kubenswrapper[4971]: I0320 09:18:27.461810 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqnb5" 
event={"ID":"6df92fbb-0ab7-43e7-9661-a2fc17c01759","Type":"ContainerDied","Data":"e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31"} Mar 20 09:18:28 crc kubenswrapper[4971]: I0320 09:18:28.474957 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqnb5" event={"ID":"6df92fbb-0ab7-43e7-9661-a2fc17c01759","Type":"ContainerStarted","Data":"4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902"} Mar 20 09:18:28 crc kubenswrapper[4971]: I0320 09:18:28.504617 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bqnb5" podStartSLOduration=2.670718761 podStartE2EDuration="10.504579674s" podCreationTimestamp="2026-03-20 09:18:18 +0000 UTC" firstStartedPulling="2026-03-20 09:18:20.373202747 +0000 UTC m=+8922.353076885" lastFinishedPulling="2026-03-20 09:18:28.20706365 +0000 UTC m=+8930.186937798" observedRunningTime="2026-03-20 09:18:28.492689 +0000 UTC m=+8930.472563158" watchObservedRunningTime="2026-03-20 09:18:28.504579674 +0000 UTC m=+8930.484453812" Mar 20 09:18:29 crc kubenswrapper[4971]: I0320 09:18:29.158078 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bqnb5" Mar 20 09:18:29 crc kubenswrapper[4971]: I0320 09:18:29.158361 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bqnb5" Mar 20 09:18:30 crc kubenswrapper[4971]: I0320 09:18:30.216760 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bqnb5" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="registry-server" probeResult="failure" output=< Mar 20 09:18:30 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 09:18:30 crc kubenswrapper[4971]: > Mar 20 09:18:32 crc kubenswrapper[4971]: I0320 09:18:32.510515 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="10e60012-e649-4b87-bf60-98f44a54ef4c" containerID="2be1cf7b741eb11b13c00bb6422997a2ad08fdf8196de4865a842da68d534570" exitCode=0 Mar 20 09:18:32 crc kubenswrapper[4971]: I0320 09:18:32.510650 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-9vwnm" event={"ID":"10e60012-e649-4b87-bf60-98f44a54ef4c","Type":"ContainerDied","Data":"2be1cf7b741eb11b13c00bb6422997a2ad08fdf8196de4865a842da68d534570"} Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.013257 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-9vwnm" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.086058 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhbmm\" (UniqueName: \"kubernetes.io/projected/10e60012-e649-4b87-bf60-98f44a54ef4c-kube-api-access-nhbmm\") pod \"10e60012-e649-4b87-bf60-98f44a54ef4c\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.086325 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-inventory\") pod \"10e60012-e649-4b87-bf60-98f44a54ef4c\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.086392 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-ssh-key-openstack-networker\") pod \"10e60012-e649-4b87-bf60-98f44a54ef4c\" (UID: \"10e60012-e649-4b87-bf60-98f44a54ef4c\") " Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.092633 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e60012-e649-4b87-bf60-98f44a54ef4c-kube-api-access-nhbmm" 
(OuterVolumeSpecName: "kube-api-access-nhbmm") pod "10e60012-e649-4b87-bf60-98f44a54ef4c" (UID: "10e60012-e649-4b87-bf60-98f44a54ef4c"). InnerVolumeSpecName "kube-api-access-nhbmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.118952 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "10e60012-e649-4b87-bf60-98f44a54ef4c" (UID: "10e60012-e649-4b87-bf60-98f44a54ef4c"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.119006 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-inventory" (OuterVolumeSpecName: "inventory") pod "10e60012-e649-4b87-bf60-98f44a54ef4c" (UID: "10e60012-e649-4b87-bf60-98f44a54ef4c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.188263 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.188309 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10e60012-e649-4b87-bf60-98f44a54ef4c-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.188323 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhbmm\" (UniqueName: \"kubernetes.io/projected/10e60012-e649-4b87-bf60-98f44a54ef4c-kube-api-access-nhbmm\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.532593 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-9vwnm" event={"ID":"10e60012-e649-4b87-bf60-98f44a54ef4c","Type":"ContainerDied","Data":"4680ce1e2d20d089f1773cac09043871a64a597e5d6a7e333739fb393e3d0650"} Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.532958 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4680ce1e2d20d089f1773cac09043871a64a597e5d6a7e333739fb393e3d0650" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.532667 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-9vwnm" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.606097 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-networker-ljn9j"] Mar 20 09:18:34 crc kubenswrapper[4971]: E0320 09:18:34.606539 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e60012-e649-4b87-bf60-98f44a54ef4c" containerName="configure-os-openstack-openstack-networker" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.606558 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e60012-e649-4b87-bf60-98f44a54ef4c" containerName="configure-os-openstack-openstack-networker" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.606805 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e60012-e649-4b87-bf60-98f44a54ef4c" containerName="configure-os-openstack-openstack-networker" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.607622 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.612057 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.614273 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.634312 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-ljn9j"] Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.700106 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.700163 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvcqm\" (UniqueName: \"kubernetes.io/projected/d1385615-c486-46fe-889f-950bbb3699b7-kube-api-access-nvcqm\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.700394 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-inventory\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: 
I0320 09:18:34.804060 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.804138 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvcqm\" (UniqueName: \"kubernetes.io/projected/d1385615-c486-46fe-889f-950bbb3699b7-kube-api-access-nvcqm\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.804252 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-inventory\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.816824 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-inventory\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.816852 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: 
\"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.826548 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvcqm\" (UniqueName: \"kubernetes.io/projected/d1385615-c486-46fe-889f-950bbb3699b7-kube-api-access-nvcqm\") pod \"run-os-openstack-openstack-networker-ljn9j\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:34 crc kubenswrapper[4971]: I0320 09:18:34.927263 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:35 crc kubenswrapper[4971]: I0320 09:18:35.523770 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-ljn9j"] Mar 20 09:18:35 crc kubenswrapper[4971]: W0320 09:18:35.527086 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1385615_c486_46fe_889f_950bbb3699b7.slice/crio-3d4dd338c267f8ddc14ab4fd97a4d41a64c2bc104d31e33d8c6dc512132875ea WatchSource:0}: Error finding container 3d4dd338c267f8ddc14ab4fd97a4d41a64c2bc104d31e33d8c6dc512132875ea: Status 404 returned error can't find the container with id 3d4dd338c267f8ddc14ab4fd97a4d41a64c2bc104d31e33d8c6dc512132875ea Mar 20 09:18:35 crc kubenswrapper[4971]: I0320 09:18:35.543463 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-ljn9j" event={"ID":"d1385615-c486-46fe-889f-950bbb3699b7","Type":"ContainerStarted","Data":"3d4dd338c267f8ddc14ab4fd97a4d41a64c2bc104d31e33d8c6dc512132875ea"} Mar 20 09:18:36 crc kubenswrapper[4971]: I0320 09:18:36.556690 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-ljn9j" 
event={"ID":"d1385615-c486-46fe-889f-950bbb3699b7","Type":"ContainerStarted","Data":"9e29f88273c8fe699beaebc7051f1e4afd0dc77c9d21eef473815aa6ade582e7"} Mar 20 09:18:36 crc kubenswrapper[4971]: I0320 09:18:36.577928 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-networker-ljn9j" podStartSLOduration=2.147170002 podStartE2EDuration="2.577910887s" podCreationTimestamp="2026-03-20 09:18:34 +0000 UTC" firstStartedPulling="2026-03-20 09:18:35.532881316 +0000 UTC m=+8937.512755454" lastFinishedPulling="2026-03-20 09:18:35.963622201 +0000 UTC m=+8937.943496339" observedRunningTime="2026-03-20 09:18:36.577453475 +0000 UTC m=+8938.557327613" watchObservedRunningTime="2026-03-20 09:18:36.577910887 +0000 UTC m=+8938.557785025" Mar 20 09:18:40 crc kubenswrapper[4971]: I0320 09:18:40.208933 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bqnb5" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="registry-server" probeResult="failure" output=< Mar 20 09:18:40 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 09:18:40 crc kubenswrapper[4971]: > Mar 20 09:18:46 crc kubenswrapper[4971]: I0320 09:18:46.647883 4971 generic.go:334] "Generic (PLEG): container finished" podID="d1385615-c486-46fe-889f-950bbb3699b7" containerID="9e29f88273c8fe699beaebc7051f1e4afd0dc77c9d21eef473815aa6ade582e7" exitCode=0 Mar 20 09:18:46 crc kubenswrapper[4971]: I0320 09:18:46.647958 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-ljn9j" event={"ID":"d1385615-c486-46fe-889f-950bbb3699b7","Type":"ContainerDied","Data":"9e29f88273c8fe699beaebc7051f1e4afd0dc77c9d21eef473815aa6ade582e7"} Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.062433 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.200931 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvcqm\" (UniqueName: \"kubernetes.io/projected/d1385615-c486-46fe-889f-950bbb3699b7-kube-api-access-nvcqm\") pod \"d1385615-c486-46fe-889f-950bbb3699b7\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.201068 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-inventory\") pod \"d1385615-c486-46fe-889f-950bbb3699b7\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.201257 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-ssh-key-openstack-networker\") pod \"d1385615-c486-46fe-889f-950bbb3699b7\" (UID: \"d1385615-c486-46fe-889f-950bbb3699b7\") " Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.206599 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1385615-c486-46fe-889f-950bbb3699b7-kube-api-access-nvcqm" (OuterVolumeSpecName: "kube-api-access-nvcqm") pod "d1385615-c486-46fe-889f-950bbb3699b7" (UID: "d1385615-c486-46fe-889f-950bbb3699b7"). InnerVolumeSpecName "kube-api-access-nvcqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.226580 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-inventory" (OuterVolumeSpecName: "inventory") pod "d1385615-c486-46fe-889f-950bbb3699b7" (UID: "d1385615-c486-46fe-889f-950bbb3699b7"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.239763 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "d1385615-c486-46fe-889f-950bbb3699b7" (UID: "d1385615-c486-46fe-889f-950bbb3699b7"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.304071 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.304110 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvcqm\" (UniqueName: \"kubernetes.io/projected/d1385615-c486-46fe-889f-950bbb3699b7-kube-api-access-nvcqm\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.304142 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1385615-c486-46fe-889f-950bbb3699b7-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.666194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-ljn9j" event={"ID":"d1385615-c486-46fe-889f-950bbb3699b7","Type":"ContainerDied","Data":"3d4dd338c267f8ddc14ab4fd97a4d41a64c2bc104d31e33d8c6dc512132875ea"} Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.666462 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d4dd338c267f8ddc14ab4fd97a4d41a64c2bc104d31e33d8c6dc512132875ea" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.666320 
4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-ljn9j" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.765979 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-qkk4w"] Mar 20 09:18:48 crc kubenswrapper[4971]: E0320 09:18:48.766462 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1385615-c486-46fe-889f-950bbb3699b7" containerName="run-os-openstack-openstack-networker" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.766480 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1385615-c486-46fe-889f-950bbb3699b7" containerName="run-os-openstack-openstack-networker" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.766724 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1385615-c486-46fe-889f-950bbb3699b7" containerName="run-os-openstack-openstack-networker" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.767422 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.771820 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.771916 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.781162 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-qkk4w"] Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.916493 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skbc4\" (UniqueName: \"kubernetes.io/projected/193efd84-721d-48f2-95fd-714abfe9bd94-kube-api-access-skbc4\") pod \"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.916699 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:48 crc kubenswrapper[4971]: I0320 09:18:48.916953 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-inventory\") pod \"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:49 crc 
kubenswrapper[4971]: I0320 09:18:49.019091 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skbc4\" (UniqueName: \"kubernetes.io/projected/193efd84-721d-48f2-95fd-714abfe9bd94-kube-api-access-skbc4\") pod \"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:49 crc kubenswrapper[4971]: I0320 09:18:49.019169 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:49 crc kubenswrapper[4971]: I0320 09:18:49.019262 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-inventory\") pod \"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:49 crc kubenswrapper[4971]: I0320 09:18:49.025954 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:49 crc kubenswrapper[4971]: I0320 09:18:49.026162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-inventory\") pod 
\"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:49 crc kubenswrapper[4971]: I0320 09:18:49.035227 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skbc4\" (UniqueName: \"kubernetes.io/projected/193efd84-721d-48f2-95fd-714abfe9bd94-kube-api-access-skbc4\") pod \"reboot-os-openstack-openstack-networker-qkk4w\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:49 crc kubenswrapper[4971]: I0320 09:18:49.083724 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:18:49 crc kubenswrapper[4971]: I0320 09:18:49.632829 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-qkk4w"] Mar 20 09:18:49 crc kubenswrapper[4971]: I0320 09:18:49.676443 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" event={"ID":"193efd84-721d-48f2-95fd-714abfe9bd94","Type":"ContainerStarted","Data":"33200fcfdfb4659e7158a6ab0f7e1f0d8da7964c311bf860884205589dabfea2"} Mar 20 09:18:50 crc kubenswrapper[4971]: I0320 09:18:50.247971 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bqnb5" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="registry-server" probeResult="failure" output=< Mar 20 09:18:50 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 09:18:50 crc kubenswrapper[4971]: > Mar 20 09:18:50 crc kubenswrapper[4971]: I0320 09:18:50.688537 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" 
event={"ID":"193efd84-721d-48f2-95fd-714abfe9bd94","Type":"ContainerStarted","Data":"f8097e8b85e4f041bd1a0957ea07ed69dd77ba2e614fda65f85e86783fcbf01a"} Mar 20 09:18:50 crc kubenswrapper[4971]: I0320 09:18:50.715271 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" podStartSLOduration=2.249803045 podStartE2EDuration="2.715245567s" podCreationTimestamp="2026-03-20 09:18:48 +0000 UTC" firstStartedPulling="2026-03-20 09:18:49.643668614 +0000 UTC m=+8951.623542752" lastFinishedPulling="2026-03-20 09:18:50.109111136 +0000 UTC m=+8952.088985274" observedRunningTime="2026-03-20 09:18:50.704455342 +0000 UTC m=+8952.684329480" watchObservedRunningTime="2026-03-20 09:18:50.715245567 +0000 UTC m=+8952.695119745" Mar 20 09:18:59 crc kubenswrapper[4971]: I0320 09:18:59.204528 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bqnb5" Mar 20 09:18:59 crc kubenswrapper[4971]: I0320 09:18:59.261536 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bqnb5" Mar 20 09:18:59 crc kubenswrapper[4971]: I0320 09:18:59.456919 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bqnb5"] Mar 20 09:19:00 crc kubenswrapper[4971]: I0320 09:19:00.781247 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bqnb5" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="registry-server" containerID="cri-o://4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902" gracePeriod=2 Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.255769 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bqnb5" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.367978 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7j2v\" (UniqueName: \"kubernetes.io/projected/6df92fbb-0ab7-43e7-9661-a2fc17c01759-kube-api-access-l7j2v\") pod \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.368120 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-catalog-content\") pod \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.368199 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-utilities\") pod \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\" (UID: \"6df92fbb-0ab7-43e7-9661-a2fc17c01759\") " Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.368904 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-utilities" (OuterVolumeSpecName: "utilities") pod "6df92fbb-0ab7-43e7-9661-a2fc17c01759" (UID: "6df92fbb-0ab7-43e7-9661-a2fc17c01759"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.373511 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df92fbb-0ab7-43e7-9661-a2fc17c01759-kube-api-access-l7j2v" (OuterVolumeSpecName: "kube-api-access-l7j2v") pod "6df92fbb-0ab7-43e7-9661-a2fc17c01759" (UID: "6df92fbb-0ab7-43e7-9661-a2fc17c01759"). InnerVolumeSpecName "kube-api-access-l7j2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.470934 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7j2v\" (UniqueName: \"kubernetes.io/projected/6df92fbb-0ab7-43e7-9661-a2fc17c01759-kube-api-access-l7j2v\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.470968 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.512966 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6df92fbb-0ab7-43e7-9661-a2fc17c01759" (UID: "6df92fbb-0ab7-43e7-9661-a2fc17c01759"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.572885 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df92fbb-0ab7-43e7-9661-a2fc17c01759-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.793526 4971 generic.go:334] "Generic (PLEG): container finished" podID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerID="4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902" exitCode=0 Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.793567 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqnb5" event={"ID":"6df92fbb-0ab7-43e7-9661-a2fc17c01759","Type":"ContainerDied","Data":"4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902"} Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.793594 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-bqnb5" event={"ID":"6df92fbb-0ab7-43e7-9661-a2fc17c01759","Type":"ContainerDied","Data":"953f86ee1119ea22c4b20c396b29b1d33bc70812c01d96e3a733026f3036476a"} Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.793628 4971 scope.go:117] "RemoveContainer" containerID="4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.793774 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bqnb5" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.835525 4971 scope.go:117] "RemoveContainer" containerID="e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.843842 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bqnb5"] Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.855470 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bqnb5"] Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.869750 4971 scope.go:117] "RemoveContainer" containerID="2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.898538 4971 scope.go:117] "RemoveContainer" containerID="4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902" Mar 20 09:19:01 crc kubenswrapper[4971]: E0320 09:19:01.898964 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902\": container with ID starting with 4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902 not found: ID does not exist" containerID="4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.899033 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902"} err="failed to get container status \"4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902\": rpc error: code = NotFound desc = could not find container \"4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902\": container with ID starting with 4010535c74311ad5afad79526ab1480a92992749e486d56ac3c2a6d43ee00902 not found: ID does not exist" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.899064 4971 scope.go:117] "RemoveContainer" containerID="e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31" Mar 20 09:19:01 crc kubenswrapper[4971]: E0320 09:19:01.899346 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31\": container with ID starting with e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31 not found: ID does not exist" containerID="e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.899370 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31"} err="failed to get container status \"e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31\": rpc error: code = NotFound desc = could not find container \"e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31\": container with ID starting with e04deb7b8366988f0dd7f32468eefa08706849d0eb2391ec56d6e90c47787f31 not found: ID does not exist" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.899387 4971 scope.go:117] "RemoveContainer" containerID="2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8" Mar 20 09:19:01 crc kubenswrapper[4971]: E0320 
09:19:01.899717 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8\": container with ID starting with 2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8 not found: ID does not exist" containerID="2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8" Mar 20 09:19:01 crc kubenswrapper[4971]: I0320 09:19:01.899770 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8"} err="failed to get container status \"2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8\": rpc error: code = NotFound desc = could not find container \"2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8\": container with ID starting with 2cd2d04def02a75f4cd7a354a2fc14f2882be6b963e964ff0c8b4a6f66b0c7f8 not found: ID does not exist" Mar 20 09:19:02 crc kubenswrapper[4971]: I0320 09:19:02.743531 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" path="/var/lib/kubelet/pods/6df92fbb-0ab7-43e7-9661-a2fc17c01759/volumes" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.123131 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lmlqv"] Mar 20 09:19:08 crc kubenswrapper[4971]: E0320 09:19:08.124127 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="extract-content" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.124143 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="extract-content" Mar 20 09:19:08 crc kubenswrapper[4971]: E0320 09:19:08.124180 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" 
containerName="registry-server" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.124189 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="registry-server" Mar 20 09:19:08 crc kubenswrapper[4971]: E0320 09:19:08.124202 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="extract-utilities" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.124212 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="extract-utilities" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.124471 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df92fbb-0ab7-43e7-9661-a2fc17c01759" containerName="registry-server" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.126336 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.145206 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmlqv"] Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.202560 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-utilities\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.204039 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-catalog-content\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " 
pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.204918 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcksk\" (UniqueName: \"kubernetes.io/projected/09b2d204-b83b-4d81-8fc8-4f9df4513dca-kube-api-access-jcksk\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.307101 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcksk\" (UniqueName: \"kubernetes.io/projected/09b2d204-b83b-4d81-8fc8-4f9df4513dca-kube-api-access-jcksk\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.307191 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-utilities\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.307217 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-catalog-content\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.307721 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-catalog-content\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " 
pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.307859 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-utilities\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.331709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcksk\" (UniqueName: \"kubernetes.io/projected/09b2d204-b83b-4d81-8fc8-4f9df4513dca-kube-api-access-jcksk\") pod \"redhat-marketplace-lmlqv\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.456343 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:08 crc kubenswrapper[4971]: W0320 09:19:08.911570 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b2d204_b83b_4d81_8fc8_4f9df4513dca.slice/crio-e937d60d2827ac0770847a90094d01f9e2eb83236887de3d0f54bf6bbb30e8a2 WatchSource:0}: Error finding container e937d60d2827ac0770847a90094d01f9e2eb83236887de3d0f54bf6bbb30e8a2: Status 404 returned error can't find the container with id e937d60d2827ac0770847a90094d01f9e2eb83236887de3d0f54bf6bbb30e8a2 Mar 20 09:19:08 crc kubenswrapper[4971]: I0320 09:19:08.912748 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmlqv"] Mar 20 09:19:09 crc kubenswrapper[4971]: I0320 09:19:09.868179 4971 generic.go:334] "Generic (PLEG): container finished" podID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerID="4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790" exitCode=0 Mar 20 
09:19:09 crc kubenswrapper[4971]: I0320 09:19:09.868264 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmlqv" event={"ID":"09b2d204-b83b-4d81-8fc8-4f9df4513dca","Type":"ContainerDied","Data":"4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790"} Mar 20 09:19:09 crc kubenswrapper[4971]: I0320 09:19:09.868491 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmlqv" event={"ID":"09b2d204-b83b-4d81-8fc8-4f9df4513dca","Type":"ContainerStarted","Data":"e937d60d2827ac0770847a90094d01f9e2eb83236887de3d0f54bf6bbb30e8a2"} Mar 20 09:19:09 crc kubenswrapper[4971]: I0320 09:19:09.871638 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:19:10 crc kubenswrapper[4971]: I0320 09:19:10.880441 4971 generic.go:334] "Generic (PLEG): container finished" podID="193efd84-721d-48f2-95fd-714abfe9bd94" containerID="f8097e8b85e4f041bd1a0957ea07ed69dd77ba2e614fda65f85e86783fcbf01a" exitCode=0 Mar 20 09:19:10 crc kubenswrapper[4971]: I0320 09:19:10.880534 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" event={"ID":"193efd84-721d-48f2-95fd-714abfe9bd94","Type":"ContainerDied","Data":"f8097e8b85e4f041bd1a0957ea07ed69dd77ba2e614fda65f85e86783fcbf01a"} Mar 20 09:19:10 crc kubenswrapper[4971]: I0320 09:19:10.883700 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmlqv" event={"ID":"09b2d204-b83b-4d81-8fc8-4f9df4513dca","Type":"ContainerStarted","Data":"eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4"} Mar 20 09:19:11 crc kubenswrapper[4971]: I0320 09:19:11.895363 4971 generic.go:334] "Generic (PLEG): container finished" podID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerID="eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4" exitCode=0 Mar 20 09:19:11 crc 
kubenswrapper[4971]: I0320 09:19:11.895454 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmlqv" event={"ID":"09b2d204-b83b-4d81-8fc8-4f9df4513dca","Type":"ContainerDied","Data":"eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4"} Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.365090 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.490392 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-inventory\") pod \"193efd84-721d-48f2-95fd-714abfe9bd94\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.490549 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skbc4\" (UniqueName: \"kubernetes.io/projected/193efd84-721d-48f2-95fd-714abfe9bd94-kube-api-access-skbc4\") pod \"193efd84-721d-48f2-95fd-714abfe9bd94\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.490657 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-ssh-key-openstack-networker\") pod \"193efd84-721d-48f2-95fd-714abfe9bd94\" (UID: \"193efd84-721d-48f2-95fd-714abfe9bd94\") " Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.497095 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193efd84-721d-48f2-95fd-714abfe9bd94-kube-api-access-skbc4" (OuterVolumeSpecName: "kube-api-access-skbc4") pod "193efd84-721d-48f2-95fd-714abfe9bd94" (UID: "193efd84-721d-48f2-95fd-714abfe9bd94"). InnerVolumeSpecName "kube-api-access-skbc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.528749 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "193efd84-721d-48f2-95fd-714abfe9bd94" (UID: "193efd84-721d-48f2-95fd-714abfe9bd94"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.530149 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-inventory" (OuterVolumeSpecName: "inventory") pod "193efd84-721d-48f2-95fd-714abfe9bd94" (UID: "193efd84-721d-48f2-95fd-714abfe9bd94"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.593225 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skbc4\" (UniqueName: \"kubernetes.io/projected/193efd84-721d-48f2-95fd-714abfe9bd94-kube-api-access-skbc4\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.593547 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.593558 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193efd84-721d-48f2-95fd-714abfe9bd94-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.907755 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.907747 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-qkk4w" event={"ID":"193efd84-721d-48f2-95fd-714abfe9bd94","Type":"ContainerDied","Data":"33200fcfdfb4659e7158a6ab0f7e1f0d8da7964c311bf860884205589dabfea2"} Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.907907 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33200fcfdfb4659e7158a6ab0f7e1f0d8da7964c311bf860884205589dabfea2" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.918255 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmlqv" event={"ID":"09b2d204-b83b-4d81-8fc8-4f9df4513dca","Type":"ContainerStarted","Data":"b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1"} Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.947738 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lmlqv" podStartSLOduration=2.308476319 podStartE2EDuration="4.947718155s" podCreationTimestamp="2026-03-20 09:19:08 +0000 UTC" firstStartedPulling="2026-03-20 09:19:09.871415467 +0000 UTC m=+8971.851289615" lastFinishedPulling="2026-03-20 09:19:12.510657313 +0000 UTC m=+8974.490531451" observedRunningTime="2026-03-20 09:19:12.934233309 +0000 UTC m=+8974.914107457" watchObservedRunningTime="2026-03-20 09:19:12.947718155 +0000 UTC m=+8974.927592293" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.999404 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-networker-9w5xv"] Mar 20 09:19:12 crc kubenswrapper[4971]: E0320 09:19:12.999945 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193efd84-721d-48f2-95fd-714abfe9bd94" 
containerName="reboot-os-openstack-openstack-networker" Mar 20 09:19:12 crc kubenswrapper[4971]: I0320 09:19:12.999967 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="193efd84-721d-48f2-95fd-714abfe9bd94" containerName="reboot-os-openstack-openstack-networker" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.000159 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="193efd84-721d-48f2-95fd-714abfe9bd94" containerName="reboot-os-openstack-openstack-networker" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.000911 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.003801 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.003992 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.063425 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-9w5xv"] Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.101903 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.101952 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: 
\"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.101980 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-inventory\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.102082 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.102103 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6h27\" (UniqueName: \"kubernetes.io/projected/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-kube-api-access-r6h27\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.102138 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: 
\"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.204045 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.204124 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6h27\" (UniqueName: \"kubernetes.io/projected/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-kube-api-access-r6h27\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.204184 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.204267 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.204299 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.204332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-inventory\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.209625 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.209681 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.217876 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ssh-key-openstack-networker\") pod 
\"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.217990 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.218043 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-inventory\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.226504 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6h27\" (UniqueName: \"kubernetes.io/projected/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-kube-api-access-r6h27\") pod \"install-certs-openstack-openstack-networker-9w5xv\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.318933 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.827108 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-9w5xv"] Mar 20 09:19:13 crc kubenswrapper[4971]: I0320 09:19:13.930545 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" event={"ID":"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406","Type":"ContainerStarted","Data":"80933a507ca3db32f9eddd20027bef5ee188f41b1ed814171ccce1d54bcf9b7f"} Mar 20 09:19:14 crc kubenswrapper[4971]: I0320 09:19:14.940242 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" event={"ID":"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406","Type":"ContainerStarted","Data":"12cb99a2a57707c06f6fadb18313d2a32801d9b75f02c348f7b4948150208e76"} Mar 20 09:19:14 crc kubenswrapper[4971]: I0320 09:19:14.967733 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" podStartSLOduration=2.539573709 podStartE2EDuration="2.967707725s" podCreationTimestamp="2026-03-20 09:19:12 +0000 UTC" firstStartedPulling="2026-03-20 09:19:13.830873128 +0000 UTC m=+8975.810747266" lastFinishedPulling="2026-03-20 09:19:14.259007144 +0000 UTC m=+8976.238881282" observedRunningTime="2026-03-20 09:19:14.955638586 +0000 UTC m=+8976.935512744" watchObservedRunningTime="2026-03-20 09:19:14.967707725 +0000 UTC m=+8976.947581863" Mar 20 09:19:16 crc kubenswrapper[4971]: I0320 09:19:16.971043 4971 generic.go:334] "Generic (PLEG): container finished" podID="5e4faacf-b8fc-428e-8e94-cd873f7bbea8" containerID="f4825e29684e709708b395a0108091cca38544c77860ca3ec856cb09fea4c917" exitCode=0 Mar 20 09:19:16 crc kubenswrapper[4971]: I0320 09:19:16.971144 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" event={"ID":"5e4faacf-b8fc-428e-8e94-cd873f7bbea8","Type":"ContainerDied","Data":"f4825e29684e709708b395a0108091cca38544c77860ca3ec856cb09fea4c917"} Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.428809 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.467167 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.467210 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.511423 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.535973 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ztkp\" (UniqueName: \"kubernetes.io/projected/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-kube-api-access-6ztkp\") pod \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.536171 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ceph\") pod \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.536232 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-inventory\") pod \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\" (UID: 
\"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.536287 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ssh-key-openstack-cell1\") pod \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\" (UID: \"5e4faacf-b8fc-428e-8e94-cd873f7bbea8\") " Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.542848 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ceph" (OuterVolumeSpecName: "ceph") pod "5e4faacf-b8fc-428e-8e94-cd873f7bbea8" (UID: "5e4faacf-b8fc-428e-8e94-cd873f7bbea8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.544536 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-kube-api-access-6ztkp" (OuterVolumeSpecName: "kube-api-access-6ztkp") pod "5e4faacf-b8fc-428e-8e94-cd873f7bbea8" (UID: "5e4faacf-b8fc-428e-8e94-cd873f7bbea8"). InnerVolumeSpecName "kube-api-access-6ztkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.562633 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-inventory" (OuterVolumeSpecName: "inventory") pod "5e4faacf-b8fc-428e-8e94-cd873f7bbea8" (UID: "5e4faacf-b8fc-428e-8e94-cd873f7bbea8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.566655 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5e4faacf-b8fc-428e-8e94-cd873f7bbea8" (UID: "5e4faacf-b8fc-428e-8e94-cd873f7bbea8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.638798 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.638843 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.638856 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ztkp\" (UniqueName: \"kubernetes.io/projected/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-kube-api-access-6ztkp\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.638867 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e4faacf-b8fc-428e-8e94-cd873f7bbea8-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.992763 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.993916 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-qhvnn" event={"ID":"5e4faacf-b8fc-428e-8e94-cd873f7bbea8","Type":"ContainerDied","Data":"ac2d9e68eab7088f46d2022ee7a98d8fd16ebe70c2de820849b27646f12ab96b"} Mar 20 09:19:18 crc kubenswrapper[4971]: I0320 09:19:18.993953 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac2d9e68eab7088f46d2022ee7a98d8fd16ebe70c2de820849b27646f12ab96b" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.054260 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.132106 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-4kbvw"] Mar 20 09:19:19 crc kubenswrapper[4971]: E0320 09:19:19.132784 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4faacf-b8fc-428e-8e94-cd873f7bbea8" containerName="configure-os-openstack-openstack-cell1" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.132812 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4faacf-b8fc-428e-8e94-cd873f7bbea8" containerName="configure-os-openstack-openstack-cell1" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.133128 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4faacf-b8fc-428e-8e94-cd873f7bbea8" containerName="configure-os-openstack-openstack-cell1" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.134188 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.137481 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.137747 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.155283 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-4kbvw"] Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.250048 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzs7w\" (UniqueName: \"kubernetes.io/projected/a0305b41-b831-4275-8e13-f5855d8f4f73-kube-api-access-fzs7w\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.250178 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.250364 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ceph\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.250506 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-0\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.251034 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-1\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.251165 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.352960 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ceph\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.353038 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-0\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.353105 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory-1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-1\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.353128 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.353154 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzs7w\" (UniqueName: \"kubernetes.io/projected/a0305b41-b831-4275-8e13-f5855d8f4f73-kube-api-access-fzs7w\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.353175 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.361254 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-0\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.361451 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.361709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-1\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.362243 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ceph\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.364171 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.375852 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzs7w\" (UniqueName: \"kubernetes.io/projected/a0305b41-b831-4275-8e13-f5855d8f4f73-kube-api-access-fzs7w\") pod \"ssh-known-hosts-openstack-4kbvw\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.507195 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:19 crc kubenswrapper[4971]: I0320 09:19:19.747403 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmlqv"] Mar 20 09:19:20 crc kubenswrapper[4971]: I0320 09:19:20.050462 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-4kbvw"] Mar 20 09:19:21 crc kubenswrapper[4971]: I0320 09:19:21.010783 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-4kbvw" event={"ID":"a0305b41-b831-4275-8e13-f5855d8f4f73","Type":"ContainerStarted","Data":"e8e6c87828e35d3d43e4eaf952f9f43ed0863b9438b3fc0ef31e82000c898aa7"} Mar 20 09:19:21 crc kubenswrapper[4971]: I0320 09:19:21.011137 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-4kbvw" event={"ID":"a0305b41-b831-4275-8e13-f5855d8f4f73","Type":"ContainerStarted","Data":"0ca2ef18c3c6aba076c0e93617f27abb448beb6562578abcf09e91887844302d"} Mar 20 09:19:21 crc kubenswrapper[4971]: I0320 09:19:21.010951 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lmlqv" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerName="registry-server" containerID="cri-o://b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1" gracePeriod=2 Mar 20 09:19:21 crc kubenswrapper[4971]: I0320 09:19:21.042329 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-4kbvw" podStartSLOduration=1.6277827249999999 podStartE2EDuration="2.042310441s" podCreationTimestamp="2026-03-20 09:19:19 +0000 UTC" firstStartedPulling="2026-03-20 09:19:20.058457947 +0000 UTC m=+8982.038332085" lastFinishedPulling="2026-03-20 09:19:20.472985663 +0000 UTC m=+8982.452859801" observedRunningTime="2026-03-20 09:19:21.03581756 +0000 UTC m=+8983.015691698" 
watchObservedRunningTime="2026-03-20 09:19:21.042310441 +0000 UTC m=+8983.022184579" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.022321 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.023875 4971 generic.go:334] "Generic (PLEG): container finished" podID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerID="b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1" exitCode=0 Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.023961 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmlqv" event={"ID":"09b2d204-b83b-4d81-8fc8-4f9df4513dca","Type":"ContainerDied","Data":"b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1"} Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.024019 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmlqv" event={"ID":"09b2d204-b83b-4d81-8fc8-4f9df4513dca","Type":"ContainerDied","Data":"e937d60d2827ac0770847a90094d01f9e2eb83236887de3d0f54bf6bbb30e8a2"} Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.024039 4971 scope.go:117] "RemoveContainer" containerID="b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.045879 4971 scope.go:117] "RemoveContainer" containerID="eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.107114 4971 scope.go:117] "RemoveContainer" containerID="4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.108892 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-utilities\") pod 
\"09b2d204-b83b-4d81-8fc8-4f9df4513dca\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.109194 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcksk\" (UniqueName: \"kubernetes.io/projected/09b2d204-b83b-4d81-8fc8-4f9df4513dca-kube-api-access-jcksk\") pod \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.109350 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-catalog-content\") pod \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\" (UID: \"09b2d204-b83b-4d81-8fc8-4f9df4513dca\") " Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.110725 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-utilities" (OuterVolumeSpecName: "utilities") pod "09b2d204-b83b-4d81-8fc8-4f9df4513dca" (UID: "09b2d204-b83b-4d81-8fc8-4f9df4513dca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.117570 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b2d204-b83b-4d81-8fc8-4f9df4513dca-kube-api-access-jcksk" (OuterVolumeSpecName: "kube-api-access-jcksk") pod "09b2d204-b83b-4d81-8fc8-4f9df4513dca" (UID: "09b2d204-b83b-4d81-8fc8-4f9df4513dca"). InnerVolumeSpecName "kube-api-access-jcksk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.137658 4971 scope.go:117] "RemoveContainer" containerID="b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1" Mar 20 09:19:22 crc kubenswrapper[4971]: E0320 09:19:22.138328 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1\": container with ID starting with b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1 not found: ID does not exist" containerID="b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.138364 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1"} err="failed to get container status \"b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1\": rpc error: code = NotFound desc = could not find container \"b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1\": container with ID starting with b30eb559d1f04e4ce7a805d13d1b37c55f500d08d9217b1b6e55771d36bed1c1 not found: ID does not exist" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.138407 4971 scope.go:117] "RemoveContainer" containerID="eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4" Mar 20 09:19:22 crc kubenswrapper[4971]: E0320 09:19:22.138730 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4\": container with ID starting with eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4 not found: ID does not exist" containerID="eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.138774 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4"} err="failed to get container status \"eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4\": rpc error: code = NotFound desc = could not find container \"eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4\": container with ID starting with eb0496e051ce9f2cc6cba39ce839bad7ddea03f5d221d4ade9e40089de2179d4 not found: ID does not exist" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.138790 4971 scope.go:117] "RemoveContainer" containerID="4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790" Mar 20 09:19:22 crc kubenswrapper[4971]: E0320 09:19:22.139277 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790\": container with ID starting with 4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790 not found: ID does not exist" containerID="4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.139320 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790"} err="failed to get container status \"4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790\": rpc error: code = NotFound desc = could not find container \"4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790\": container with ID starting with 4dfed10b40ffcc2c045f6a6d866f2552c0252728af8b15f210766537d7d50790 not found: ID does not exist" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.141268 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "09b2d204-b83b-4d81-8fc8-4f9df4513dca" (UID: "09b2d204-b83b-4d81-8fc8-4f9df4513dca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.212816 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcksk\" (UniqueName: \"kubernetes.io/projected/09b2d204-b83b-4d81-8fc8-4f9df4513dca-kube-api-access-jcksk\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.212875 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:22 crc kubenswrapper[4971]: I0320 09:19:22.212885 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b2d204-b83b-4d81-8fc8-4f9df4513dca-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:23 crc kubenswrapper[4971]: I0320 09:19:23.034035 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmlqv" Mar 20 09:19:23 crc kubenswrapper[4971]: I0320 09:19:23.055260 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmlqv"] Mar 20 09:19:23 crc kubenswrapper[4971]: I0320 09:19:23.069554 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmlqv"] Mar 20 09:19:24 crc kubenswrapper[4971]: I0320 09:19:24.751851 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" path="/var/lib/kubelet/pods/09b2d204-b83b-4d81-8fc8-4f9df4513dca/volumes" Mar 20 09:19:25 crc kubenswrapper[4971]: I0320 09:19:25.053499 4971 generic.go:334] "Generic (PLEG): container finished" podID="e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" containerID="12cb99a2a57707c06f6fadb18313d2a32801d9b75f02c348f7b4948150208e76" exitCode=0 Mar 20 09:19:25 crc kubenswrapper[4971]: I0320 09:19:25.053588 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" event={"ID":"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406","Type":"ContainerDied","Data":"12cb99a2a57707c06f6fadb18313d2a32801d9b75f02c348f7b4948150208e76"} Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.469721 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.600821 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-bootstrap-combined-ca-bundle\") pod \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.600943 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ssh-key-openstack-networker\") pod \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.600969 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-inventory\") pod \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.601028 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ovn-combined-ca-bundle\") pod \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.601126 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-neutron-metadata-combined-ca-bundle\") pod \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 
09:19:26.601197 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6h27\" (UniqueName: \"kubernetes.io/projected/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-kube-api-access-r6h27\") pod \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\" (UID: \"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406\") " Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.606791 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" (UID: "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.606904 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" (UID: "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.606983 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-kube-api-access-r6h27" (OuterVolumeSpecName: "kube-api-access-r6h27") pod "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" (UID: "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406"). InnerVolumeSpecName "kube-api-access-r6h27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.615790 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" (UID: "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.633536 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" (UID: "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.636805 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-inventory" (OuterVolumeSpecName: "inventory") pod "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" (UID: "e7dbaa85-bc3c-47de-b48d-7d72c7d8f406"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.703843 4971 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.703878 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.703892 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.703905 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.703918 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:26 crc kubenswrapper[4971]: I0320 09:19:26.703931 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6h27\" (UniqueName: \"kubernetes.io/projected/e7dbaa85-bc3c-47de-b48d-7d72c7d8f406-kube-api-access-r6h27\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.072021 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" 
event={"ID":"e7dbaa85-bc3c-47de-b48d-7d72c7d8f406","Type":"ContainerDied","Data":"80933a507ca3db32f9eddd20027bef5ee188f41b1ed814171ccce1d54bcf9b7f"} Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.072081 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80933a507ca3db32f9eddd20027bef5ee188f41b1ed814171ccce1d54bcf9b7f" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.072091 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-9w5xv" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.143895 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-networker-kn8xb"] Mar 20 09:19:27 crc kubenswrapper[4971]: E0320 09:19:27.144444 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" containerName="install-certs-openstack-openstack-networker" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.144470 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" containerName="install-certs-openstack-openstack-networker" Mar 20 09:19:27 crc kubenswrapper[4971]: E0320 09:19:27.144501 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerName="extract-utilities" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.144510 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerName="extract-utilities" Mar 20 09:19:27 crc kubenswrapper[4971]: E0320 09:19:27.144543 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerName="registry-server" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.144551 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" 
containerName="registry-server" Mar 20 09:19:27 crc kubenswrapper[4971]: E0320 09:19:27.144563 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerName="extract-content" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.144571 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerName="extract-content" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.144864 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b2d204-b83b-4d81-8fc8-4f9df4513dca" containerName="registry-server" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.144892 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7dbaa85-bc3c-47de-b48d-7d72c7d8f406" containerName="install-certs-openstack-openstack-networker" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.145844 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.151124 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.151318 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.154307 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-kn8xb"] Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.213567 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " 
pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.213797 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.213870 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-496md\" (UniqueName: \"kubernetes.io/projected/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-kube-api-access-496md\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.213914 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-inventory\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.213981 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.316168 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" 
(UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.316244 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-496md\" (UniqueName: \"kubernetes.io/projected/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-kube-api-access-496md\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.316270 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-inventory\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.316321 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.316377 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.317276 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.319651 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.320089 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-inventory\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.320306 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.331367 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-496md\" (UniqueName: \"kubernetes.io/projected/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-kube-api-access-496md\") pod \"ovn-openstack-openstack-networker-kn8xb\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " pod="openstack/ovn-openstack-openstack-networker-kn8xb" 
Mar 20 09:19:27 crc kubenswrapper[4971]: I0320 09:19:27.464232 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:19:28 crc kubenswrapper[4971]: I0320 09:19:28.007573 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-kn8xb"] Mar 20 09:19:28 crc kubenswrapper[4971]: I0320 09:19:28.084075 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-kn8xb" event={"ID":"49dff616-ace8-40f1-a666-5fdf6e5ab3d6","Type":"ContainerStarted","Data":"fed7d7a98299f37e5e323bde61bfbcf8c7725bbad8d3d5818df51bee030a95de"} Mar 20 09:19:29 crc kubenswrapper[4971]: I0320 09:19:29.096183 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-kn8xb" event={"ID":"49dff616-ace8-40f1-a666-5fdf6e5ab3d6","Type":"ContainerStarted","Data":"c3127bc588362156f3826d9bc72c15cfb4f6dc4f6668b5eee5555ced0bb7d1ae"} Mar 20 09:19:29 crc kubenswrapper[4971]: I0320 09:19:29.122053 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-networker-kn8xb" podStartSLOduration=1.5384364480000001 podStartE2EDuration="2.122030953s" podCreationTimestamp="2026-03-20 09:19:27 +0000 UTC" firstStartedPulling="2026-03-20 09:19:28.007206958 +0000 UTC m=+8989.987081096" lastFinishedPulling="2026-03-20 09:19:28.590801463 +0000 UTC m=+8990.570675601" observedRunningTime="2026-03-20 09:19:29.112497231 +0000 UTC m=+8991.092371389" watchObservedRunningTime="2026-03-20 09:19:29.122030953 +0000 UTC m=+8991.101905091" Mar 20 09:19:35 crc kubenswrapper[4971]: I0320 09:19:35.150627 4971 generic.go:334] "Generic (PLEG): container finished" podID="a0305b41-b831-4275-8e13-f5855d8f4f73" containerID="e8e6c87828e35d3d43e4eaf952f9f43ed0863b9438b3fc0ef31e82000c898aa7" exitCode=0 Mar 20 09:19:35 crc kubenswrapper[4971]: I0320 09:19:35.150691 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-4kbvw" event={"ID":"a0305b41-b831-4275-8e13-f5855d8f4f73","Type":"ContainerDied","Data":"e8e6c87828e35d3d43e4eaf952f9f43ed0863b9438b3fc0ef31e82000c898aa7"} Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.607497 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.635037 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-0\") pod \"a0305b41-b831-4275-8e13-f5855d8f4f73\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.635108 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzs7w\" (UniqueName: \"kubernetes.io/projected/a0305b41-b831-4275-8e13-f5855d8f4f73-kube-api-access-fzs7w\") pod \"a0305b41-b831-4275-8e13-f5855d8f4f73\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.635209 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ceph\") pod \"a0305b41-b831-4275-8e13-f5855d8f4f73\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.635257 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-cell1\") pod \"a0305b41-b831-4275-8e13-f5855d8f4f73\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.635300 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory-1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-1\") pod \"a0305b41-b831-4275-8e13-f5855d8f4f73\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.635352 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-networker\") pod \"a0305b41-b831-4275-8e13-f5855d8f4f73\" (UID: \"a0305b41-b831-4275-8e13-f5855d8f4f73\") " Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.641347 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ceph" (OuterVolumeSpecName: "ceph") pod "a0305b41-b831-4275-8e13-f5855d8f4f73" (UID: "a0305b41-b831-4275-8e13-f5855d8f4f73"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.641366 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0305b41-b831-4275-8e13-f5855d8f4f73-kube-api-access-fzs7w" (OuterVolumeSpecName: "kube-api-access-fzs7w") pod "a0305b41-b831-4275-8e13-f5855d8f4f73" (UID: "a0305b41-b831-4275-8e13-f5855d8f4f73"). InnerVolumeSpecName "kube-api-access-fzs7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.666257 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "a0305b41-b831-4275-8e13-f5855d8f4f73" (UID: "a0305b41-b831-4275-8e13-f5855d8f4f73"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.668891 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a0305b41-b831-4275-8e13-f5855d8f4f73" (UID: "a0305b41-b831-4275-8e13-f5855d8f4f73"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.677825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a0305b41-b831-4275-8e13-f5855d8f4f73" (UID: "a0305b41-b831-4275-8e13-f5855d8f4f73"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.682339 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "a0305b41-b831-4275-8e13-f5855d8f4f73" (UID: "a0305b41-b831-4275-8e13-f5855d8f4f73"). InnerVolumeSpecName "inventory-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.738469 4971 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-1\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.738504 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.738516 4971 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.738524 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzs7w\" (UniqueName: \"kubernetes.io/projected/a0305b41-b831-4275-8e13-f5855d8f4f73-kube-api-access-fzs7w\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.738533 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:36 crc kubenswrapper[4971]: I0320 09:19:36.738545 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a0305b41-b831-4275-8e13-f5855d8f4f73-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.171737 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-4kbvw" event={"ID":"a0305b41-b831-4275-8e13-f5855d8f4f73","Type":"ContainerDied","Data":"0ca2ef18c3c6aba076c0e93617f27abb448beb6562578abcf09e91887844302d"} Mar 20 
09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.171816 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca2ef18c3c6aba076c0e93617f27abb448beb6562578abcf09e91887844302d" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.171814 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-4kbvw" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.258272 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-rvwvh"] Mar 20 09:19:37 crc kubenswrapper[4971]: E0320 09:19:37.258865 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0305b41-b831-4275-8e13-f5855d8f4f73" containerName="ssh-known-hosts-openstack" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.258891 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0305b41-b831-4275-8e13-f5855d8f4f73" containerName="ssh-known-hosts-openstack" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.259161 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0305b41-b831-4275-8e13-f5855d8f4f73" containerName="ssh-known-hosts-openstack" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.260103 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.264334 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.264382 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.273438 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-rvwvh"] Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.351209 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rhdw\" (UniqueName: \"kubernetes.io/projected/3f4b21df-b676-4340-8510-54195b3f2be5-kube-api-access-4rhdw\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.351397 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.351451 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ceph\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.351674 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-inventory\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.453497 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rhdw\" (UniqueName: \"kubernetes.io/projected/3f4b21df-b676-4340-8510-54195b3f2be5-kube-api-access-4rhdw\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.453668 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.453703 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ceph\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.453758 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-inventory\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 
09:19:37.459394 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-inventory\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.459664 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.468010 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ceph\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.470944 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rhdw\" (UniqueName: \"kubernetes.io/projected/3f4b21df-b676-4340-8510-54195b3f2be5-kube-api-access-4rhdw\") pod \"run-os-openstack-openstack-cell1-rvwvh\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:37 crc kubenswrapper[4971]: I0320 09:19:37.577961 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:38 crc kubenswrapper[4971]: I0320 09:19:38.102865 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-rvwvh"] Mar 20 09:19:38 crc kubenswrapper[4971]: I0320 09:19:38.181821 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" event={"ID":"3f4b21df-b676-4340-8510-54195b3f2be5","Type":"ContainerStarted","Data":"ddf9b0d40c38819780d022e06978984125176430c91db7172dc3d8a867ce6426"} Mar 20 09:19:39 crc kubenswrapper[4971]: I0320 09:19:39.191594 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" event={"ID":"3f4b21df-b676-4340-8510-54195b3f2be5","Type":"ContainerStarted","Data":"8cc865ee3f31616b356655e1f55134d2461f43c84731265842633ca0d977ca17"} Mar 20 09:19:39 crc kubenswrapper[4971]: I0320 09:19:39.210045 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" podStartSLOduration=1.81401054 podStartE2EDuration="2.210026997s" podCreationTimestamp="2026-03-20 09:19:37 +0000 UTC" firstStartedPulling="2026-03-20 09:19:38.089659594 +0000 UTC m=+9000.069533732" lastFinishedPulling="2026-03-20 09:19:38.485676051 +0000 UTC m=+9000.465550189" observedRunningTime="2026-03-20 09:19:39.209131573 +0000 UTC m=+9001.189005721" watchObservedRunningTime="2026-03-20 09:19:39.210026997 +0000 UTC m=+9001.189901135" Mar 20 09:19:47 crc kubenswrapper[4971]: I0320 09:19:47.261684 4971 generic.go:334] "Generic (PLEG): container finished" podID="3f4b21df-b676-4340-8510-54195b3f2be5" containerID="8cc865ee3f31616b356655e1f55134d2461f43c84731265842633ca0d977ca17" exitCode=0 Mar 20 09:19:47 crc kubenswrapper[4971]: I0320 09:19:47.261841 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" 
event={"ID":"3f4b21df-b676-4340-8510-54195b3f2be5","Type":"ContainerDied","Data":"8cc865ee3f31616b356655e1f55134d2461f43c84731265842633ca0d977ca17"} Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.710806 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.790473 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ssh-key-openstack-cell1\") pod \"3f4b21df-b676-4340-8510-54195b3f2be5\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.790691 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-inventory\") pod \"3f4b21df-b676-4340-8510-54195b3f2be5\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.790741 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rhdw\" (UniqueName: \"kubernetes.io/projected/3f4b21df-b676-4340-8510-54195b3f2be5-kube-api-access-4rhdw\") pod \"3f4b21df-b676-4340-8510-54195b3f2be5\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.790763 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ceph\") pod \"3f4b21df-b676-4340-8510-54195b3f2be5\" (UID: \"3f4b21df-b676-4340-8510-54195b3f2be5\") " Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.796668 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ceph" (OuterVolumeSpecName: "ceph") pod 
"3f4b21df-b676-4340-8510-54195b3f2be5" (UID: "3f4b21df-b676-4340-8510-54195b3f2be5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.797151 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4b21df-b676-4340-8510-54195b3f2be5-kube-api-access-4rhdw" (OuterVolumeSpecName: "kube-api-access-4rhdw") pod "3f4b21df-b676-4340-8510-54195b3f2be5" (UID: "3f4b21df-b676-4340-8510-54195b3f2be5"). InnerVolumeSpecName "kube-api-access-4rhdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.824659 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-inventory" (OuterVolumeSpecName: "inventory") pod "3f4b21df-b676-4340-8510-54195b3f2be5" (UID: "3f4b21df-b676-4340-8510-54195b3f2be5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.842854 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3f4b21df-b676-4340-8510-54195b3f2be5" (UID: "3f4b21df-b676-4340-8510-54195b3f2be5"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.892906 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.892981 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.892992 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rhdw\" (UniqueName: \"kubernetes.io/projected/3f4b21df-b676-4340-8510-54195b3f2be5-kube-api-access-4rhdw\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:48 crc kubenswrapper[4971]: I0320 09:19:48.893000 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f4b21df-b676-4340-8510-54195b3f2be5-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.281441 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" event={"ID":"3f4b21df-b676-4340-8510-54195b3f2be5","Type":"ContainerDied","Data":"ddf9b0d40c38819780d022e06978984125176430c91db7172dc3d8a867ce6426"} Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.281693 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddf9b0d40c38819780d022e06978984125176430c91db7172dc3d8a867ce6426" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.281548 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-rvwvh" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.357685 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-9n2zb"] Mar 20 09:19:49 crc kubenswrapper[4971]: E0320 09:19:49.358173 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4b21df-b676-4340-8510-54195b3f2be5" containerName="run-os-openstack-openstack-cell1" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.358189 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4b21df-b676-4340-8510-54195b3f2be5" containerName="run-os-openstack-openstack-cell1" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.358406 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4b21df-b676-4340-8510-54195b3f2be5" containerName="run-os-openstack-openstack-cell1" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.359163 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.361805 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.362356 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.366049 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-9n2zb"] Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.503039 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ceph\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.503099 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.503152 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-inventory\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.503179 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drtjm\" (UniqueName: \"kubernetes.io/projected/57d329f8-1d09-47d2-9574-b86286f74c11-kube-api-access-drtjm\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.605250 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ceph\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.605338 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.605391 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-inventory\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.605429 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drtjm\" (UniqueName: \"kubernetes.io/projected/57d329f8-1d09-47d2-9574-b86286f74c11-kube-api-access-drtjm\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " 
pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.609993 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ceph\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.610071 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.610973 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-inventory\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.623936 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drtjm\" (UniqueName: \"kubernetes.io/projected/57d329f8-1d09-47d2-9574-b86286f74c11-kube-api-access-drtjm\") pod \"reboot-os-openstack-openstack-cell1-9n2zb\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:49 crc kubenswrapper[4971]: I0320 09:19:49.681707 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:19:50 crc kubenswrapper[4971]: I0320 09:19:50.223059 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-9n2zb"] Mar 20 09:19:50 crc kubenswrapper[4971]: I0320 09:19:50.291134 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" event={"ID":"57d329f8-1d09-47d2-9574-b86286f74c11","Type":"ContainerStarted","Data":"4ce66a2d5d0803ff58f8ef07d8710d7b19030fb05e6fab6919d02345cca962ab"} Mar 20 09:19:51 crc kubenswrapper[4971]: I0320 09:19:51.304652 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" event={"ID":"57d329f8-1d09-47d2-9574-b86286f74c11","Type":"ContainerStarted","Data":"bfbe0f9ec70f44d3dae63b26f1bc490dd63614835b13661447440b3fc99ef688"} Mar 20 09:19:51 crc kubenswrapper[4971]: I0320 09:19:51.327982 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" podStartSLOduration=1.7096116559999999 podStartE2EDuration="2.32795654s" podCreationTimestamp="2026-03-20 09:19:49 +0000 UTC" firstStartedPulling="2026-03-20 09:19:50.237927119 +0000 UTC m=+9012.217801257" lastFinishedPulling="2026-03-20 09:19:50.856271993 +0000 UTC m=+9012.836146141" observedRunningTime="2026-03-20 09:19:51.322680881 +0000 UTC m=+9013.302555019" watchObservedRunningTime="2026-03-20 09:19:51.32795654 +0000 UTC m=+9013.307830688" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.145520 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566640-7xbnt"] Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.147706 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566640-7xbnt" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.150444 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.150522 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.150558 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.180856 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566640-7xbnt"] Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.244874 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8pwl\" (UniqueName: \"kubernetes.io/projected/7fead403-b6dd-4df7-88d5-c035ab2fa51f-kube-api-access-s8pwl\") pod \"auto-csr-approver-29566640-7xbnt\" (UID: \"7fead403-b6dd-4df7-88d5-c035ab2fa51f\") " pod="openshift-infra/auto-csr-approver-29566640-7xbnt" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.347894 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8pwl\" (UniqueName: \"kubernetes.io/projected/7fead403-b6dd-4df7-88d5-c035ab2fa51f-kube-api-access-s8pwl\") pod \"auto-csr-approver-29566640-7xbnt\" (UID: \"7fead403-b6dd-4df7-88d5-c035ab2fa51f\") " pod="openshift-infra/auto-csr-approver-29566640-7xbnt" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.373225 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8pwl\" (UniqueName: \"kubernetes.io/projected/7fead403-b6dd-4df7-88d5-c035ab2fa51f-kube-api-access-s8pwl\") pod \"auto-csr-approver-29566640-7xbnt\" (UID: \"7fead403-b6dd-4df7-88d5-c035ab2fa51f\") " 
pod="openshift-infra/auto-csr-approver-29566640-7xbnt" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.471143 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566640-7xbnt" Mar 20 09:20:00 crc kubenswrapper[4971]: I0320 09:20:00.923915 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566640-7xbnt"] Mar 20 09:20:01 crc kubenswrapper[4971]: I0320 09:20:01.425334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566640-7xbnt" event={"ID":"7fead403-b6dd-4df7-88d5-c035ab2fa51f","Type":"ContainerStarted","Data":"293f10a1a03c4cc3fd7b80772788fd73de61f4f298afc4471ad8fb1c7d2525b1"} Mar 20 09:20:03 crc kubenswrapper[4971]: I0320 09:20:03.460767 4971 generic.go:334] "Generic (PLEG): container finished" podID="7fead403-b6dd-4df7-88d5-c035ab2fa51f" containerID="22203c0ef21a989cb4486c452c5eeffeb56ee6b8a2fce2d4175160d2cd3c1f38" exitCode=0 Mar 20 09:20:03 crc kubenswrapper[4971]: I0320 09:20:03.460859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566640-7xbnt" event={"ID":"7fead403-b6dd-4df7-88d5-c035ab2fa51f","Type":"ContainerDied","Data":"22203c0ef21a989cb4486c452c5eeffeb56ee6b8a2fce2d4175160d2cd3c1f38"} Mar 20 09:20:04 crc kubenswrapper[4971]: I0320 09:20:04.865106 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566640-7xbnt" Mar 20 09:20:04 crc kubenswrapper[4971]: I0320 09:20:04.942723 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8pwl\" (UniqueName: \"kubernetes.io/projected/7fead403-b6dd-4df7-88d5-c035ab2fa51f-kube-api-access-s8pwl\") pod \"7fead403-b6dd-4df7-88d5-c035ab2fa51f\" (UID: \"7fead403-b6dd-4df7-88d5-c035ab2fa51f\") " Mar 20 09:20:04 crc kubenswrapper[4971]: I0320 09:20:04.960573 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fead403-b6dd-4df7-88d5-c035ab2fa51f-kube-api-access-s8pwl" (OuterVolumeSpecName: "kube-api-access-s8pwl") pod "7fead403-b6dd-4df7-88d5-c035ab2fa51f" (UID: "7fead403-b6dd-4df7-88d5-c035ab2fa51f"). InnerVolumeSpecName "kube-api-access-s8pwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:20:05 crc kubenswrapper[4971]: I0320 09:20:05.057518 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8pwl\" (UniqueName: \"kubernetes.io/projected/7fead403-b6dd-4df7-88d5-c035ab2fa51f-kube-api-access-s8pwl\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:05 crc kubenswrapper[4971]: I0320 09:20:05.485900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566640-7xbnt" event={"ID":"7fead403-b6dd-4df7-88d5-c035ab2fa51f","Type":"ContainerDied","Data":"293f10a1a03c4cc3fd7b80772788fd73de61f4f298afc4471ad8fb1c7d2525b1"} Mar 20 09:20:05 crc kubenswrapper[4971]: I0320 09:20:05.485936 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566640-7xbnt" Mar 20 09:20:05 crc kubenswrapper[4971]: I0320 09:20:05.485962 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="293f10a1a03c4cc3fd7b80772788fd73de61f4f298afc4471ad8fb1c7d2525b1" Mar 20 09:20:05 crc kubenswrapper[4971]: I0320 09:20:05.954890 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-5gs9t"] Mar 20 09:20:05 crc kubenswrapper[4971]: I0320 09:20:05.964943 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-5gs9t"] Mar 20 09:20:06 crc kubenswrapper[4971]: I0320 09:20:06.495925 4971 generic.go:334] "Generic (PLEG): container finished" podID="57d329f8-1d09-47d2-9574-b86286f74c11" containerID="bfbe0f9ec70f44d3dae63b26f1bc490dd63614835b13661447440b3fc99ef688" exitCode=0 Mar 20 09:20:06 crc kubenswrapper[4971]: I0320 09:20:06.495974 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" event={"ID":"57d329f8-1d09-47d2-9574-b86286f74c11","Type":"ContainerDied","Data":"bfbe0f9ec70f44d3dae63b26f1bc490dd63614835b13661447440b3fc99ef688"} Mar 20 09:20:06 crc kubenswrapper[4971]: I0320 09:20:06.744882 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1543875e-c018-446f-bbdd-067ec555d144" path="/var/lib/kubelet/pods/1543875e-c018-446f-bbdd-067ec555d144/volumes" Mar 20 09:20:07 crc kubenswrapper[4971]: I0320 09:20:07.951442 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.018684 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ssh-key-openstack-cell1\") pod \"57d329f8-1d09-47d2-9574-b86286f74c11\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.019532 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ceph\") pod \"57d329f8-1d09-47d2-9574-b86286f74c11\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.019589 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drtjm\" (UniqueName: \"kubernetes.io/projected/57d329f8-1d09-47d2-9574-b86286f74c11-kube-api-access-drtjm\") pod \"57d329f8-1d09-47d2-9574-b86286f74c11\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.019754 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-inventory\") pod \"57d329f8-1d09-47d2-9574-b86286f74c11\" (UID: \"57d329f8-1d09-47d2-9574-b86286f74c11\") " Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.024075 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d329f8-1d09-47d2-9574-b86286f74c11-kube-api-access-drtjm" (OuterVolumeSpecName: "kube-api-access-drtjm") pod "57d329f8-1d09-47d2-9574-b86286f74c11" (UID: "57d329f8-1d09-47d2-9574-b86286f74c11"). InnerVolumeSpecName "kube-api-access-drtjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.025280 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ceph" (OuterVolumeSpecName: "ceph") pod "57d329f8-1d09-47d2-9574-b86286f74c11" (UID: "57d329f8-1d09-47d2-9574-b86286f74c11"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.049648 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-inventory" (OuterVolumeSpecName: "inventory") pod "57d329f8-1d09-47d2-9574-b86286f74c11" (UID: "57d329f8-1d09-47d2-9574-b86286f74c11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.060982 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "57d329f8-1d09-47d2-9574-b86286f74c11" (UID: "57d329f8-1d09-47d2-9574-b86286f74c11"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.121965 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.122006 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.122017 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57d329f8-1d09-47d2-9574-b86286f74c11-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.122026 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drtjm\" (UniqueName: \"kubernetes.io/projected/57d329f8-1d09-47d2-9574-b86286f74c11-kube-api-access-drtjm\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.522184 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" event={"ID":"57d329f8-1d09-47d2-9574-b86286f74c11","Type":"ContainerDied","Data":"4ce66a2d5d0803ff58f8ef07d8710d7b19030fb05e6fab6919d02345cca962ab"} Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.522241 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-9n2zb" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.522242 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ce66a2d5d0803ff58f8ef07d8710d7b19030fb05e6fab6919d02345cca962ab" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.627517 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-gdhcx"] Mar 20 09:20:08 crc kubenswrapper[4971]: E0320 09:20:08.628094 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fead403-b6dd-4df7-88d5-c035ab2fa51f" containerName="oc" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.628118 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fead403-b6dd-4df7-88d5-c035ab2fa51f" containerName="oc" Mar 20 09:20:08 crc kubenswrapper[4971]: E0320 09:20:08.628166 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d329f8-1d09-47d2-9574-b86286f74c11" containerName="reboot-os-openstack-openstack-cell1" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.628177 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d329f8-1d09-47d2-9574-b86286f74c11" containerName="reboot-os-openstack-openstack-cell1" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.629140 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fead403-b6dd-4df7-88d5-c035ab2fa51f" containerName="oc" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.629176 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d329f8-1d09-47d2-9574-b86286f74c11" containerName="reboot-os-openstack-openstack-cell1" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.630146 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.634812 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.639177 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.646034 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-gdhcx"] Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735638 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735700 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-inventory\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735727 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc 
kubenswrapper[4971]: I0320 09:20:08.735758 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735792 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735817 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735840 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735864 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735920 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhj2g\" (UniqueName: \"kubernetes.io/projected/aaabaf42-dd65-48db-a6da-18e2ba99b975-kube-api-access-jhj2g\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735956 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.735992 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ceph\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.736046 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839796 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839852 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-inventory\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839877 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839896 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " 
pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839913 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839929 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839946 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839964 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.839983 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jhj2g\" (UniqueName: \"kubernetes.io/projected/aaabaf42-dd65-48db-a6da-18e2ba99b975-kube-api-access-jhj2g\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.840012 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.840033 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ceph\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.840072 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.845019 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-inventory\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" 
Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.845264 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.845468 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.846288 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.846739 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.847375 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ceph\") pod 
\"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.847665 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.849162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.854917 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.859154 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.863598 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhj2g\" (UniqueName: \"kubernetes.io/projected/aaabaf42-dd65-48db-a6da-18e2ba99b975-kube-api-access-jhj2g\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.865683 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-gdhcx\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:08 crc kubenswrapper[4971]: I0320 09:20:08.951117 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:09 crc kubenswrapper[4971]: I0320 09:20:09.482044 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-gdhcx"] Mar 20 09:20:09 crc kubenswrapper[4971]: W0320 09:20:09.483219 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaabaf42_dd65_48db_a6da_18e2ba99b975.slice/crio-4838ef0a076f97ab3caecbff9be2503cfdcdf2266a52b21b293d6cb64263a045 WatchSource:0}: Error finding container 4838ef0a076f97ab3caecbff9be2503cfdcdf2266a52b21b293d6cb64263a045: Status 404 returned error can't find the container with id 4838ef0a076f97ab3caecbff9be2503cfdcdf2266a52b21b293d6cb64263a045 Mar 20 09:20:09 crc kubenswrapper[4971]: I0320 09:20:09.532039 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" 
event={"ID":"aaabaf42-dd65-48db-a6da-18e2ba99b975","Type":"ContainerStarted","Data":"4838ef0a076f97ab3caecbff9be2503cfdcdf2266a52b21b293d6cb64263a045"} Mar 20 09:20:10 crc kubenswrapper[4971]: I0320 09:20:10.542384 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" event={"ID":"aaabaf42-dd65-48db-a6da-18e2ba99b975","Type":"ContainerStarted","Data":"4dae5db32dcc1c7ec73373007f7933907d39320488344cfdd7e12f8ae3f633ff"} Mar 20 09:20:10 crc kubenswrapper[4971]: I0320 09:20:10.565594 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" podStartSLOduration=2.153913704 podStartE2EDuration="2.565574624s" podCreationTimestamp="2026-03-20 09:20:08 +0000 UTC" firstStartedPulling="2026-03-20 09:20:09.485741413 +0000 UTC m=+9031.465615551" lastFinishedPulling="2026-03-20 09:20:09.897402333 +0000 UTC m=+9031.877276471" observedRunningTime="2026-03-20 09:20:10.560852369 +0000 UTC m=+9032.540726537" watchObservedRunningTime="2026-03-20 09:20:10.565574624 +0000 UTC m=+9032.545448762" Mar 20 09:20:14 crc kubenswrapper[4971]: I0320 09:20:14.009545 4971 scope.go:117] "RemoveContainer" containerID="dd87cb23849d7bd07ba489e0d994b98af694f8491bd979806609dfc7b2b17e48" Mar 20 09:20:20 crc kubenswrapper[4971]: I0320 09:20:20.161971 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:20:20 crc kubenswrapper[4971]: I0320 09:20:20.162478 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:20:29 crc kubenswrapper[4971]: I0320 09:20:29.714689 4971 generic.go:334] "Generic (PLEG): container finished" podID="aaabaf42-dd65-48db-a6da-18e2ba99b975" containerID="4dae5db32dcc1c7ec73373007f7933907d39320488344cfdd7e12f8ae3f633ff" exitCode=0 Mar 20 09:20:29 crc kubenswrapper[4971]: I0320 09:20:29.714780 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" event={"ID":"aaabaf42-dd65-48db-a6da-18e2ba99b975","Type":"ContainerDied","Data":"4dae5db32dcc1c7ec73373007f7933907d39320488344cfdd7e12f8ae3f633ff"} Mar 20 09:20:29 crc kubenswrapper[4971]: E0320 09:20:29.872953 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaabaf42_dd65_48db_a6da_18e2ba99b975.slice/crio-conmon-4dae5db32dcc1c7ec73373007f7933907d39320488344cfdd7e12f8ae3f633ff.scope\": RecentStats: unable to find data in memory cache]" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.134415 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181172 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ssh-key-openstack-cell1\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181210 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-telemetry-combined-ca-bundle\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhj2g\" (UniqueName: \"kubernetes.io/projected/aaabaf42-dd65-48db-a6da-18e2ba99b975-kube-api-access-jhj2g\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181255 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-metadata-combined-ca-bundle\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181365 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-dhcp-combined-ca-bundle\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 
crc kubenswrapper[4971]: I0320 09:20:31.181428 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ovn-combined-ca-bundle\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181446 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-inventory\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181505 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-libvirt-combined-ca-bundle\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181526 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-bootstrap-combined-ca-bundle\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181550 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-nova-combined-ca-bundle\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181576 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ceph\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.181593 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-sriov-combined-ca-bundle\") pod \"aaabaf42-dd65-48db-a6da-18e2ba99b975\" (UID: \"aaabaf42-dd65-48db-a6da-18e2ba99b975\") " Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.187423 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.187707 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaabaf42-dd65-48db-a6da-18e2ba99b975-kube-api-access-jhj2g" (OuterVolumeSpecName: "kube-api-access-jhj2g") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "kube-api-access-jhj2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.187762 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.187800 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ceph" (OuterVolumeSpecName: "ceph") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.190538 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.195137 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.196245 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.197525 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.197703 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.209772 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.220777 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-inventory" (OuterVolumeSpecName: "inventory") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.245215 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "aaabaf42-dd65-48db-a6da-18e2ba99b975" (UID: "aaabaf42-dd65-48db-a6da-18e2ba99b975"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283378 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283420 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283433 4971 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283442 4971 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283450 4971 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283459 4971 
reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283466 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283475 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283484 4971 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283499 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhj2g\" (UniqueName: \"kubernetes.io/projected/aaabaf42-dd65-48db-a6da-18e2ba99b975-kube-api-access-jhj2g\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283508 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.283517 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaabaf42-dd65-48db-a6da-18e2ba99b975-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 
09:20:31.741506 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" event={"ID":"aaabaf42-dd65-48db-a6da-18e2ba99b975","Type":"ContainerDied","Data":"4838ef0a076f97ab3caecbff9be2503cfdcdf2266a52b21b293d6cb64263a045"} Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.741861 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4838ef0a076f97ab3caecbff9be2503cfdcdf2266a52b21b293d6cb64263a045" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.741588 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-gdhcx" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.838131 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-z6h6v"] Mar 20 09:20:31 crc kubenswrapper[4971]: E0320 09:20:31.838636 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaabaf42-dd65-48db-a6da-18e2ba99b975" containerName="install-certs-openstack-openstack-cell1" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.838653 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaabaf42-dd65-48db-a6da-18e2ba99b975" containerName="install-certs-openstack-openstack-cell1" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.838855 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaabaf42-dd65-48db-a6da-18e2ba99b975" containerName="install-certs-openstack-openstack-cell1" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.839587 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.841690 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.842049 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.845233 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-z6h6v"] Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.895484 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-inventory\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.895554 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f799r\" (UniqueName: \"kubernetes.io/projected/661512ef-b7a3-40be-820e-7d9670853666-kube-api-access-f799r\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.895969 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 
09:20:31.896023 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ceph\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.998187 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.998241 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ceph\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.998359 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-inventory\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:31 crc kubenswrapper[4971]: I0320 09:20:31.998419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f799r\" (UniqueName: \"kubernetes.io/projected/661512ef-b7a3-40be-820e-7d9670853666-kube-api-access-f799r\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:32 crc kubenswrapper[4971]: I0320 09:20:32.002882 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:32 crc kubenswrapper[4971]: I0320 09:20:32.003739 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-inventory\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:32 crc kubenswrapper[4971]: I0320 09:20:32.004125 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ceph\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:32 crc kubenswrapper[4971]: I0320 09:20:32.021626 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f799r\" (UniqueName: \"kubernetes.io/projected/661512ef-b7a3-40be-820e-7d9670853666-kube-api-access-f799r\") pod \"ceph-client-openstack-openstack-cell1-z6h6v\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:32 crc kubenswrapper[4971]: I0320 09:20:32.166961 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:32 crc kubenswrapper[4971]: I0320 09:20:32.750648 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-z6h6v"] Mar 20 09:20:32 crc kubenswrapper[4971]: W0320 09:20:32.755692 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod661512ef_b7a3_40be_820e_7d9670853666.slice/crio-54484319a3b43d63527eaf53d4a06ee0da581f4f2664b8fc7f957ddb39b2f389 WatchSource:0}: Error finding container 54484319a3b43d63527eaf53d4a06ee0da581f4f2664b8fc7f957ddb39b2f389: Status 404 returned error can't find the container with id 54484319a3b43d63527eaf53d4a06ee0da581f4f2664b8fc7f957ddb39b2f389 Mar 20 09:20:33 crc kubenswrapper[4971]: I0320 09:20:33.761561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" event={"ID":"661512ef-b7a3-40be-820e-7d9670853666","Type":"ContainerStarted","Data":"9fc5948f1470a50e3edbd76073024b3d623d0f11b3bf373775b7b23a086fb92d"} Mar 20 09:20:33 crc kubenswrapper[4971]: I0320 09:20:33.762158 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" event={"ID":"661512ef-b7a3-40be-820e-7d9670853666","Type":"ContainerStarted","Data":"54484319a3b43d63527eaf53d4a06ee0da581f4f2664b8fc7f957ddb39b2f389"} Mar 20 09:20:33 crc kubenswrapper[4971]: I0320 09:20:33.786433 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" podStartSLOduration=2.264450278 podStartE2EDuration="2.786411313s" podCreationTimestamp="2026-03-20 09:20:31 +0000 UTC" firstStartedPulling="2026-03-20 09:20:32.757729175 +0000 UTC m=+9054.737603313" lastFinishedPulling="2026-03-20 09:20:33.27969021 +0000 UTC m=+9055.259564348" observedRunningTime="2026-03-20 09:20:33.785809207 +0000 
UTC m=+9055.765683355" watchObservedRunningTime="2026-03-20 09:20:33.786411313 +0000 UTC m=+9055.766285451" Mar 20 09:20:37 crc kubenswrapper[4971]: I0320 09:20:37.800700 4971 generic.go:334] "Generic (PLEG): container finished" podID="49dff616-ace8-40f1-a666-5fdf6e5ab3d6" containerID="c3127bc588362156f3826d9bc72c15cfb4f6dc4f6668b5eee5555ced0bb7d1ae" exitCode=0 Mar 20 09:20:37 crc kubenswrapper[4971]: I0320 09:20:37.801204 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-kn8xb" event={"ID":"49dff616-ace8-40f1-a666-5fdf6e5ab3d6","Type":"ContainerDied","Data":"c3127bc588362156f3826d9bc72c15cfb4f6dc4f6668b5eee5555ced0bb7d1ae"} Mar 20 09:20:38 crc kubenswrapper[4971]: I0320 09:20:38.810425 4971 generic.go:334] "Generic (PLEG): container finished" podID="661512ef-b7a3-40be-820e-7d9670853666" containerID="9fc5948f1470a50e3edbd76073024b3d623d0f11b3bf373775b7b23a086fb92d" exitCode=0 Mar 20 09:20:38 crc kubenswrapper[4971]: I0320 09:20:38.810541 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" event={"ID":"661512ef-b7a3-40be-820e-7d9670853666","Type":"ContainerDied","Data":"9fc5948f1470a50e3edbd76073024b3d623d0f11b3bf373775b7b23a086fb92d"} Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.221714 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.240023 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovncontroller-config-0\") pod \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.240083 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-496md\" (UniqueName: \"kubernetes.io/projected/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-kube-api-access-496md\") pod \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.240170 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovn-combined-ca-bundle\") pod \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.240254 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ssh-key-openstack-networker\") pod \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.240301 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-inventory\") pod \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\" (UID: \"49dff616-ace8-40f1-a666-5fdf6e5ab3d6\") " Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.246952 4971 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-kube-api-access-496md" (OuterVolumeSpecName: "kube-api-access-496md") pod "49dff616-ace8-40f1-a666-5fdf6e5ab3d6" (UID: "49dff616-ace8-40f1-a666-5fdf6e5ab3d6"). InnerVolumeSpecName "kube-api-access-496md". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.253866 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "49dff616-ace8-40f1-a666-5fdf6e5ab3d6" (UID: "49dff616-ace8-40f1-a666-5fdf6e5ab3d6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.283745 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-inventory" (OuterVolumeSpecName: "inventory") pod "49dff616-ace8-40f1-a666-5fdf6e5ab3d6" (UID: "49dff616-ace8-40f1-a666-5fdf6e5ab3d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.289863 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "49dff616-ace8-40f1-a666-5fdf6e5ab3d6" (UID: "49dff616-ace8-40f1-a666-5fdf6e5ab3d6"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.292428 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "49dff616-ace8-40f1-a666-5fdf6e5ab3d6" (UID: "49dff616-ace8-40f1-a666-5fdf6e5ab3d6"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.343270 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.343299 4971 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.343310 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-496md\" (UniqueName: \"kubernetes.io/projected/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-kube-api-access-496md\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.343321 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.343331 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/49dff616-ace8-40f1-a666-5fdf6e5ab3d6-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.822208 4971 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-kn8xb" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.822256 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-kn8xb" event={"ID":"49dff616-ace8-40f1-a666-5fdf6e5ab3d6","Type":"ContainerDied","Data":"fed7d7a98299f37e5e323bde61bfbcf8c7725bbad8d3d5818df51bee030a95de"} Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.822281 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed7d7a98299f37e5e323bde61bfbcf8c7725bbad8d3d5818df51bee030a95de" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.954627 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-kl245"] Mar 20 09:20:39 crc kubenswrapper[4971]: E0320 09:20:39.955122 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49dff616-ace8-40f1-a666-5fdf6e5ab3d6" containerName="ovn-openstack-openstack-networker" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.955136 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="49dff616-ace8-40f1-a666-5fdf6e5ab3d6" containerName="ovn-openstack-openstack-networker" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.955314 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="49dff616-ace8-40f1-a666-5fdf6e5ab3d6" containerName="ovn-openstack-openstack-networker" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.956377 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.960451 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.961483 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.961492 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.961744 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-nwcvx" Mar 20 09:20:39 crc kubenswrapper[4971]: I0320 09:20:39.974955 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-kl245"] Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.060399 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.060497 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " 
pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.060531 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjrs\" (UniqueName: \"kubernetes.io/projected/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-kube-api-access-7bjrs\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.060603 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.060664 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-inventory\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.060730 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 
09:20:40.167348 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.167436 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-inventory\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.167469 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.167583 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.167683 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.167713 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjrs\" (UniqueName: \"kubernetes.io/projected/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-kube-api-access-7bjrs\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.184276 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.184592 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-inventory\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.190296 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: 
\"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.195489 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.196729 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjrs\" (UniqueName: \"kubernetes.io/projected/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-kube-api-access-7bjrs\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.203873 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-kl245\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.276558 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.435327 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.488820 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-inventory\") pod \"661512ef-b7a3-40be-820e-7d9670853666\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.488880 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f799r\" (UniqueName: \"kubernetes.io/projected/661512ef-b7a3-40be-820e-7d9670853666-kube-api-access-f799r\") pod \"661512ef-b7a3-40be-820e-7d9670853666\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.488900 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ssh-key-openstack-cell1\") pod \"661512ef-b7a3-40be-820e-7d9670853666\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.489030 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ceph\") pod \"661512ef-b7a3-40be-820e-7d9670853666\" (UID: \"661512ef-b7a3-40be-820e-7d9670853666\") " Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.759305 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ceph" (OuterVolumeSpecName: "ceph") pod "661512ef-b7a3-40be-820e-7d9670853666" (UID: "661512ef-b7a3-40be-820e-7d9670853666"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.761317 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/661512ef-b7a3-40be-820e-7d9670853666-kube-api-access-f799r" (OuterVolumeSpecName: "kube-api-access-f799r") pod "661512ef-b7a3-40be-820e-7d9670853666" (UID: "661512ef-b7a3-40be-820e-7d9670853666"). InnerVolumeSpecName "kube-api-access-f799r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.766252 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-inventory" (OuterVolumeSpecName: "inventory") pod "661512ef-b7a3-40be-820e-7d9670853666" (UID: "661512ef-b7a3-40be-820e-7d9670853666"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.780268 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "661512ef-b7a3-40be-820e-7d9670853666" (UID: "661512ef-b7a3-40be-820e-7d9670853666"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.795156 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.795186 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f799r\" (UniqueName: \"kubernetes.io/projected/661512ef-b7a3-40be-820e-7d9670853666-kube-api-access-f799r\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.795197 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.795206 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/661512ef-b7a3-40be-820e-7d9670853666-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.833874 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" event={"ID":"661512ef-b7a3-40be-820e-7d9670853666","Type":"ContainerDied","Data":"54484319a3b43d63527eaf53d4a06ee0da581f4f2664b8fc7f957ddb39b2f389"} Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.834146 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54484319a3b43d63527eaf53d4a06ee0da581f4f2664b8fc7f957ddb39b2f389" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.834205 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-z6h6v" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.928789 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-bcpc6"] Mar 20 09:20:40 crc kubenswrapper[4971]: E0320 09:20:40.929315 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661512ef-b7a3-40be-820e-7d9670853666" containerName="ceph-client-openstack-openstack-cell1" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.929339 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="661512ef-b7a3-40be-820e-7d9670853666" containerName="ceph-client-openstack-openstack-cell1" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.929585 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="661512ef-b7a3-40be-820e-7d9670853666" containerName="ceph-client-openstack-openstack-cell1" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.930475 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.932909 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.933193 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.933345 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 09:20:40 crc kubenswrapper[4971]: I0320 09:20:40.973524 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-bcpc6"] Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:40.999955 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.000103 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.000197 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ceph\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " 
pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.000327 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-inventory\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.000634 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.000816 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fxd\" (UniqueName: \"kubernetes.io/projected/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-kube-api-access-44fxd\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.102243 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-inventory\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.102340 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ssh-key-openstack-cell1\") pod 
\"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.102398 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44fxd\" (UniqueName: \"kubernetes.io/projected/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-kube-api-access-44fxd\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.102420 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.102505 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.102538 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ceph\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.103399 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.106393 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.106644 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-inventory\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.106801 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ceph\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.106900 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.119594 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44fxd\" (UniqueName: 
\"kubernetes.io/projected/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-kube-api-access-44fxd\") pod \"ovn-openstack-openstack-cell1-bcpc6\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.263024 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:20:41 crc kubenswrapper[4971]: W0320 09:20:41.309415 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cbcdec_a80c_4c34_9130_ecbaa12a36b0.slice/crio-3052657b9bef3610db6686e09c19776a2fe56bfddc639f331682950de3f2b070 WatchSource:0}: Error finding container 3052657b9bef3610db6686e09c19776a2fe56bfddc639f331682950de3f2b070: Status 404 returned error can't find the container with id 3052657b9bef3610db6686e09c19776a2fe56bfddc639f331682950de3f2b070 Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.311483 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-kl245"] Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.845575 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" event={"ID":"90cbcdec-a80c-4c34-9130-ecbaa12a36b0","Type":"ContainerStarted","Data":"3052657b9bef3610db6686e09c19776a2fe56bfddc639f331682950de3f2b070"} Mar 20 09:20:41 crc kubenswrapper[4971]: I0320 09:20:41.847972 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-bcpc6"] Mar 20 09:20:41 crc kubenswrapper[4971]: W0320 09:20:41.851705 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf74aa1f9_4acb_48cd_90a1_10ff27996a3a.slice/crio-7aee7852eafd620ccf2398588092db2d8caf903a944cdc28bbd55807b48981f0 WatchSource:0}: Error 
finding container 7aee7852eafd620ccf2398588092db2d8caf903a944cdc28bbd55807b48981f0: Status 404 returned error can't find the container with id 7aee7852eafd620ccf2398588092db2d8caf903a944cdc28bbd55807b48981f0 Mar 20 09:20:42 crc kubenswrapper[4971]: I0320 09:20:42.874107 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" event={"ID":"f74aa1f9-4acb-48cd-90a1-10ff27996a3a","Type":"ContainerStarted","Data":"0dd9be9d0a4cff752df24c0a9c1997b17959103271aa71b862fdbda3bd3ce694"} Mar 20 09:20:42 crc kubenswrapper[4971]: I0320 09:20:42.874839 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" event={"ID":"f74aa1f9-4acb-48cd-90a1-10ff27996a3a","Type":"ContainerStarted","Data":"7aee7852eafd620ccf2398588092db2d8caf903a944cdc28bbd55807b48981f0"} Mar 20 09:20:42 crc kubenswrapper[4971]: I0320 09:20:42.880515 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" event={"ID":"90cbcdec-a80c-4c34-9130-ecbaa12a36b0","Type":"ContainerStarted","Data":"c7eab97a103680ed140de69be125014c4077282d57056f7fc76c824c254e1871"} Mar 20 09:20:42 crc kubenswrapper[4971]: I0320 09:20:42.909824 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" podStartSLOduration=2.349565663 podStartE2EDuration="2.909800641s" podCreationTimestamp="2026-03-20 09:20:40 +0000 UTC" firstStartedPulling="2026-03-20 09:20:41.853885842 +0000 UTC m=+9063.833759980" lastFinishedPulling="2026-03-20 09:20:42.41412082 +0000 UTC m=+9064.393994958" observedRunningTime="2026-03-20 09:20:42.896022887 +0000 UTC m=+9064.875897035" watchObservedRunningTime="2026-03-20 09:20:42.909800641 +0000 UTC m=+9064.889674779" Mar 20 09:20:42 crc kubenswrapper[4971]: I0320 09:20:42.955018 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" podStartSLOduration=3.517847701 podStartE2EDuration="3.954994846s" podCreationTimestamp="2026-03-20 09:20:39 +0000 UTC" firstStartedPulling="2026-03-20 09:20:41.311343482 +0000 UTC m=+9063.291217620" lastFinishedPulling="2026-03-20 09:20:41.748490627 +0000 UTC m=+9063.728364765" observedRunningTime="2026-03-20 09:20:42.92411686 +0000 UTC m=+9064.903991018" watchObservedRunningTime="2026-03-20 09:20:42.954994846 +0000 UTC m=+9064.934868994" Mar 20 09:20:50 crc kubenswrapper[4971]: I0320 09:20:50.162987 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:20:50 crc kubenswrapper[4971]: I0320 09:20:50.163748 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:21:20 crc kubenswrapper[4971]: I0320 09:21:20.162670 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:21:20 crc kubenswrapper[4971]: I0320 09:21:20.163216 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 09:21:20 crc kubenswrapper[4971]: I0320 09:21:20.163264 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:21:20 crc kubenswrapper[4971]: I0320 09:21:20.164023 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:21:20 crc kubenswrapper[4971]: I0320 09:21:20.164070 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" gracePeriod=600 Mar 20 09:21:20 crc kubenswrapper[4971]: E0320 09:21:20.285706 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:21:21 crc kubenswrapper[4971]: I0320 09:21:21.269156 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" exitCode=0 Mar 20 09:21:21 crc kubenswrapper[4971]: I0320 09:21:21.269241 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff"} Mar 20 09:21:21 crc kubenswrapper[4971]: I0320 09:21:21.269576 4971 scope.go:117] "RemoveContainer" containerID="733a160bea3a7b4da07aecd1f08779761a18f5919207a3afdf06511a9685618b" Mar 20 09:21:21 crc kubenswrapper[4971]: I0320 09:21:21.270184 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:21:21 crc kubenswrapper[4971]: E0320 09:21:21.270530 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:21:33 crc kubenswrapper[4971]: I0320 09:21:33.733031 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:21:33 crc kubenswrapper[4971]: E0320 09:21:33.733740 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:21:41 crc kubenswrapper[4971]: I0320 09:21:41.468566 4971 generic.go:334] "Generic (PLEG): container finished" podID="90cbcdec-a80c-4c34-9130-ecbaa12a36b0" containerID="c7eab97a103680ed140de69be125014c4077282d57056f7fc76c824c254e1871" exitCode=0 Mar 20 09:21:41 crc kubenswrapper[4971]: I0320 09:21:41.468618 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" event={"ID":"90cbcdec-a80c-4c34-9130-ecbaa12a36b0","Type":"ContainerDied","Data":"c7eab97a103680ed140de69be125014c4077282d57056f7fc76c824c254e1871"} Mar 20 09:21:42 crc kubenswrapper[4971]: I0320 09:21:42.925405 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.080692 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-nova-metadata-neutron-config-0\") pod \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.080853 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-ssh-key-openstack-networker\") pod \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.080955 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.080981 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-metadata-combined-ca-bundle\") pod \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\" (UID: 
\"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.081062 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-inventory\") pod \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.081110 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bjrs\" (UniqueName: \"kubernetes.io/projected/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-kube-api-access-7bjrs\") pod \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\" (UID: \"90cbcdec-a80c-4c34-9130-ecbaa12a36b0\") " Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.087877 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "90cbcdec-a80c-4c34-9130-ecbaa12a36b0" (UID: "90cbcdec-a80c-4c34-9130-ecbaa12a36b0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.088276 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-kube-api-access-7bjrs" (OuterVolumeSpecName: "kube-api-access-7bjrs") pod "90cbcdec-a80c-4c34-9130-ecbaa12a36b0" (UID: "90cbcdec-a80c-4c34-9130-ecbaa12a36b0"). InnerVolumeSpecName "kube-api-access-7bjrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.114980 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "90cbcdec-a80c-4c34-9130-ecbaa12a36b0" (UID: "90cbcdec-a80c-4c34-9130-ecbaa12a36b0"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.115413 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-inventory" (OuterVolumeSpecName: "inventory") pod "90cbcdec-a80c-4c34-9130-ecbaa12a36b0" (UID: "90cbcdec-a80c-4c34-9130-ecbaa12a36b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.115548 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "90cbcdec-a80c-4c34-9130-ecbaa12a36b0" (UID: "90cbcdec-a80c-4c34-9130-ecbaa12a36b0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.125441 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "90cbcdec-a80c-4c34-9130-ecbaa12a36b0" (UID: "90cbcdec-a80c-4c34-9130-ecbaa12a36b0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.183824 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.183884 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.183906 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.183926 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bjrs\" (UniqueName: \"kubernetes.io/projected/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-kube-api-access-7bjrs\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.183943 4971 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.183960 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/90cbcdec-a80c-4c34-9130-ecbaa12a36b0-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.492569 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" 
event={"ID":"90cbcdec-a80c-4c34-9130-ecbaa12a36b0","Type":"ContainerDied","Data":"3052657b9bef3610db6686e09c19776a2fe56bfddc639f331682950de3f2b070"} Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.492910 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3052657b9bef3610db6686e09c19776a2fe56bfddc639f331682950de3f2b070" Mar 20 09:21:43 crc kubenswrapper[4971]: I0320 09:21:43.492765 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-kl245" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.078066 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4d6f4"] Mar 20 09:21:44 crc kubenswrapper[4971]: E0320 09:21:44.078516 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cbcdec-a80c-4c34-9130-ecbaa12a36b0" containerName="neutron-metadata-openstack-openstack-networker" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.078530 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cbcdec-a80c-4c34-9130-ecbaa12a36b0" containerName="neutron-metadata-openstack-openstack-networker" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.078757 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cbcdec-a80c-4c34-9130-ecbaa12a36b0" containerName="neutron-metadata-openstack-openstack-networker" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.080270 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.091021 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4d6f4"] Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.204786 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-utilities\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.205140 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47sm2\" (UniqueName: \"kubernetes.io/projected/d72edaa3-a57c-48ed-b393-d4ee5fd51094-kube-api-access-47sm2\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.205226 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-catalog-content\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.307662 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-utilities\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.308086 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-47sm2\" (UniqueName: \"kubernetes.io/projected/d72edaa3-a57c-48ed-b393-d4ee5fd51094-kube-api-access-47sm2\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.308211 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-catalog-content\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.308523 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-utilities\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.308678 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-catalog-content\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.333777 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47sm2\" (UniqueName: \"kubernetes.io/projected/d72edaa3-a57c-48ed-b393-d4ee5fd51094-kube-api-access-47sm2\") pod \"community-operators-4d6f4\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:44 crc kubenswrapper[4971]: I0320 09:21:44.404781 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:45 crc kubenswrapper[4971]: I0320 09:21:45.234453 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4d6f4"] Mar 20 09:21:45 crc kubenswrapper[4971]: I0320 09:21:45.513886 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4d6f4" event={"ID":"d72edaa3-a57c-48ed-b393-d4ee5fd51094","Type":"ContainerStarted","Data":"920560de76a02524c01a082a6fd7c3b26288ef7c66bfddbf4bc283edbfb99dfa"} Mar 20 09:21:46 crc kubenswrapper[4971]: I0320 09:21:46.526874 4971 generic.go:334] "Generic (PLEG): container finished" podID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerID="2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189" exitCode=0 Mar 20 09:21:46 crc kubenswrapper[4971]: I0320 09:21:46.526963 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4d6f4" event={"ID":"d72edaa3-a57c-48ed-b393-d4ee5fd51094","Type":"ContainerDied","Data":"2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189"} Mar 20 09:21:48 crc kubenswrapper[4971]: I0320 09:21:48.555179 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4d6f4" event={"ID":"d72edaa3-a57c-48ed-b393-d4ee5fd51094","Type":"ContainerStarted","Data":"4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967"} Mar 20 09:21:48 crc kubenswrapper[4971]: I0320 09:21:48.742546 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:21:48 crc kubenswrapper[4971]: E0320 09:21:48.742820 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:21:51 crc kubenswrapper[4971]: I0320 09:21:51.584250 4971 generic.go:334] "Generic (PLEG): container finished" podID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerID="4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967" exitCode=0 Mar 20 09:21:51 crc kubenswrapper[4971]: I0320 09:21:51.584331 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4d6f4" event={"ID":"d72edaa3-a57c-48ed-b393-d4ee5fd51094","Type":"ContainerDied","Data":"4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967"} Mar 20 09:21:52 crc kubenswrapper[4971]: I0320 09:21:52.595473 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4d6f4" event={"ID":"d72edaa3-a57c-48ed-b393-d4ee5fd51094","Type":"ContainerStarted","Data":"4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d"} Mar 20 09:21:52 crc kubenswrapper[4971]: I0320 09:21:52.621036 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4d6f4" podStartSLOduration=2.918185102 podStartE2EDuration="8.621020402s" podCreationTimestamp="2026-03-20 09:21:44 +0000 UTC" firstStartedPulling="2026-03-20 09:21:46.530206147 +0000 UTC m=+9128.510080285" lastFinishedPulling="2026-03-20 09:21:52.233041447 +0000 UTC m=+9134.212915585" observedRunningTime="2026-03-20 09:21:52.617240692 +0000 UTC m=+9134.597114830" watchObservedRunningTime="2026-03-20 09:21:52.621020402 +0000 UTC m=+9134.600894540" Mar 20 09:21:54 crc kubenswrapper[4971]: I0320 09:21:54.405370 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:54 crc kubenswrapper[4971]: I0320 
09:21:54.405739 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:54 crc kubenswrapper[4971]: I0320 09:21:54.459480 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:21:57 crc kubenswrapper[4971]: I0320 09:21:57.647140 4971 generic.go:334] "Generic (PLEG): container finished" podID="f74aa1f9-4acb-48cd-90a1-10ff27996a3a" containerID="0dd9be9d0a4cff752df24c0a9c1997b17959103271aa71b862fdbda3bd3ce694" exitCode=0 Mar 20 09:21:57 crc kubenswrapper[4971]: I0320 09:21:57.647207 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" event={"ID":"f74aa1f9-4acb-48cd-90a1-10ff27996a3a","Type":"ContainerDied","Data":"0dd9be9d0a4cff752df24c0a9c1997b17959103271aa71b862fdbda3bd3ce694"} Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.100936 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.220358 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ssh-key-openstack-cell1\") pod \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.220408 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-inventory\") pod \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.220438 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovn-combined-ca-bundle\") pod \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.220470 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovncontroller-config-0\") pod \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.220512 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ceph\") pod \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.220531 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-44fxd\" (UniqueName: \"kubernetes.io/projected/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-kube-api-access-44fxd\") pod \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\" (UID: \"f74aa1f9-4acb-48cd-90a1-10ff27996a3a\") " Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.226144 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ceph" (OuterVolumeSpecName: "ceph") pod "f74aa1f9-4acb-48cd-90a1-10ff27996a3a" (UID: "f74aa1f9-4acb-48cd-90a1-10ff27996a3a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.226188 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f74aa1f9-4acb-48cd-90a1-10ff27996a3a" (UID: "f74aa1f9-4acb-48cd-90a1-10ff27996a3a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.226249 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-kube-api-access-44fxd" (OuterVolumeSpecName: "kube-api-access-44fxd") pod "f74aa1f9-4acb-48cd-90a1-10ff27996a3a" (UID: "f74aa1f9-4acb-48cd-90a1-10ff27996a3a"). InnerVolumeSpecName "kube-api-access-44fxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.248160 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f74aa1f9-4acb-48cd-90a1-10ff27996a3a" (UID: "f74aa1f9-4acb-48cd-90a1-10ff27996a3a"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.250496 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-inventory" (OuterVolumeSpecName: "inventory") pod "f74aa1f9-4acb-48cd-90a1-10ff27996a3a" (UID: "f74aa1f9-4acb-48cd-90a1-10ff27996a3a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.269892 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f74aa1f9-4acb-48cd-90a1-10ff27996a3a" (UID: "f74aa1f9-4acb-48cd-90a1-10ff27996a3a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.323996 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.324052 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.324065 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.324081 4971 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.324092 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.324105 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44fxd\" (UniqueName: \"kubernetes.io/projected/f74aa1f9-4acb-48cd-90a1-10ff27996a3a-kube-api-access-44fxd\") on node \"crc\" DevicePath \"\"" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.665199 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" event={"ID":"f74aa1f9-4acb-48cd-90a1-10ff27996a3a","Type":"ContainerDied","Data":"7aee7852eafd620ccf2398588092db2d8caf903a944cdc28bbd55807b48981f0"} Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.665283 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aee7852eafd620ccf2398588092db2d8caf903a944cdc28bbd55807b48981f0" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.665290 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-bcpc6" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.765064 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-v4l72"] Mar 20 09:21:59 crc kubenswrapper[4971]: E0320 09:21:59.765592 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74aa1f9-4acb-48cd-90a1-10ff27996a3a" containerName="ovn-openstack-openstack-cell1" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.765630 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74aa1f9-4acb-48cd-90a1-10ff27996a3a" containerName="ovn-openstack-openstack-cell1" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.765839 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74aa1f9-4acb-48cd-90a1-10ff27996a3a" containerName="ovn-openstack-openstack-cell1" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.766658 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.768863 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.769829 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.769951 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.770218 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.770226 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.770578 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.803805 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-v4l72"] Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.835507 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.835569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wldw5\" (UniqueName: \"kubernetes.io/projected/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-kube-api-access-wldw5\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.835681 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.835758 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.835851 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.835943 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: 
\"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.836016 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.938172 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.938240 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.938337 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc 
kubenswrapper[4971]: I0320 09:21:59.938369 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wldw5\" (UniqueName: \"kubernetes.io/projected/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-kube-api-access-wldw5\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.938391 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.938426 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.938472 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.942873 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.942949 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.944242 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.945356 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.946079 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.946179 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:21:59 crc kubenswrapper[4971]: I0320 09:21:59.958703 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wldw5\" (UniqueName: \"kubernetes.io/projected/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-kube-api-access-wldw5\") pod \"neutron-metadata-openstack-openstack-cell1-v4l72\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.089629 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.187923 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566642-xmhfw"] Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.191403 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566642-xmhfw" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.210358 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.210644 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.210844 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.222401 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566642-xmhfw"] Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.346569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xng5j\" (UniqueName: \"kubernetes.io/projected/4d45f9ad-d551-4522-a036-a1a8b0af6a13-kube-api-access-xng5j\") pod \"auto-csr-approver-29566642-xmhfw\" (UID: \"4d45f9ad-d551-4522-a036-a1a8b0af6a13\") " pod="openshift-infra/auto-csr-approver-29566642-xmhfw" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.449209 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xng5j\" (UniqueName: \"kubernetes.io/projected/4d45f9ad-d551-4522-a036-a1a8b0af6a13-kube-api-access-xng5j\") pod \"auto-csr-approver-29566642-xmhfw\" (UID: \"4d45f9ad-d551-4522-a036-a1a8b0af6a13\") " pod="openshift-infra/auto-csr-approver-29566642-xmhfw" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.468049 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xng5j\" (UniqueName: \"kubernetes.io/projected/4d45f9ad-d551-4522-a036-a1a8b0af6a13-kube-api-access-xng5j\") pod \"auto-csr-approver-29566642-xmhfw\" (UID: \"4d45f9ad-d551-4522-a036-a1a8b0af6a13\") " 
pod="openshift-infra/auto-csr-approver-29566642-xmhfw" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.567284 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566642-xmhfw" Mar 20 09:22:00 crc kubenswrapper[4971]: I0320 09:22:00.691779 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-v4l72"] Mar 20 09:22:01 crc kubenswrapper[4971]: I0320 09:22:01.014538 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566642-xmhfw"] Mar 20 09:22:01 crc kubenswrapper[4971]: I0320 09:22:01.685301 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566642-xmhfw" event={"ID":"4d45f9ad-d551-4522-a036-a1a8b0af6a13","Type":"ContainerStarted","Data":"ea37b4e338c9e475abd799bbf1aab7f6192995602c288d943c261ba7bcabdcd6"} Mar 20 09:22:01 crc kubenswrapper[4971]: I0320 09:22:01.686809 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" event={"ID":"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565","Type":"ContainerStarted","Data":"f8cda968d782f10b689a786cb0de6343622e6bc79fd5c27c8a0c5918c810970f"} Mar 20 09:22:01 crc kubenswrapper[4971]: I0320 09:22:01.686850 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" event={"ID":"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565","Type":"ContainerStarted","Data":"d18bb039717f6e4c5ce626867ec89d5d17ea5ab2b5fb8c8dad99ed35a25a47e6"} Mar 20 09:22:02 crc kubenswrapper[4971]: I0320 09:22:02.703766 4971 generic.go:334] "Generic (PLEG): container finished" podID="4d45f9ad-d551-4522-a036-a1a8b0af6a13" containerID="4be388e9796f05d357e1635a4e1dc4c3bfc3acb868f4fd261c40ce7e6b63da65" exitCode=0 Mar 20 09:22:02 crc kubenswrapper[4971]: I0320 09:22:02.703840 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566642-xmhfw" event={"ID":"4d45f9ad-d551-4522-a036-a1a8b0af6a13","Type":"ContainerDied","Data":"4be388e9796f05d357e1635a4e1dc4c3bfc3acb868f4fd261c40ce7e6b63da65"} Mar 20 09:22:02 crc kubenswrapper[4971]: I0320 09:22:02.718150 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" podStartSLOduration=3.156626044 podStartE2EDuration="3.718132735s" podCreationTimestamp="2026-03-20 09:21:59 +0000 UTC" firstStartedPulling="2026-03-20 09:22:00.713653766 +0000 UTC m=+9142.693527904" lastFinishedPulling="2026-03-20 09:22:01.275160457 +0000 UTC m=+9143.255034595" observedRunningTime="2026-03-20 09:22:01.709937179 +0000 UTC m=+9143.689811327" watchObservedRunningTime="2026-03-20 09:22:02.718132735 +0000 UTC m=+9144.698006873" Mar 20 09:22:02 crc kubenswrapper[4971]: I0320 09:22:02.732730 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:22:02 crc kubenswrapper[4971]: E0320 09:22:02.732955 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.058942 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566642-xmhfw" Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.229347 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xng5j\" (UniqueName: \"kubernetes.io/projected/4d45f9ad-d551-4522-a036-a1a8b0af6a13-kube-api-access-xng5j\") pod \"4d45f9ad-d551-4522-a036-a1a8b0af6a13\" (UID: \"4d45f9ad-d551-4522-a036-a1a8b0af6a13\") " Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.235834 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d45f9ad-d551-4522-a036-a1a8b0af6a13-kube-api-access-xng5j" (OuterVolumeSpecName: "kube-api-access-xng5j") pod "4d45f9ad-d551-4522-a036-a1a8b0af6a13" (UID: "4d45f9ad-d551-4522-a036-a1a8b0af6a13"). InnerVolumeSpecName "kube-api-access-xng5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.332252 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xng5j\" (UniqueName: \"kubernetes.io/projected/4d45f9ad-d551-4522-a036-a1a8b0af6a13-kube-api-access-xng5j\") on node \"crc\" DevicePath \"\"" Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.455743 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.508227 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4d6f4"] Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.723374 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566642-xmhfw" event={"ID":"4d45f9ad-d551-4522-a036-a1a8b0af6a13","Type":"ContainerDied","Data":"ea37b4e338c9e475abd799bbf1aab7f6192995602c288d943c261ba7bcabdcd6"} Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.723403 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566642-xmhfw" Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.723427 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea37b4e338c9e475abd799bbf1aab7f6192995602c288d943c261ba7bcabdcd6" Mar 20 09:22:04 crc kubenswrapper[4971]: I0320 09:22:04.723496 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4d6f4" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerName="registry-server" containerID="cri-o://4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d" gracePeriod=2 Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.125883 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-g9l8t"] Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.137655 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-g9l8t"] Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.728778 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.737070 4971 generic.go:334] "Generic (PLEG): container finished" podID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerID="4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d" exitCode=0 Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.737107 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4d6f4" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.737108 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4d6f4" event={"ID":"d72edaa3-a57c-48ed-b393-d4ee5fd51094","Type":"ContainerDied","Data":"4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d"} Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.737232 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4d6f4" event={"ID":"d72edaa3-a57c-48ed-b393-d4ee5fd51094","Type":"ContainerDied","Data":"920560de76a02524c01a082a6fd7c3b26288ef7c66bfddbf4bc283edbfb99dfa"} Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.737267 4971 scope.go:117] "RemoveContainer" containerID="4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.779234 4971 scope.go:117] "RemoveContainer" containerID="4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.816884 4971 scope.go:117] "RemoveContainer" containerID="2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.855755 4971 scope.go:117] "RemoveContainer" containerID="4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d" Mar 20 09:22:05 crc kubenswrapper[4971]: E0320 09:22:05.856521 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d\": container with ID starting with 4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d not found: ID does not exist" containerID="4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.856585 4971 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d"} err="failed to get container status \"4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d\": rpc error: code = NotFound desc = could not find container \"4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d\": container with ID starting with 4554e59abae6568819933e1ffe4b74b54f5226459891119ecbaa0447767c258d not found: ID does not exist" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.856641 4971 scope.go:117] "RemoveContainer" containerID="4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967" Mar 20 09:22:05 crc kubenswrapper[4971]: E0320 09:22:05.857084 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967\": container with ID starting with 4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967 not found: ID does not exist" containerID="4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.857132 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967"} err="failed to get container status \"4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967\": rpc error: code = NotFound desc = could not find container \"4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967\": container with ID starting with 4367035aeaaf3e04268acaafd4d70ffe7e88e17fa11d8ccc23ac8a663d082967 not found: ID does not exist" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.857172 4971 scope.go:117] "RemoveContainer" containerID="2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189" Mar 20 09:22:05 crc kubenswrapper[4971]: E0320 09:22:05.857478 4971 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189\": container with ID starting with 2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189 not found: ID does not exist" containerID="2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.857507 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189"} err="failed to get container status \"2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189\": rpc error: code = NotFound desc = could not find container \"2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189\": container with ID starting with 2c42d9c6c8614938400422be30e01722f8547ec11c7b8532b428a08aa173d189 not found: ID does not exist" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.865386 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-catalog-content\") pod \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.865526 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-utilities\") pod \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\" (UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.865592 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47sm2\" (UniqueName: \"kubernetes.io/projected/d72edaa3-a57c-48ed-b393-d4ee5fd51094-kube-api-access-47sm2\") pod \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\" 
(UID: \"d72edaa3-a57c-48ed-b393-d4ee5fd51094\") " Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.867287 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-utilities" (OuterVolumeSpecName: "utilities") pod "d72edaa3-a57c-48ed-b393-d4ee5fd51094" (UID: "d72edaa3-a57c-48ed-b393-d4ee5fd51094"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.877821 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72edaa3-a57c-48ed-b393-d4ee5fd51094-kube-api-access-47sm2" (OuterVolumeSpecName: "kube-api-access-47sm2") pod "d72edaa3-a57c-48ed-b393-d4ee5fd51094" (UID: "d72edaa3-a57c-48ed-b393-d4ee5fd51094"). InnerVolumeSpecName "kube-api-access-47sm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.922471 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d72edaa3-a57c-48ed-b393-d4ee5fd51094" (UID: "d72edaa3-a57c-48ed-b393-d4ee5fd51094"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.968761 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.968810 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d72edaa3-a57c-48ed-b393-d4ee5fd51094-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:22:05 crc kubenswrapper[4971]: I0320 09:22:05.968825 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47sm2\" (UniqueName: \"kubernetes.io/projected/d72edaa3-a57c-48ed-b393-d4ee5fd51094-kube-api-access-47sm2\") on node \"crc\" DevicePath \"\"" Mar 20 09:22:06 crc kubenswrapper[4971]: I0320 09:22:06.070922 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4d6f4"] Mar 20 09:22:06 crc kubenswrapper[4971]: I0320 09:22:06.082128 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4d6f4"] Mar 20 09:22:06 crc kubenswrapper[4971]: I0320 09:22:06.765187 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f126602-4ade-4262-b6f8-66432bca8316" path="/var/lib/kubelet/pods/7f126602-4ade-4262-b6f8-66432bca8316/volumes" Mar 20 09:22:06 crc kubenswrapper[4971]: I0320 09:22:06.766306 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" path="/var/lib/kubelet/pods/d72edaa3-a57c-48ed-b393-d4ee5fd51094/volumes" Mar 20 09:22:13 crc kubenswrapper[4971]: I0320 09:22:13.732652 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:22:13 crc kubenswrapper[4971]: E0320 09:22:13.734436 4971 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:22:14 crc kubenswrapper[4971]: I0320 09:22:14.113080 4971 scope.go:117] "RemoveContainer" containerID="7a61712b25ae4d3ab4e74825cf2ce20f492c921590cb44127d530b2feabaf6ca" Mar 20 09:22:27 crc kubenswrapper[4971]: I0320 09:22:27.732405 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:22:27 crc kubenswrapper[4971]: E0320 09:22:27.733166 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:22:41 crc kubenswrapper[4971]: I0320 09:22:41.733170 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:22:41 crc kubenswrapper[4971]: E0320 09:22:41.734248 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:22:55 crc kubenswrapper[4971]: I0320 09:22:55.731897 4971 scope.go:117] 
"RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:22:55 crc kubenswrapper[4971]: E0320 09:22:55.732708 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:22:59 crc kubenswrapper[4971]: I0320 09:22:59.239792 4971 generic.go:334] "Generic (PLEG): container finished" podID="921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" containerID="f8cda968d782f10b689a786cb0de6343622e6bc79fd5c27c8a0c5918c810970f" exitCode=0 Mar 20 09:22:59 crc kubenswrapper[4971]: I0320 09:22:59.239878 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" event={"ID":"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565","Type":"ContainerDied","Data":"f8cda968d782f10b689a786cb0de6343622e6bc79fd5c27c8a0c5918c810970f"} Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.743527 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.872659 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ceph\") pod \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.872826 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-inventory\") pod \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.872862 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-metadata-combined-ca-bundle\") pod \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.872903 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-nova-metadata-neutron-config-0\") pod \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.872952 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wldw5\" (UniqueName: \"kubernetes.io/projected/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-kube-api-access-wldw5\") pod \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.872974 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ssh-key-openstack-cell1\") pod \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.873049 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-ovn-metadata-agent-neutron-config-0\") pod \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\" (UID: \"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565\") " Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.879244 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ceph" (OuterVolumeSpecName: "ceph") pod "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" (UID: "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.880369 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-kube-api-access-wldw5" (OuterVolumeSpecName: "kube-api-access-wldw5") pod "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" (UID: "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565"). InnerVolumeSpecName "kube-api-access-wldw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.881647 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" (UID: "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.901843 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" (UID: "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.901945 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-inventory" (OuterVolumeSpecName: "inventory") pod "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" (UID: "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.909976 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" (UID: "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.911847 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" (UID: "921173cb-9fd9-4b2a-bb4c-ea22aaa0f565"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.975797 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.976137 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.976237 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.976338 4971 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.976416 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wldw5\" (UniqueName: \"kubernetes.io/projected/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-kube-api-access-wldw5\") on node \"crc\" DevicePath \"\"" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.976484 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:23:00 crc kubenswrapper[4971]: I0320 09:23:00.976545 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/921173cb-9fd9-4b2a-bb4c-ea22aaa0f565-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.263535 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" event={"ID":"921173cb-9fd9-4b2a-bb4c-ea22aaa0f565","Type":"ContainerDied","Data":"d18bb039717f6e4c5ce626867ec89d5d17ea5ab2b5fb8c8dad99ed35a25a47e6"} Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.264100 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d18bb039717f6e4c5ce626867ec89d5d17ea5ab2b5fb8c8dad99ed35a25a47e6" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.263728 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-v4l72" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.361465 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-98nwv"] Mar 20 09:23:01 crc kubenswrapper[4971]: E0320 09:23:01.361928 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerName="extract-content" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.361946 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerName="extract-content" Mar 20 09:23:01 crc kubenswrapper[4971]: E0320 09:23:01.361956 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" containerName="neutron-metadata-openstack-openstack-cell1" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.361965 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" containerName="neutron-metadata-openstack-openstack-cell1" Mar 20 09:23:01 crc kubenswrapper[4971]: E0320 09:23:01.361989 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerName="registry-server" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.361996 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerName="registry-server" Mar 20 09:23:01 crc kubenswrapper[4971]: E0320 09:23:01.362016 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d45f9ad-d551-4522-a036-a1a8b0af6a13" containerName="oc" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.362023 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d45f9ad-d551-4522-a036-a1a8b0af6a13" containerName="oc" Mar 20 09:23:01 crc kubenswrapper[4971]: E0320 09:23:01.362033 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerName="extract-utilities" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.362039 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerName="extract-utilities" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.362234 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72edaa3-a57c-48ed-b393-d4ee5fd51094" containerName="registry-server" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.362248 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="921173cb-9fd9-4b2a-bb4c-ea22aaa0f565" containerName="neutron-metadata-openstack-openstack-cell1" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.362266 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d45f9ad-d551-4522-a036-a1a8b0af6a13" containerName="oc" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.363105 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.367958 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.368005 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.368473 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.370267 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.372359 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.381730 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-98nwv"] Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.490679 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.490760 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ceph\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc 
kubenswrapper[4971]: I0320 09:23:01.490969 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-inventory\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.491082 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.491120 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw6rv\" (UniqueName: \"kubernetes.io/projected/30ac44cf-4f9c-43b0-974e-8e86f25451e8-kube-api-access-xw6rv\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.491237 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.593047 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-inventory\") pod 
\"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.593159 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.593190 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw6rv\" (UniqueName: \"kubernetes.io/projected/30ac44cf-4f9c-43b0-974e-8e86f25451e8-kube-api-access-xw6rv\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.593238 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.593259 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.593288 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ceph\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.597786 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.598183 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.605372 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ceph\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.607246 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-inventory\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.607518 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.612061 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw6rv\" (UniqueName: \"kubernetes.io/projected/30ac44cf-4f9c-43b0-974e-8e86f25451e8-kube-api-access-xw6rv\") pod \"libvirt-openstack-openstack-cell1-98nwv\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:01 crc kubenswrapper[4971]: I0320 09:23:01.682515 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:23:02 crc kubenswrapper[4971]: I0320 09:23:02.312912 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-98nwv"] Mar 20 09:23:03 crc kubenswrapper[4971]: I0320 09:23:03.287041 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" event={"ID":"30ac44cf-4f9c-43b0-974e-8e86f25451e8","Type":"ContainerStarted","Data":"b69809624155b4749636d40903f548335baa04fe6e17c3fe2da535e4e6366013"} Mar 20 09:23:03 crc kubenswrapper[4971]: I0320 09:23:03.288027 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" event={"ID":"30ac44cf-4f9c-43b0-974e-8e86f25451e8","Type":"ContainerStarted","Data":"5f72f1122dac70437472ec30f3dc581bc7e8232afdcf5a7bfce5703bc1576d0e"} Mar 20 09:23:03 crc kubenswrapper[4971]: I0320 09:23:03.318274 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" podStartSLOduration=1.762955287 podStartE2EDuration="2.318244654s" podCreationTimestamp="2026-03-20 09:23:01 
+0000 UTC" firstStartedPulling="2026-03-20 09:23:02.322401202 +0000 UTC m=+9204.302275340" lastFinishedPulling="2026-03-20 09:23:02.877690569 +0000 UTC m=+9204.857564707" observedRunningTime="2026-03-20 09:23:03.304271714 +0000 UTC m=+9205.284145852" watchObservedRunningTime="2026-03-20 09:23:03.318244654 +0000 UTC m=+9205.298118792" Mar 20 09:23:06 crc kubenswrapper[4971]: I0320 09:23:06.732876 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:23:06 crc kubenswrapper[4971]: E0320 09:23:06.733799 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:23:18 crc kubenswrapper[4971]: I0320 09:23:18.741681 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:23:18 crc kubenswrapper[4971]: E0320 09:23:18.742305 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:23:33 crc kubenswrapper[4971]: I0320 09:23:33.733078 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:23:33 crc kubenswrapper[4971]: E0320 09:23:33.733853 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:23:45 crc kubenswrapper[4971]: I0320 09:23:45.732790 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:23:45 crc kubenswrapper[4971]: E0320 09:23:45.733624 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:23:56 crc kubenswrapper[4971]: I0320 09:23:56.732456 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:23:56 crc kubenswrapper[4971]: E0320 09:23:56.733339 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.155686 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566644-jxm5c"] Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.157629 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566644-jxm5c" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.161978 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.162178 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.162673 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.171729 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566644-jxm5c"] Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.310934 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l567q\" (UniqueName: \"kubernetes.io/projected/a2946ac1-a6a9-4802-8d7b-82daf2fbe796-kube-api-access-l567q\") pod \"auto-csr-approver-29566644-jxm5c\" (UID: \"a2946ac1-a6a9-4802-8d7b-82daf2fbe796\") " pod="openshift-infra/auto-csr-approver-29566644-jxm5c" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.412955 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l567q\" (UniqueName: \"kubernetes.io/projected/a2946ac1-a6a9-4802-8d7b-82daf2fbe796-kube-api-access-l567q\") pod \"auto-csr-approver-29566644-jxm5c\" (UID: \"a2946ac1-a6a9-4802-8d7b-82daf2fbe796\") " pod="openshift-infra/auto-csr-approver-29566644-jxm5c" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.433464 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l567q\" (UniqueName: \"kubernetes.io/projected/a2946ac1-a6a9-4802-8d7b-82daf2fbe796-kube-api-access-l567q\") pod \"auto-csr-approver-29566644-jxm5c\" (UID: \"a2946ac1-a6a9-4802-8d7b-82daf2fbe796\") " 
pod="openshift-infra/auto-csr-approver-29566644-jxm5c" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.486959 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566644-jxm5c" Mar 20 09:24:00 crc kubenswrapper[4971]: I0320 09:24:00.930026 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566644-jxm5c"] Mar 20 09:24:01 crc kubenswrapper[4971]: I0320 09:24:01.860968 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566644-jxm5c" event={"ID":"a2946ac1-a6a9-4802-8d7b-82daf2fbe796","Type":"ContainerStarted","Data":"61f76e91ffada039f4436f423a95537f25f5785b7467ee071e2a90e20027cf4c"} Mar 20 09:24:02 crc kubenswrapper[4971]: I0320 09:24:02.870818 4971 generic.go:334] "Generic (PLEG): container finished" podID="a2946ac1-a6a9-4802-8d7b-82daf2fbe796" containerID="de9aa705d178bd16dd1d8d5a518b4f76841495628afcb5bd887ec24fa0da6055" exitCode=0 Mar 20 09:24:02 crc kubenswrapper[4971]: I0320 09:24:02.870882 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566644-jxm5c" event={"ID":"a2946ac1-a6a9-4802-8d7b-82daf2fbe796","Type":"ContainerDied","Data":"de9aa705d178bd16dd1d8d5a518b4f76841495628afcb5bd887ec24fa0da6055"} Mar 20 09:24:04 crc kubenswrapper[4971]: I0320 09:24:04.260798 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566644-jxm5c" Mar 20 09:24:04 crc kubenswrapper[4971]: I0320 09:24:04.396202 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l567q\" (UniqueName: \"kubernetes.io/projected/a2946ac1-a6a9-4802-8d7b-82daf2fbe796-kube-api-access-l567q\") pod \"a2946ac1-a6a9-4802-8d7b-82daf2fbe796\" (UID: \"a2946ac1-a6a9-4802-8d7b-82daf2fbe796\") " Mar 20 09:24:04 crc kubenswrapper[4971]: I0320 09:24:04.402679 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2946ac1-a6a9-4802-8d7b-82daf2fbe796-kube-api-access-l567q" (OuterVolumeSpecName: "kube-api-access-l567q") pod "a2946ac1-a6a9-4802-8d7b-82daf2fbe796" (UID: "a2946ac1-a6a9-4802-8d7b-82daf2fbe796"). InnerVolumeSpecName "kube-api-access-l567q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:24:04 crc kubenswrapper[4971]: I0320 09:24:04.499420 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l567q\" (UniqueName: \"kubernetes.io/projected/a2946ac1-a6a9-4802-8d7b-82daf2fbe796-kube-api-access-l567q\") on node \"crc\" DevicePath \"\"" Mar 20 09:24:04 crc kubenswrapper[4971]: I0320 09:24:04.900964 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566644-jxm5c" event={"ID":"a2946ac1-a6a9-4802-8d7b-82daf2fbe796","Type":"ContainerDied","Data":"61f76e91ffada039f4436f423a95537f25f5785b7467ee071e2a90e20027cf4c"} Mar 20 09:24:04 crc kubenswrapper[4971]: I0320 09:24:04.901098 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61f76e91ffada039f4436f423a95537f25f5785b7467ee071e2a90e20027cf4c" Mar 20 09:24:04 crc kubenswrapper[4971]: I0320 09:24:04.901051 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566644-jxm5c" Mar 20 09:24:05 crc kubenswrapper[4971]: I0320 09:24:05.334569 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566638-l97wp"] Mar 20 09:24:05 crc kubenswrapper[4971]: I0320 09:24:05.346043 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566638-l97wp"] Mar 20 09:24:06 crc kubenswrapper[4971]: I0320 09:24:06.750774 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed292ff4-8486-4848-b9f2-0b868efba1aa" path="/var/lib/kubelet/pods/ed292ff4-8486-4848-b9f2-0b868efba1aa/volumes" Mar 20 09:24:08 crc kubenswrapper[4971]: I0320 09:24:08.742842 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:24:08 crc kubenswrapper[4971]: E0320 09:24:08.743433 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:24:14 crc kubenswrapper[4971]: I0320 09:24:14.235968 4971 scope.go:117] "RemoveContainer" containerID="fbd61022ca77f0079c6dbe2ec1205c8969b2da0c37553c8c697178afb543789f" Mar 20 09:24:20 crc kubenswrapper[4971]: I0320 09:24:20.732222 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:24:20 crc kubenswrapper[4971]: E0320 09:24:20.733300 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:24:33 crc kubenswrapper[4971]: I0320 09:24:33.732559 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:24:33 crc kubenswrapper[4971]: E0320 09:24:33.733514 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:24:44 crc kubenswrapper[4971]: I0320 09:24:44.733378 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:24:44 crc kubenswrapper[4971]: E0320 09:24:44.734463 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:24:58 crc kubenswrapper[4971]: I0320 09:24:58.740816 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:24:58 crc kubenswrapper[4971]: E0320 09:24:58.741694 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:25:13 crc kubenswrapper[4971]: I0320 09:25:13.732973 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:25:13 crc kubenswrapper[4971]: E0320 09:25:13.733957 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:25:24 crc kubenswrapper[4971]: I0320 09:25:24.733062 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:25:24 crc kubenswrapper[4971]: E0320 09:25:24.733836 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.077142 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dsqr4"] Mar 20 09:25:27 crc kubenswrapper[4971]: E0320 09:25:27.078009 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2946ac1-a6a9-4802-8d7b-82daf2fbe796" containerName="oc" Mar 20 09:25:27 crc kubenswrapper[4971]: 
I0320 09:25:27.078027 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2946ac1-a6a9-4802-8d7b-82daf2fbe796" containerName="oc" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.078294 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2946ac1-a6a9-4802-8d7b-82daf2fbe796" containerName="oc" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.080181 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.094492 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dsqr4"] Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.230085 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbrd\" (UniqueName: \"kubernetes.io/projected/d8b8b7ba-c779-4967-86c7-ffe7beac6686-kube-api-access-4dbrd\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.230364 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-catalog-content\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.230538 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-utilities\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 
09:25:27.332148 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-utilities\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.332576 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbrd\" (UniqueName: \"kubernetes.io/projected/d8b8b7ba-c779-4967-86c7-ffe7beac6686-kube-api-access-4dbrd\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.332690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-catalog-content\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.332699 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-utilities\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.333168 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-catalog-content\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.353347 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbrd\" (UniqueName: \"kubernetes.io/projected/d8b8b7ba-c779-4967-86c7-ffe7beac6686-kube-api-access-4dbrd\") pod \"certified-operators-dsqr4\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.415310 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:27 crc kubenswrapper[4971]: I0320 09:25:27.997399 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dsqr4"] Mar 20 09:25:28 crc kubenswrapper[4971]: I0320 09:25:28.759185 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsqr4" event={"ID":"d8b8b7ba-c779-4967-86c7-ffe7beac6686","Type":"ContainerStarted","Data":"05ae6bd65723ceb95ebaeb9512e0aa878b57f08cf18fdb0ec4309d5a28d24ecc"} Mar 20 09:25:28 crc kubenswrapper[4971]: I0320 09:25:28.759475 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsqr4" event={"ID":"d8b8b7ba-c779-4967-86c7-ffe7beac6686","Type":"ContainerStarted","Data":"a0eeb002c45388f12fbbc47f7e6e0bc86818c4143a433318ca1c84963e3c89ea"} Mar 20 09:25:29 crc kubenswrapper[4971]: I0320 09:25:29.773051 4971 generic.go:334] "Generic (PLEG): container finished" podID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerID="05ae6bd65723ceb95ebaeb9512e0aa878b57f08cf18fdb0ec4309d5a28d24ecc" exitCode=0 Mar 20 09:25:29 crc kubenswrapper[4971]: I0320 09:25:29.776008 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:25:29 crc kubenswrapper[4971]: I0320 09:25:29.777758 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsqr4" 
event={"ID":"d8b8b7ba-c779-4967-86c7-ffe7beac6686","Type":"ContainerDied","Data":"05ae6bd65723ceb95ebaeb9512e0aa878b57f08cf18fdb0ec4309d5a28d24ecc"} Mar 20 09:25:31 crc kubenswrapper[4971]: I0320 09:25:31.797834 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsqr4" event={"ID":"d8b8b7ba-c779-4967-86c7-ffe7beac6686","Type":"ContainerStarted","Data":"9b5dc92b5638acde8c97e6d67999ca9a8e6d1971afee5cf0872fe35815f9e473"} Mar 20 09:25:33 crc kubenswrapper[4971]: I0320 09:25:33.821681 4971 generic.go:334] "Generic (PLEG): container finished" podID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerID="9b5dc92b5638acde8c97e6d67999ca9a8e6d1971afee5cf0872fe35815f9e473" exitCode=0 Mar 20 09:25:33 crc kubenswrapper[4971]: I0320 09:25:33.821738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsqr4" event={"ID":"d8b8b7ba-c779-4967-86c7-ffe7beac6686","Type":"ContainerDied","Data":"9b5dc92b5638acde8c97e6d67999ca9a8e6d1971afee5cf0872fe35815f9e473"} Mar 20 09:25:34 crc kubenswrapper[4971]: I0320 09:25:34.832136 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsqr4" event={"ID":"d8b8b7ba-c779-4967-86c7-ffe7beac6686","Type":"ContainerStarted","Data":"4147ed1a49a06938925748dc7862a8e435f401994f451d7ee7600db1e58af8fc"} Mar 20 09:25:34 crc kubenswrapper[4971]: I0320 09:25:34.856983 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dsqr4" podStartSLOduration=3.356334833 podStartE2EDuration="7.856965226s" podCreationTimestamp="2026-03-20 09:25:27 +0000 UTC" firstStartedPulling="2026-03-20 09:25:29.775787738 +0000 UTC m=+9351.755661876" lastFinishedPulling="2026-03-20 09:25:34.276418121 +0000 UTC m=+9356.256292269" observedRunningTime="2026-03-20 09:25:34.850782042 +0000 UTC m=+9356.830656190" watchObservedRunningTime="2026-03-20 09:25:34.856965226 +0000 UTC 
m=+9356.836839364" Mar 20 09:25:37 crc kubenswrapper[4971]: I0320 09:25:37.416158 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:37 crc kubenswrapper[4971]: I0320 09:25:37.416537 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:37 crc kubenswrapper[4971]: I0320 09:25:37.740083 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:40 crc kubenswrapper[4971]: I0320 09:25:40.732463 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:25:40 crc kubenswrapper[4971]: E0320 09:25:40.733281 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:25:47 crc kubenswrapper[4971]: I0320 09:25:47.467918 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:47 crc kubenswrapper[4971]: I0320 09:25:47.523183 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dsqr4"] Mar 20 09:25:47 crc kubenswrapper[4971]: I0320 09:25:47.982831 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dsqr4" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerName="registry-server" containerID="cri-o://4147ed1a49a06938925748dc7862a8e435f401994f451d7ee7600db1e58af8fc" gracePeriod=2 Mar 
20 09:25:48 crc kubenswrapper[4971]: I0320 09:25:48.993222 4971 generic.go:334] "Generic (PLEG): container finished" podID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerID="4147ed1a49a06938925748dc7862a8e435f401994f451d7ee7600db1e58af8fc" exitCode=0 Mar 20 09:25:48 crc kubenswrapper[4971]: I0320 09:25:48.993290 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsqr4" event={"ID":"d8b8b7ba-c779-4967-86c7-ffe7beac6686","Type":"ContainerDied","Data":"4147ed1a49a06938925748dc7862a8e435f401994f451d7ee7600db1e58af8fc"} Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.303922 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.471860 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-utilities\") pod \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.472160 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-catalog-content\") pod \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.472233 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dbrd\" (UniqueName: \"kubernetes.io/projected/d8b8b7ba-c779-4967-86c7-ffe7beac6686-kube-api-access-4dbrd\") pod \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\" (UID: \"d8b8b7ba-c779-4967-86c7-ffe7beac6686\") " Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.472563 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-utilities" (OuterVolumeSpecName: "utilities") pod "d8b8b7ba-c779-4967-86c7-ffe7beac6686" (UID: "d8b8b7ba-c779-4967-86c7-ffe7beac6686"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.473240 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.491824 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b8b7ba-c779-4967-86c7-ffe7beac6686-kube-api-access-4dbrd" (OuterVolumeSpecName: "kube-api-access-4dbrd") pod "d8b8b7ba-c779-4967-86c7-ffe7beac6686" (UID: "d8b8b7ba-c779-4967-86c7-ffe7beac6686"). InnerVolumeSpecName "kube-api-access-4dbrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.519063 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8b8b7ba-c779-4967-86c7-ffe7beac6686" (UID: "d8b8b7ba-c779-4967-86c7-ffe7beac6686"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.574875 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dbrd\" (UniqueName: \"kubernetes.io/projected/d8b8b7ba-c779-4967-86c7-ffe7beac6686-kube-api-access-4dbrd\") on node \"crc\" DevicePath \"\"" Mar 20 09:25:49 crc kubenswrapper[4971]: I0320 09:25:49.574921 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8b8b7ba-c779-4967-86c7-ffe7beac6686-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:25:50 crc kubenswrapper[4971]: I0320 09:25:50.005945 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsqr4" event={"ID":"d8b8b7ba-c779-4967-86c7-ffe7beac6686","Type":"ContainerDied","Data":"a0eeb002c45388f12fbbc47f7e6e0bc86818c4143a433318ca1c84963e3c89ea"} Mar 20 09:25:50 crc kubenswrapper[4971]: I0320 09:25:50.006001 4971 scope.go:117] "RemoveContainer" containerID="4147ed1a49a06938925748dc7862a8e435f401994f451d7ee7600db1e58af8fc" Mar 20 09:25:50 crc kubenswrapper[4971]: I0320 09:25:50.006007 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dsqr4" Mar 20 09:25:50 crc kubenswrapper[4971]: I0320 09:25:50.034354 4971 scope.go:117] "RemoveContainer" containerID="9b5dc92b5638acde8c97e6d67999ca9a8e6d1971afee5cf0872fe35815f9e473" Mar 20 09:25:50 crc kubenswrapper[4971]: I0320 09:25:50.044509 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dsqr4"] Mar 20 09:25:50 crc kubenswrapper[4971]: I0320 09:25:50.055601 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dsqr4"] Mar 20 09:25:50 crc kubenswrapper[4971]: I0320 09:25:50.217222 4971 scope.go:117] "RemoveContainer" containerID="05ae6bd65723ceb95ebaeb9512e0aa878b57f08cf18fdb0ec4309d5a28d24ecc" Mar 20 09:25:50 crc kubenswrapper[4971]: I0320 09:25:50.744918 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" path="/var/lib/kubelet/pods/d8b8b7ba-c779-4967-86c7-ffe7beac6686/volumes" Mar 20 09:25:52 crc kubenswrapper[4971]: I0320 09:25:52.732284 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:25:52 crc kubenswrapper[4971]: E0320 09:25:52.732884 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.147917 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566646-9dj5r"] Mar 20 09:26:00 crc kubenswrapper[4971]: E0320 09:26:00.175451 4971 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerName="registry-server" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.175508 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerName="registry-server" Mar 20 09:26:00 crc kubenswrapper[4971]: E0320 09:26:00.175537 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerName="extract-content" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.175546 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerName="extract-content" Mar 20 09:26:00 crc kubenswrapper[4971]: E0320 09:26:00.175568 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerName="extract-utilities" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.175576 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerName="extract-utilities" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.178543 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b8b7ba-c779-4967-86c7-ffe7beac6686" containerName="registry-server" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.179907 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566646-9dj5r"] Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.180027 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566646-9dj5r" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.182682 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.185740 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.187221 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trzbg\" (UniqueName: \"kubernetes.io/projected/6b0a8320-36ab-483a-8a0b-f6b875a46c33-kube-api-access-trzbg\") pod \"auto-csr-approver-29566646-9dj5r\" (UID: \"6b0a8320-36ab-483a-8a0b-f6b875a46c33\") " pod="openshift-infra/auto-csr-approver-29566646-9dj5r" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.185941 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.289182 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trzbg\" (UniqueName: \"kubernetes.io/projected/6b0a8320-36ab-483a-8a0b-f6b875a46c33-kube-api-access-trzbg\") pod \"auto-csr-approver-29566646-9dj5r\" (UID: \"6b0a8320-36ab-483a-8a0b-f6b875a46c33\") " pod="openshift-infra/auto-csr-approver-29566646-9dj5r" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.599848 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trzbg\" (UniqueName: \"kubernetes.io/projected/6b0a8320-36ab-483a-8a0b-f6b875a46c33-kube-api-access-trzbg\") pod \"auto-csr-approver-29566646-9dj5r\" (UID: \"6b0a8320-36ab-483a-8a0b-f6b875a46c33\") " pod="openshift-infra/auto-csr-approver-29566646-9dj5r" Mar 20 09:26:00 crc kubenswrapper[4971]: I0320 09:26:00.804094 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566646-9dj5r" Mar 20 09:26:01 crc kubenswrapper[4971]: I0320 09:26:01.288843 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566646-9dj5r"] Mar 20 09:26:01 crc kubenswrapper[4971]: W0320 09:26:01.294542 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b0a8320_36ab_483a_8a0b_f6b875a46c33.slice/crio-c8196e9bd95f134a2fd9449ebdf808c7fb1c3d11f2bbeb8e954e5e34a6caadd8 WatchSource:0}: Error finding container c8196e9bd95f134a2fd9449ebdf808c7fb1c3d11f2bbeb8e954e5e34a6caadd8: Status 404 returned error can't find the container with id c8196e9bd95f134a2fd9449ebdf808c7fb1c3d11f2bbeb8e954e5e34a6caadd8 Mar 20 09:26:02 crc kubenswrapper[4971]: I0320 09:26:02.124445 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566646-9dj5r" event={"ID":"6b0a8320-36ab-483a-8a0b-f6b875a46c33","Type":"ContainerStarted","Data":"c8196e9bd95f134a2fd9449ebdf808c7fb1c3d11f2bbeb8e954e5e34a6caadd8"} Mar 20 09:26:03 crc kubenswrapper[4971]: I0320 09:26:03.136146 4971 generic.go:334] "Generic (PLEG): container finished" podID="6b0a8320-36ab-483a-8a0b-f6b875a46c33" containerID="d4b01ab11970c7840ed5c669de66d21e8fe5303fa67e50de75f76c0f9baa9704" exitCode=0 Mar 20 09:26:03 crc kubenswrapper[4971]: I0320 09:26:03.136201 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566646-9dj5r" event={"ID":"6b0a8320-36ab-483a-8a0b-f6b875a46c33","Type":"ContainerDied","Data":"d4b01ab11970c7840ed5c669de66d21e8fe5303fa67e50de75f76c0f9baa9704"} Mar 20 09:26:04 crc kubenswrapper[4971]: I0320 09:26:04.516092 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566646-9dj5r" Mar 20 09:26:04 crc kubenswrapper[4971]: I0320 09:26:04.588885 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trzbg\" (UniqueName: \"kubernetes.io/projected/6b0a8320-36ab-483a-8a0b-f6b875a46c33-kube-api-access-trzbg\") pod \"6b0a8320-36ab-483a-8a0b-f6b875a46c33\" (UID: \"6b0a8320-36ab-483a-8a0b-f6b875a46c33\") " Mar 20 09:26:04 crc kubenswrapper[4971]: I0320 09:26:04.596168 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b0a8320-36ab-483a-8a0b-f6b875a46c33-kube-api-access-trzbg" (OuterVolumeSpecName: "kube-api-access-trzbg") pod "6b0a8320-36ab-483a-8a0b-f6b875a46c33" (UID: "6b0a8320-36ab-483a-8a0b-f6b875a46c33"). InnerVolumeSpecName "kube-api-access-trzbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:26:04 crc kubenswrapper[4971]: I0320 09:26:04.691208 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trzbg\" (UniqueName: \"kubernetes.io/projected/6b0a8320-36ab-483a-8a0b-f6b875a46c33-kube-api-access-trzbg\") on node \"crc\" DevicePath \"\"" Mar 20 09:26:05 crc kubenswrapper[4971]: I0320 09:26:05.154361 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566646-9dj5r" event={"ID":"6b0a8320-36ab-483a-8a0b-f6b875a46c33","Type":"ContainerDied","Data":"c8196e9bd95f134a2fd9449ebdf808c7fb1c3d11f2bbeb8e954e5e34a6caadd8"} Mar 20 09:26:05 crc kubenswrapper[4971]: I0320 09:26:05.154440 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566646-9dj5r" Mar 20 09:26:05 crc kubenswrapper[4971]: I0320 09:26:05.155095 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8196e9bd95f134a2fd9449ebdf808c7fb1c3d11f2bbeb8e954e5e34a6caadd8" Mar 20 09:26:05 crc kubenswrapper[4971]: I0320 09:26:05.595518 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566640-7xbnt"] Mar 20 09:26:05 crc kubenswrapper[4971]: I0320 09:26:05.605302 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566640-7xbnt"] Mar 20 09:26:06 crc kubenswrapper[4971]: I0320 09:26:06.732565 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:26:06 crc kubenswrapper[4971]: E0320 09:26:06.732898 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:26:06 crc kubenswrapper[4971]: I0320 09:26:06.743377 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fead403-b6dd-4df7-88d5-c035ab2fa51f" path="/var/lib/kubelet/pods/7fead403-b6dd-4df7-88d5-c035ab2fa51f/volumes" Mar 20 09:26:14 crc kubenswrapper[4971]: I0320 09:26:14.374001 4971 scope.go:117] "RemoveContainer" containerID="22203c0ef21a989cb4486c452c5eeffeb56ee6b8a2fce2d4175160d2cd3c1f38" Mar 20 09:26:20 crc kubenswrapper[4971]: I0320 09:26:20.733085 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:26:21 crc kubenswrapper[4971]: I0320 09:26:21.316837 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"1cd20bc54dc57a6ce36976ac71cd8d2b4d0b0a4c0ce164f579ec4e73e0a60f78"} Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.157800 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566648-vqnnf"] Mar 20 09:28:00 crc kubenswrapper[4971]: E0320 09:28:00.159469 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b0a8320-36ab-483a-8a0b-f6b875a46c33" containerName="oc" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.159575 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0a8320-36ab-483a-8a0b-f6b875a46c33" containerName="oc" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.159992 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b0a8320-36ab-483a-8a0b-f6b875a46c33" containerName="oc" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.161955 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566648-vqnnf" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.164943 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.165156 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.165330 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.215785 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566648-vqnnf"] Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.237052 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2z8\" (UniqueName: \"kubernetes.io/projected/2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44-kube-api-access-6m2z8\") pod \"auto-csr-approver-29566648-vqnnf\" (UID: \"2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44\") " pod="openshift-infra/auto-csr-approver-29566648-vqnnf" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.338970 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2z8\" (UniqueName: \"kubernetes.io/projected/2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44-kube-api-access-6m2z8\") pod \"auto-csr-approver-29566648-vqnnf\" (UID: \"2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44\") " pod="openshift-infra/auto-csr-approver-29566648-vqnnf" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.358297 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2z8\" (UniqueName: \"kubernetes.io/projected/2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44-kube-api-access-6m2z8\") pod \"auto-csr-approver-29566648-vqnnf\" (UID: \"2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44\") " 
pod="openshift-infra/auto-csr-approver-29566648-vqnnf" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.501209 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566648-vqnnf" Mar 20 09:28:00 crc kubenswrapper[4971]: I0320 09:28:00.956234 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566648-vqnnf"] Mar 20 09:28:01 crc kubenswrapper[4971]: I0320 09:28:01.358438 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566648-vqnnf" event={"ID":"2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44","Type":"ContainerStarted","Data":"60b795ba872bfbde89adf322f3647ab8f36a6776c31393194d0efaf0bd9f1a4e"} Mar 20 09:28:02 crc kubenswrapper[4971]: I0320 09:28:02.368848 4971 generic.go:334] "Generic (PLEG): container finished" podID="2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44" containerID="51fa867ad5bf53c3061d01a4f9be0ac8b7230ba30b8b9b9c9910d89f2a285d0f" exitCode=0 Mar 20 09:28:02 crc kubenswrapper[4971]: I0320 09:28:02.368981 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566648-vqnnf" event={"ID":"2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44","Type":"ContainerDied","Data":"51fa867ad5bf53c3061d01a4f9be0ac8b7230ba30b8b9b9c9910d89f2a285d0f"} Mar 20 09:28:03 crc kubenswrapper[4971]: I0320 09:28:03.746799 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566648-vqnnf" Mar 20 09:28:03 crc kubenswrapper[4971]: I0320 09:28:03.810729 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m2z8\" (UniqueName: \"kubernetes.io/projected/2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44-kube-api-access-6m2z8\") pod \"2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44\" (UID: \"2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44\") " Mar 20 09:28:03 crc kubenswrapper[4971]: I0320 09:28:03.818966 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44-kube-api-access-6m2z8" (OuterVolumeSpecName: "kube-api-access-6m2z8") pod "2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44" (UID: "2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44"). InnerVolumeSpecName "kube-api-access-6m2z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:28:03 crc kubenswrapper[4971]: I0320 09:28:03.914840 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m2z8\" (UniqueName: \"kubernetes.io/projected/2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44-kube-api-access-6m2z8\") on node \"crc\" DevicePath \"\"" Mar 20 09:28:04 crc kubenswrapper[4971]: I0320 09:28:04.403480 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566648-vqnnf" event={"ID":"2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44","Type":"ContainerDied","Data":"60b795ba872bfbde89adf322f3647ab8f36a6776c31393194d0efaf0bd9f1a4e"} Mar 20 09:28:04 crc kubenswrapper[4971]: I0320 09:28:04.403524 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60b795ba872bfbde89adf322f3647ab8f36a6776c31393194d0efaf0bd9f1a4e" Mar 20 09:28:04 crc kubenswrapper[4971]: I0320 09:28:04.403546 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566648-vqnnf" Mar 20 09:28:04 crc kubenswrapper[4971]: I0320 09:28:04.848664 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566642-xmhfw"] Mar 20 09:28:04 crc kubenswrapper[4971]: I0320 09:28:04.859944 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566642-xmhfw"] Mar 20 09:28:06 crc kubenswrapper[4971]: I0320 09:28:06.745330 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d45f9ad-d551-4522-a036-a1a8b0af6a13" path="/var/lib/kubelet/pods/4d45f9ad-d551-4522-a036-a1a8b0af6a13/volumes" Mar 20 09:28:10 crc kubenswrapper[4971]: I0320 09:28:10.463402 4971 generic.go:334] "Generic (PLEG): container finished" podID="30ac44cf-4f9c-43b0-974e-8e86f25451e8" containerID="b69809624155b4749636d40903f548335baa04fe6e17c3fe2da535e4e6366013" exitCode=0 Mar 20 09:28:10 crc kubenswrapper[4971]: I0320 09:28:10.463461 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" event={"ID":"30ac44cf-4f9c-43b0-974e-8e86f25451e8","Type":"ContainerDied","Data":"b69809624155b4749636d40903f548335baa04fe6e17c3fe2da535e4e6366013"} Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.000449 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.080433 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-secret-0\") pod \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.080475 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-combined-ca-bundle\") pod \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.080578 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-inventory\") pod \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.080680 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ceph\") pod \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.081189 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw6rv\" (UniqueName: \"kubernetes.io/projected/30ac44cf-4f9c-43b0-974e-8e86f25451e8-kube-api-access-xw6rv\") pod \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.081248 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ssh-key-openstack-cell1\") pod \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\" (UID: \"30ac44cf-4f9c-43b0-974e-8e86f25451e8\") " Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.085306 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ac44cf-4f9c-43b0-974e-8e86f25451e8-kube-api-access-xw6rv" (OuterVolumeSpecName: "kube-api-access-xw6rv") pod "30ac44cf-4f9c-43b0-974e-8e86f25451e8" (UID: "30ac44cf-4f9c-43b0-974e-8e86f25451e8"). InnerVolumeSpecName "kube-api-access-xw6rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.086481 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "30ac44cf-4f9c-43b0-974e-8e86f25451e8" (UID: "30ac44cf-4f9c-43b0-974e-8e86f25451e8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.099594 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ceph" (OuterVolumeSpecName: "ceph") pod "30ac44cf-4f9c-43b0-974e-8e86f25451e8" (UID: "30ac44cf-4f9c-43b0-974e-8e86f25451e8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.112225 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "30ac44cf-4f9c-43b0-974e-8e86f25451e8" (UID: "30ac44cf-4f9c-43b0-974e-8e86f25451e8"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.112893 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "30ac44cf-4f9c-43b0-974e-8e86f25451e8" (UID: "30ac44cf-4f9c-43b0-974e-8e86f25451e8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.117172 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-inventory" (OuterVolumeSpecName: "inventory") pod "30ac44cf-4f9c-43b0-974e-8e86f25451e8" (UID: "30ac44cf-4f9c-43b0-974e-8e86f25451e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.185125 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.185181 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.185197 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw6rv\" (UniqueName: \"kubernetes.io/projected/30ac44cf-4f9c-43b0-974e-8e86f25451e8-kube-api-access-xw6rv\") on node \"crc\" DevicePath \"\"" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.185213 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath 
\"\"" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.185228 4971 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.185242 4971 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac44cf-4f9c-43b0-974e-8e86f25451e8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.486149 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" event={"ID":"30ac44cf-4f9c-43b0-974e-8e86f25451e8","Type":"ContainerDied","Data":"5f72f1122dac70437472ec30f3dc581bc7e8232afdcf5a7bfce5703bc1576d0e"} Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.486208 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f72f1122dac70437472ec30f3dc581bc7e8232afdcf5a7bfce5703bc1576d0e" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.486295 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-98nwv" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.659135 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-4ng6s"] Mar 20 09:28:12 crc kubenswrapper[4971]: E0320 09:28:12.659595 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44" containerName="oc" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.659632 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44" containerName="oc" Mar 20 09:28:12 crc kubenswrapper[4971]: E0320 09:28:12.659660 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ac44cf-4f9c-43b0-974e-8e86f25451e8" containerName="libvirt-openstack-openstack-cell1" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.659667 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ac44cf-4f9c-43b0-974e-8e86f25451e8" containerName="libvirt-openstack-openstack-cell1" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.659868 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44" containerName="oc" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.659880 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ac44cf-4f9c-43b0-974e-8e86f25451e8" containerName="libvirt-openstack-openstack-cell1" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.663489 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.669039 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.669191 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.669465 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.682845 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-4ng6s"] Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.669553 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.669596 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.669617 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.669925 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694218 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694286 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694305 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694345 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694364 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694403 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694423 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjcd4\" (UniqueName: \"kubernetes.io/projected/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-kube-api-access-jjcd4\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694476 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694500 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694528 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ceph\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694558 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-inventory\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694581 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.694640 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797521 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797629 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797658 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797701 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797727 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797769 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797791 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjcd4\" (UniqueName: \"kubernetes.io/projected/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-kube-api-access-jjcd4\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797858 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797879 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797909 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ceph\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797939 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-inventory\") 
pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.797962 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.798004 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.804653 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.804709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.804814 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.805467 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-inventory\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.805516 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.805584 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.806437 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.809241 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.810629 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.810854 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.811103 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ceph\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.814105 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.828916 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjcd4\" (UniqueName: \"kubernetes.io/projected/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-kube-api-access-jjcd4\") pod \"nova-cell1-openstack-openstack-cell1-4ng6s\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:12 crc kubenswrapper[4971]: I0320 09:28:12.980698 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:28:13 crc kubenswrapper[4971]: I0320 09:28:13.612070 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-4ng6s"] Mar 20 09:28:14 crc kubenswrapper[4971]: I0320 09:28:14.502828 4971 scope.go:117] "RemoveContainer" containerID="4be388e9796f05d357e1635a4e1dc4c3bfc3acb868f4fd261c40ce7e6b63da65" Mar 20 09:28:14 crc kubenswrapper[4971]: I0320 09:28:14.504299 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" event={"ID":"41f11d07-6bdd-455e-be27-f2b7a23c7f7a","Type":"ContainerStarted","Data":"58092781d703afe4e130cc0df93d49d168f0e724737bfc3534fa1317f7401179"} Mar 20 09:28:15 crc kubenswrapper[4971]: I0320 09:28:15.518759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" event={"ID":"41f11d07-6bdd-455e-be27-f2b7a23c7f7a","Type":"ContainerStarted","Data":"e93eab7c4e5509d0c94fb681a3202bb35cd540e934bae9bd329a11199c89e6d6"} Mar 20 09:28:15 crc kubenswrapper[4971]: I0320 09:28:15.577169 4971 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" podStartSLOduration=3.151330068 podStartE2EDuration="3.577139499s" podCreationTimestamp="2026-03-20 09:28:12 +0000 UTC" firstStartedPulling="2026-03-20 09:28:14.207778367 +0000 UTC m=+9516.187652505" lastFinishedPulling="2026-03-20 09:28:14.633587798 +0000 UTC m=+9516.613461936" observedRunningTime="2026-03-20 09:28:15.557011535 +0000 UTC m=+9517.536885683" watchObservedRunningTime="2026-03-20 09:28:15.577139499 +0000 UTC m=+9517.557013657" Mar 20 09:28:20 crc kubenswrapper[4971]: I0320 09:28:20.162572 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:28:20 crc kubenswrapper[4971]: I0320 09:28:20.163147 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.444336 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7l9xp"] Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.447883 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.459986 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7l9xp"] Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.532264 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-utilities\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.532717 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-catalog-content\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.532777 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlxmj\" (UniqueName: \"kubernetes.io/projected/766b4e94-9cc4-4678-88ce-dff153521083-kube-api-access-xlxmj\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.634663 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-utilities\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.634764 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-catalog-content\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.635173 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-utilities\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.635319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-catalog-content\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.635396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlxmj\" (UniqueName: \"kubernetes.io/projected/766b4e94-9cc4-4678-88ce-dff153521083-kube-api-access-xlxmj\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.674791 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlxmj\" (UniqueName: \"kubernetes.io/projected/766b4e94-9cc4-4678-88ce-dff153521083-kube-api-access-xlxmj\") pod \"redhat-operators-7l9xp\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:48 crc kubenswrapper[4971]: I0320 09:28:48.778360 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:49 crc kubenswrapper[4971]: W0320 09:28:49.265628 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod766b4e94_9cc4_4678_88ce_dff153521083.slice/crio-6b8330710804ca91075cc430a7e8d4b82cbc7cd14a25b79e23850e99adcadb30 WatchSource:0}: Error finding container 6b8330710804ca91075cc430a7e8d4b82cbc7cd14a25b79e23850e99adcadb30: Status 404 returned error can't find the container with id 6b8330710804ca91075cc430a7e8d4b82cbc7cd14a25b79e23850e99adcadb30 Mar 20 09:28:49 crc kubenswrapper[4971]: I0320 09:28:49.275722 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7l9xp"] Mar 20 09:28:49 crc kubenswrapper[4971]: I0320 09:28:49.838212 4971 generic.go:334] "Generic (PLEG): container finished" podID="766b4e94-9cc4-4678-88ce-dff153521083" containerID="2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f" exitCode=0 Mar 20 09:28:49 crc kubenswrapper[4971]: I0320 09:28:49.838527 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l9xp" event={"ID":"766b4e94-9cc4-4678-88ce-dff153521083","Type":"ContainerDied","Data":"2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f"} Mar 20 09:28:49 crc kubenswrapper[4971]: I0320 09:28:49.838561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l9xp" event={"ID":"766b4e94-9cc4-4678-88ce-dff153521083","Type":"ContainerStarted","Data":"6b8330710804ca91075cc430a7e8d4b82cbc7cd14a25b79e23850e99adcadb30"} Mar 20 09:28:50 crc kubenswrapper[4971]: I0320 09:28:50.162594 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 09:28:50 crc kubenswrapper[4971]: I0320 09:28:50.162686 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:28:50 crc kubenswrapper[4971]: I0320 09:28:50.850341 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l9xp" event={"ID":"766b4e94-9cc4-4678-88ce-dff153521083","Type":"ContainerStarted","Data":"be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f"} Mar 20 09:28:55 crc kubenswrapper[4971]: I0320 09:28:55.908172 4971 generic.go:334] "Generic (PLEG): container finished" podID="766b4e94-9cc4-4678-88ce-dff153521083" containerID="be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f" exitCode=0 Mar 20 09:28:55 crc kubenswrapper[4971]: I0320 09:28:55.908272 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l9xp" event={"ID":"766b4e94-9cc4-4678-88ce-dff153521083","Type":"ContainerDied","Data":"be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f"} Mar 20 09:28:57 crc kubenswrapper[4971]: I0320 09:28:57.934571 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l9xp" event={"ID":"766b4e94-9cc4-4678-88ce-dff153521083","Type":"ContainerStarted","Data":"c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626"} Mar 20 09:28:57 crc kubenswrapper[4971]: I0320 09:28:57.963307 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7l9xp" podStartSLOduration=3.464012082 podStartE2EDuration="9.963286555s" podCreationTimestamp="2026-03-20 09:28:48 +0000 UTC" firstStartedPulling="2026-03-20 
09:28:49.84090496 +0000 UTC m=+9551.820779098" lastFinishedPulling="2026-03-20 09:28:56.340179423 +0000 UTC m=+9558.320053571" observedRunningTime="2026-03-20 09:28:57.955786336 +0000 UTC m=+9559.935660514" watchObservedRunningTime="2026-03-20 09:28:57.963286555 +0000 UTC m=+9559.943160693" Mar 20 09:28:58 crc kubenswrapper[4971]: I0320 09:28:58.779102 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:58 crc kubenswrapper[4971]: I0320 09:28:58.779359 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:28:59 crc kubenswrapper[4971]: I0320 09:28:59.824509 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7l9xp" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="registry-server" probeResult="failure" output=< Mar 20 09:28:59 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 09:28:59 crc kubenswrapper[4971]: > Mar 20 09:29:09 crc kubenswrapper[4971]: I0320 09:29:09.822024 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7l9xp" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="registry-server" probeResult="failure" output=< Mar 20 09:29:09 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 09:29:09 crc kubenswrapper[4971]: > Mar 20 09:29:18 crc kubenswrapper[4971]: I0320 09:29:18.844693 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:29:18 crc kubenswrapper[4971]: I0320 09:29:18.906754 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:29:19 crc kubenswrapper[4971]: I0320 09:29:19.641860 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-7l9xp"] Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.135136 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7l9xp" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="registry-server" containerID="cri-o://c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626" gracePeriod=2 Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.162206 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.162282 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.162333 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.163232 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1cd20bc54dc57a6ce36976ac71cd8d2b4d0b0a4c0ce164f579ec4e73e0a60f78"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.163297 4971 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://1cd20bc54dc57a6ce36976ac71cd8d2b4d0b0a4c0ce164f579ec4e73e0a60f78" gracePeriod=600 Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.798713 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.905076 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-catalog-content\") pod \"766b4e94-9cc4-4678-88ce-dff153521083\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.905200 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-utilities\") pod \"766b4e94-9cc4-4678-88ce-dff153521083\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.905391 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlxmj\" (UniqueName: \"kubernetes.io/projected/766b4e94-9cc4-4678-88ce-dff153521083-kube-api-access-xlxmj\") pod \"766b4e94-9cc4-4678-88ce-dff153521083\" (UID: \"766b4e94-9cc4-4678-88ce-dff153521083\") " Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.905885 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-utilities" (OuterVolumeSpecName: "utilities") pod "766b4e94-9cc4-4678-88ce-dff153521083" (UID: "766b4e94-9cc4-4678-88ce-dff153521083"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.906066 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:29:20 crc kubenswrapper[4971]: I0320 09:29:20.914813 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/766b4e94-9cc4-4678-88ce-dff153521083-kube-api-access-xlxmj" (OuterVolumeSpecName: "kube-api-access-xlxmj") pod "766b4e94-9cc4-4678-88ce-dff153521083" (UID: "766b4e94-9cc4-4678-88ce-dff153521083"). InnerVolumeSpecName "kube-api-access-xlxmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.008020 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlxmj\" (UniqueName: \"kubernetes.io/projected/766b4e94-9cc4-4678-88ce-dff153521083-kube-api-access-xlxmj\") on node \"crc\" DevicePath \"\"" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.033280 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "766b4e94-9cc4-4678-88ce-dff153521083" (UID: "766b4e94-9cc4-4678-88ce-dff153521083"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.110098 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/766b4e94-9cc4-4678-88ce-dff153521083-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.149322 4971 generic.go:334] "Generic (PLEG): container finished" podID="766b4e94-9cc4-4678-88ce-dff153521083" containerID="c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626" exitCode=0 Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.149392 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l9xp" event={"ID":"766b4e94-9cc4-4678-88ce-dff153521083","Type":"ContainerDied","Data":"c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626"} Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.149452 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l9xp" event={"ID":"766b4e94-9cc4-4678-88ce-dff153521083","Type":"ContainerDied","Data":"6b8330710804ca91075cc430a7e8d4b82cbc7cd14a25b79e23850e99adcadb30"} Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.149473 4971 scope.go:117] "RemoveContainer" containerID="c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.150621 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7l9xp" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.152299 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="1cd20bc54dc57a6ce36976ac71cd8d2b4d0b0a4c0ce164f579ec4e73e0a60f78" exitCode=0 Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.152334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"1cd20bc54dc57a6ce36976ac71cd8d2b4d0b0a4c0ce164f579ec4e73e0a60f78"} Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.152363 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365"} Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.179192 4971 scope.go:117] "RemoveContainer" containerID="be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.200277 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7l9xp"] Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.210830 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7l9xp"] Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.210933 4971 scope.go:117] "RemoveContainer" containerID="2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.228942 4971 scope.go:117] "RemoveContainer" containerID="c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626" Mar 20 09:29:21 crc kubenswrapper[4971]: E0320 09:29:21.229346 4971 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626\": container with ID starting with c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626 not found: ID does not exist" containerID="c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.229377 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626"} err="failed to get container status \"c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626\": rpc error: code = NotFound desc = could not find container \"c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626\": container with ID starting with c90b94cae720c99cc1ef9f454c8cf0e84877303ead5bb6eea6ae77562b220626 not found: ID does not exist" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.229397 4971 scope.go:117] "RemoveContainer" containerID="be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f" Mar 20 09:29:21 crc kubenswrapper[4971]: E0320 09:29:21.230036 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f\": container with ID starting with be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f not found: ID does not exist" containerID="be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.230091 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f"} err="failed to get container status \"be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f\": rpc error: code = NotFound desc = could not find container 
\"be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f\": container with ID starting with be46cccdb639ed84457ad86ef628ef635b94caf3c1e1aaf8d14c44d06a1e2d1f not found: ID does not exist" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.230119 4971 scope.go:117] "RemoveContainer" containerID="2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f" Mar 20 09:29:21 crc kubenswrapper[4971]: E0320 09:29:21.230453 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f\": container with ID starting with 2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f not found: ID does not exist" containerID="2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.230499 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f"} err="failed to get container status \"2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f\": rpc error: code = NotFound desc = could not find container \"2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f\": container with ID starting with 2e4918cc37f1efad34d68f16f8c8374ef492508f318cd58de876553d8f1ad68f not found: ID does not exist" Mar 20 09:29:21 crc kubenswrapper[4971]: I0320 09:29:21.230528 4971 scope.go:117] "RemoveContainer" containerID="1dc8fd89510b2b7d8ce851fb003f72167d9a6a38e6cb5f50c43cf4fd83fd91ff" Mar 20 09:29:22 crc kubenswrapper[4971]: I0320 09:29:22.746156 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="766b4e94-9cc4-4678-88ce-dff153521083" path="/var/lib/kubelet/pods/766b4e94-9cc4-4678-88ce-dff153521083/volumes" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.960552 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5wshc"] Mar 20 09:29:38 crc kubenswrapper[4971]: E0320 09:29:38.961663 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="extract-content" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.961682 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="extract-content" Mar 20 09:29:38 crc kubenswrapper[4971]: E0320 09:29:38.961703 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="extract-utilities" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.961710 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="extract-utilities" Mar 20 09:29:38 crc kubenswrapper[4971]: E0320 09:29:38.961731 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="registry-server" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.961739 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="registry-server" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.962002 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="766b4e94-9cc4-4678-88ce-dff153521083" containerName="registry-server" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.963497 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.981737 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wshc"] Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.993241 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-utilities\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.994328 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvkj\" (UniqueName: \"kubernetes.io/projected/c1fce4de-6dde-488f-bafc-c895c85d5fc9-kube-api-access-7dvkj\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:38 crc kubenswrapper[4971]: I0320 09:29:38.994509 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-catalog-content\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:39 crc kubenswrapper[4971]: I0320 09:29:39.096517 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-catalog-content\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:39 crc kubenswrapper[4971]: I0320 09:29:39.096966 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-utilities\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:39 crc kubenswrapper[4971]: I0320 09:29:39.097077 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-catalog-content\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:39 crc kubenswrapper[4971]: I0320 09:29:39.097245 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-utilities\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:39 crc kubenswrapper[4971]: I0320 09:29:39.097349 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvkj\" (UniqueName: \"kubernetes.io/projected/c1fce4de-6dde-488f-bafc-c895c85d5fc9-kube-api-access-7dvkj\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:39 crc kubenswrapper[4971]: I0320 09:29:39.115566 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvkj\" (UniqueName: \"kubernetes.io/projected/c1fce4de-6dde-488f-bafc-c895c85d5fc9-kube-api-access-7dvkj\") pod \"redhat-marketplace-5wshc\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:39 crc kubenswrapper[4971]: I0320 09:29:39.287484 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:39 crc kubenswrapper[4971]: I0320 09:29:39.803161 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wshc"] Mar 20 09:29:40 crc kubenswrapper[4971]: I0320 09:29:40.350647 4971 generic.go:334] "Generic (PLEG): container finished" podID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerID="048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f" exitCode=0 Mar 20 09:29:40 crc kubenswrapper[4971]: I0320 09:29:40.350751 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wshc" event={"ID":"c1fce4de-6dde-488f-bafc-c895c85d5fc9","Type":"ContainerDied","Data":"048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f"} Mar 20 09:29:40 crc kubenswrapper[4971]: I0320 09:29:40.350962 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wshc" event={"ID":"c1fce4de-6dde-488f-bafc-c895c85d5fc9","Type":"ContainerStarted","Data":"0ab7e1f92da5bcf9e8af85c817e72bed7670bfe115fbdacef933dc73a070cb70"} Mar 20 09:29:42 crc kubenswrapper[4971]: I0320 09:29:42.377426 4971 generic.go:334] "Generic (PLEG): container finished" podID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerID="60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265" exitCode=0 Mar 20 09:29:42 crc kubenswrapper[4971]: I0320 09:29:42.377532 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wshc" event={"ID":"c1fce4de-6dde-488f-bafc-c895c85d5fc9","Type":"ContainerDied","Data":"60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265"} Mar 20 09:29:43 crc kubenswrapper[4971]: I0320 09:29:43.390321 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wshc" 
event={"ID":"c1fce4de-6dde-488f-bafc-c895c85d5fc9","Type":"ContainerStarted","Data":"9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8"} Mar 20 09:29:43 crc kubenswrapper[4971]: I0320 09:29:43.415079 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5wshc" podStartSLOduration=2.876601687 podStartE2EDuration="5.415053602s" podCreationTimestamp="2026-03-20 09:29:38 +0000 UTC" firstStartedPulling="2026-03-20 09:29:40.352256669 +0000 UTC m=+9602.332130807" lastFinishedPulling="2026-03-20 09:29:42.890708584 +0000 UTC m=+9604.870582722" observedRunningTime="2026-03-20 09:29:43.405540959 +0000 UTC m=+9605.385415107" watchObservedRunningTime="2026-03-20 09:29:43.415053602 +0000 UTC m=+9605.394927740" Mar 20 09:29:49 crc kubenswrapper[4971]: I0320 09:29:49.287679 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:49 crc kubenswrapper[4971]: I0320 09:29:49.288050 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:49 crc kubenswrapper[4971]: I0320 09:29:49.348265 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:49 crc kubenswrapper[4971]: I0320 09:29:49.506354 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:49 crc kubenswrapper[4971]: I0320 09:29:49.677436 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wshc"] Mar 20 09:29:51 crc kubenswrapper[4971]: I0320 09:29:51.484681 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5wshc" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerName="registry-server" 
containerID="cri-o://9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8" gracePeriod=2 Mar 20 09:29:51 crc kubenswrapper[4971]: I0320 09:29:51.992342 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.088934 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dvkj\" (UniqueName: \"kubernetes.io/projected/c1fce4de-6dde-488f-bafc-c895c85d5fc9-kube-api-access-7dvkj\") pod \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.089049 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-catalog-content\") pod \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.089169 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-utilities\") pod \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\" (UID: \"c1fce4de-6dde-488f-bafc-c895c85d5fc9\") " Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.090262 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-utilities" (OuterVolumeSpecName: "utilities") pod "c1fce4de-6dde-488f-bafc-c895c85d5fc9" (UID: "c1fce4de-6dde-488f-bafc-c895c85d5fc9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.095725 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1fce4de-6dde-488f-bafc-c895c85d5fc9-kube-api-access-7dvkj" (OuterVolumeSpecName: "kube-api-access-7dvkj") pod "c1fce4de-6dde-488f-bafc-c895c85d5fc9" (UID: "c1fce4de-6dde-488f-bafc-c895c85d5fc9"). InnerVolumeSpecName "kube-api-access-7dvkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.116028 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1fce4de-6dde-488f-bafc-c895c85d5fc9" (UID: "c1fce4de-6dde-488f-bafc-c895c85d5fc9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.191831 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dvkj\" (UniqueName: \"kubernetes.io/projected/c1fce4de-6dde-488f-bafc-c895c85d5fc9-kube-api-access-7dvkj\") on node \"crc\" DevicePath \"\"" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.191873 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.191888 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fce4de-6dde-488f-bafc-c895c85d5fc9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.496297 4971 generic.go:334] "Generic (PLEG): container finished" podID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" 
containerID="9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8" exitCode=0 Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.496353 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wshc" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.496344 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wshc" event={"ID":"c1fce4de-6dde-488f-bafc-c895c85d5fc9","Type":"ContainerDied","Data":"9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8"} Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.496532 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wshc" event={"ID":"c1fce4de-6dde-488f-bafc-c895c85d5fc9","Type":"ContainerDied","Data":"0ab7e1f92da5bcf9e8af85c817e72bed7670bfe115fbdacef933dc73a070cb70"} Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.496572 4971 scope.go:117] "RemoveContainer" containerID="9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.537483 4971 scope.go:117] "RemoveContainer" containerID="60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.541737 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wshc"] Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.555254 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wshc"] Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.571339 4971 scope.go:117] "RemoveContainer" containerID="048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.606462 4971 scope.go:117] "RemoveContainer" containerID="9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8" Mar 20 
09:29:52 crc kubenswrapper[4971]: E0320 09:29:52.607027 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8\": container with ID starting with 9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8 not found: ID does not exist" containerID="9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.607069 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8"} err="failed to get container status \"9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8\": rpc error: code = NotFound desc = could not find container \"9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8\": container with ID starting with 9c496a6886703d66384fc910064d829309d56318824b5b4a633446d3ff9a78d8 not found: ID does not exist" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.607095 4971 scope.go:117] "RemoveContainer" containerID="60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265" Mar 20 09:29:52 crc kubenswrapper[4971]: E0320 09:29:52.607531 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265\": container with ID starting with 60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265 not found: ID does not exist" containerID="60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.607571 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265"} err="failed to get container status 
\"60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265\": rpc error: code = NotFound desc = could not find container \"60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265\": container with ID starting with 60d9f13e9a093fbb3d03bf08743f426c8945950af8a73399562199d4b619a265 not found: ID does not exist" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.607598 4971 scope.go:117] "RemoveContainer" containerID="048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f" Mar 20 09:29:52 crc kubenswrapper[4971]: E0320 09:29:52.608087 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f\": container with ID starting with 048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f not found: ID does not exist" containerID="048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.608125 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f"} err="failed to get container status \"048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f\": rpc error: code = NotFound desc = could not find container \"048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f\": container with ID starting with 048c4484d760cb80774ffc2346b76b1faa409fd1ae7fe1c426bf84a062e0c67f not found: ID does not exist" Mar 20 09:29:52 crc kubenswrapper[4971]: I0320 09:29:52.752161 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" path="/var/lib/kubelet/pods/c1fce4de-6dde-488f-bafc-c895c85d5fc9/volumes" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.159140 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566650-st8vq"] Mar 20 09:30:00 
crc kubenswrapper[4971]: E0320 09:30:00.160369 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerName="extract-content" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.160387 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerName="extract-content" Mar 20 09:30:00 crc kubenswrapper[4971]: E0320 09:30:00.160422 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerName="extract-utilities" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.160431 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerName="extract-utilities" Mar 20 09:30:00 crc kubenswrapper[4971]: E0320 09:30:00.160440 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerName="registry-server" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.160448 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerName="registry-server" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.160758 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1fce4de-6dde-488f-bafc-c895c85d5fc9" containerName="registry-server" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.161807 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566650-st8vq" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.165227 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.165586 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.165803 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.171727 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566650-st8vq"] Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.182714 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj"] Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.184562 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.200196 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.200461 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.220773 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj"] Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.267918 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-896t4\" (UniqueName: \"kubernetes.io/projected/04fd9f38-7215-427b-bc7e-6e1efd12170b-kube-api-access-896t4\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.268011 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jff\" (UniqueName: \"kubernetes.io/projected/93e47cf8-974a-4ba1-a399-855e87aaa553-kube-api-access-g7jff\") pod \"auto-csr-approver-29566650-st8vq\" (UID: \"93e47cf8-974a-4ba1-a399-855e87aaa553\") " pod="openshift-infra/auto-csr-approver-29566650-st8vq" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.268191 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04fd9f38-7215-427b-bc7e-6e1efd12170b-secret-volume\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.268255 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04fd9f38-7215-427b-bc7e-6e1efd12170b-config-volume\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.369897 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04fd9f38-7215-427b-bc7e-6e1efd12170b-secret-volume\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.369980 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04fd9f38-7215-427b-bc7e-6e1efd12170b-config-volume\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.370024 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-896t4\" (UniqueName: \"kubernetes.io/projected/04fd9f38-7215-427b-bc7e-6e1efd12170b-kube-api-access-896t4\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.370062 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jff\" (UniqueName: 
\"kubernetes.io/projected/93e47cf8-974a-4ba1-a399-855e87aaa553-kube-api-access-g7jff\") pod \"auto-csr-approver-29566650-st8vq\" (UID: \"93e47cf8-974a-4ba1-a399-855e87aaa553\") " pod="openshift-infra/auto-csr-approver-29566650-st8vq" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.371471 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04fd9f38-7215-427b-bc7e-6e1efd12170b-config-volume\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.383672 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04fd9f38-7215-427b-bc7e-6e1efd12170b-secret-volume\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.387122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jff\" (UniqueName: \"kubernetes.io/projected/93e47cf8-974a-4ba1-a399-855e87aaa553-kube-api-access-g7jff\") pod \"auto-csr-approver-29566650-st8vq\" (UID: \"93e47cf8-974a-4ba1-a399-855e87aaa553\") " pod="openshift-infra/auto-csr-approver-29566650-st8vq" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.388309 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-896t4\" (UniqueName: \"kubernetes.io/projected/04fd9f38-7215-427b-bc7e-6e1efd12170b-kube-api-access-896t4\") pod \"collect-profiles-29566650-h5hmj\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.484960 4971 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566650-st8vq" Mar 20 09:30:00 crc kubenswrapper[4971]: I0320 09:30:00.517228 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:01 crc kubenswrapper[4971]: I0320 09:30:01.324385 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj"] Mar 20 09:30:01 crc kubenswrapper[4971]: I0320 09:30:01.440158 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566650-st8vq"] Mar 20 09:30:01 crc kubenswrapper[4971]: W0320 09:30:01.441775 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93e47cf8_974a_4ba1_a399_855e87aaa553.slice/crio-33d65668f32bc1d39cc91a4f6f8754fee703f1cc4c798a8e70a25a2a5ea2894b WatchSource:0}: Error finding container 33d65668f32bc1d39cc91a4f6f8754fee703f1cc4c798a8e70a25a2a5ea2894b: Status 404 returned error can't find the container with id 33d65668f32bc1d39cc91a4f6f8754fee703f1cc4c798a8e70a25a2a5ea2894b Mar 20 09:30:01 crc kubenswrapper[4971]: I0320 09:30:01.591999 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566650-st8vq" event={"ID":"93e47cf8-974a-4ba1-a399-855e87aaa553","Type":"ContainerStarted","Data":"33d65668f32bc1d39cc91a4f6f8754fee703f1cc4c798a8e70a25a2a5ea2894b"} Mar 20 09:30:01 crc kubenswrapper[4971]: I0320 09:30:01.594538 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" event={"ID":"04fd9f38-7215-427b-bc7e-6e1efd12170b","Type":"ContainerStarted","Data":"8cfba60b65ea111ab51cbd0256cc13a2638caef6dcb78475b9cac60bc1ab0db0"} Mar 20 09:30:02 crc kubenswrapper[4971]: I0320 09:30:02.605615 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="04fd9f38-7215-427b-bc7e-6e1efd12170b" containerID="0ec3b42827fc6071a8b90aaa61acaefc3be25cefe8ff555d8666fb0cc5bd0633" exitCode=0 Mar 20 09:30:02 crc kubenswrapper[4971]: I0320 09:30:02.605680 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" event={"ID":"04fd9f38-7215-427b-bc7e-6e1efd12170b","Type":"ContainerDied","Data":"0ec3b42827fc6071a8b90aaa61acaefc3be25cefe8ff555d8666fb0cc5bd0633"} Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.022049 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.157419 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04fd9f38-7215-427b-bc7e-6e1efd12170b-config-volume\") pod \"04fd9f38-7215-427b-bc7e-6e1efd12170b\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.157989 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-896t4\" (UniqueName: \"kubernetes.io/projected/04fd9f38-7215-427b-bc7e-6e1efd12170b-kube-api-access-896t4\") pod \"04fd9f38-7215-427b-bc7e-6e1efd12170b\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.158083 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04fd9f38-7215-427b-bc7e-6e1efd12170b-secret-volume\") pod \"04fd9f38-7215-427b-bc7e-6e1efd12170b\" (UID: \"04fd9f38-7215-427b-bc7e-6e1efd12170b\") " Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.158544 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fd9f38-7215-427b-bc7e-6e1efd12170b-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "04fd9f38-7215-427b-bc7e-6e1efd12170b" (UID: "04fd9f38-7215-427b-bc7e-6e1efd12170b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.158884 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04fd9f38-7215-427b-bc7e-6e1efd12170b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.165463 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fd9f38-7215-427b-bc7e-6e1efd12170b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "04fd9f38-7215-427b-bc7e-6e1efd12170b" (UID: "04fd9f38-7215-427b-bc7e-6e1efd12170b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.170905 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fd9f38-7215-427b-bc7e-6e1efd12170b-kube-api-access-896t4" (OuterVolumeSpecName: "kube-api-access-896t4") pod "04fd9f38-7215-427b-bc7e-6e1efd12170b" (UID: "04fd9f38-7215-427b-bc7e-6e1efd12170b"). InnerVolumeSpecName "kube-api-access-896t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.261054 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-896t4\" (UniqueName: \"kubernetes.io/projected/04fd9f38-7215-427b-bc7e-6e1efd12170b-kube-api-access-896t4\") on node \"crc\" DevicePath \"\"" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.261089 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04fd9f38-7215-427b-bc7e-6e1efd12170b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.645989 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566650-st8vq" event={"ID":"93e47cf8-974a-4ba1-a399-855e87aaa553","Type":"ContainerStarted","Data":"83fec366fe56e7bff356175bc43468143027d156d6abb72288755ec7eaf0b54c"} Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.649934 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" event={"ID":"04fd9f38-7215-427b-bc7e-6e1efd12170b","Type":"ContainerDied","Data":"8cfba60b65ea111ab51cbd0256cc13a2638caef6dcb78475b9cac60bc1ab0db0"} Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.649977 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cfba60b65ea111ab51cbd0256cc13a2638caef6dcb78475b9cac60bc1ab0db0" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.650001 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj" Mar 20 09:30:04 crc kubenswrapper[4971]: I0320 09:30:04.663452 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566650-st8vq" podStartSLOduration=2.007611495 podStartE2EDuration="4.663434679s" podCreationTimestamp="2026-03-20 09:30:00 +0000 UTC" firstStartedPulling="2026-03-20 09:30:01.444239401 +0000 UTC m=+9623.424113539" lastFinishedPulling="2026-03-20 09:30:04.100062585 +0000 UTC m=+9626.079936723" observedRunningTime="2026-03-20 09:30:04.658498238 +0000 UTC m=+9626.638372376" watchObservedRunningTime="2026-03-20 09:30:04.663434679 +0000 UTC m=+9626.643308817" Mar 20 09:30:04 crc kubenswrapper[4971]: E0320 09:30:04.936403 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93e47cf8_974a_4ba1_a399_855e87aaa553.slice/crio-conmon-83fec366fe56e7bff356175bc43468143027d156d6abb72288755ec7eaf0b54c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93e47cf8_974a_4ba1_a399_855e87aaa553.slice/crio-83fec366fe56e7bff356175bc43468143027d156d6abb72288755ec7eaf0b54c.scope\": RecentStats: unable to find data in memory cache]" Mar 20 09:30:05 crc kubenswrapper[4971]: I0320 09:30:05.258906 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl"] Mar 20 09:30:05 crc kubenswrapper[4971]: I0320 09:30:05.282885 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566605-k5mzl"] Mar 20 09:30:05 crc kubenswrapper[4971]: I0320 09:30:05.675028 4971 generic.go:334] "Generic (PLEG): container finished" podID="93e47cf8-974a-4ba1-a399-855e87aaa553" 
containerID="83fec366fe56e7bff356175bc43468143027d156d6abb72288755ec7eaf0b54c" exitCode=0 Mar 20 09:30:05 crc kubenswrapper[4971]: I0320 09:30:05.675080 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566650-st8vq" event={"ID":"93e47cf8-974a-4ba1-a399-855e87aaa553","Type":"ContainerDied","Data":"83fec366fe56e7bff356175bc43468143027d156d6abb72288755ec7eaf0b54c"} Mar 20 09:30:06 crc kubenswrapper[4971]: I0320 09:30:06.743956 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8846ab1-f8f1-4ed6-a554-fa0741899c59" path="/var/lib/kubelet/pods/a8846ab1-f8f1-4ed6-a554-fa0741899c59/volumes" Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.419370 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566650-st8vq" Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.520231 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7jff\" (UniqueName: \"kubernetes.io/projected/93e47cf8-974a-4ba1-a399-855e87aaa553-kube-api-access-g7jff\") pod \"93e47cf8-974a-4ba1-a399-855e87aaa553\" (UID: \"93e47cf8-974a-4ba1-a399-855e87aaa553\") " Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.527433 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e47cf8-974a-4ba1-a399-855e87aaa553-kube-api-access-g7jff" (OuterVolumeSpecName: "kube-api-access-g7jff") pod "93e47cf8-974a-4ba1-a399-855e87aaa553" (UID: "93e47cf8-974a-4ba1-a399-855e87aaa553"). InnerVolumeSpecName "kube-api-access-g7jff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.623371 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7jff\" (UniqueName: \"kubernetes.io/projected/93e47cf8-974a-4ba1-a399-855e87aaa553-kube-api-access-g7jff\") on node \"crc\" DevicePath \"\"" Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.695315 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566650-st8vq" event={"ID":"93e47cf8-974a-4ba1-a399-855e87aaa553","Type":"ContainerDied","Data":"33d65668f32bc1d39cc91a4f6f8754fee703f1cc4c798a8e70a25a2a5ea2894b"} Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.695354 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33d65668f32bc1d39cc91a4f6f8754fee703f1cc4c798a8e70a25a2a5ea2894b" Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.695377 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566650-st8vq" Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.736047 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566644-jxm5c"] Mar 20 09:30:07 crc kubenswrapper[4971]: I0320 09:30:07.750414 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566644-jxm5c"] Mar 20 09:30:08 crc kubenswrapper[4971]: I0320 09:30:08.749986 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2946ac1-a6a9-4802-8d7b-82daf2fbe796" path="/var/lib/kubelet/pods/a2946ac1-a6a9-4802-8d7b-82daf2fbe796/volumes" Mar 20 09:30:14 crc kubenswrapper[4971]: I0320 09:30:14.574531 4971 scope.go:117] "RemoveContainer" containerID="90dce88a8b6f91ac1ba18dab79d45b75502eabace1ac3b1b5f8df2d30b2781f5" Mar 20 09:30:14 crc kubenswrapper[4971]: I0320 09:30:14.597437 4971 scope.go:117] "RemoveContainer" 
containerID="de9aa705d178bd16dd1d8d5a518b4f76841495628afcb5bd887ec24fa0da6055" Mar 20 09:31:18 crc kubenswrapper[4971]: I0320 09:31:18.550326 4971 generic.go:334] "Generic (PLEG): container finished" podID="41f11d07-6bdd-455e-be27-f2b7a23c7f7a" containerID="e93eab7c4e5509d0c94fb681a3202bb35cd540e934bae9bd329a11199c89e6d6" exitCode=0 Mar 20 09:31:18 crc kubenswrapper[4971]: I0320 09:31:18.550952 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" event={"ID":"41f11d07-6bdd-455e-be27-f2b7a23c7f7a","Type":"ContainerDied","Data":"e93eab7c4e5509d0c94fb681a3202bb35cd540e934bae9bd329a11199c89e6d6"} Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.032547 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.162446 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.162506 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.196499 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-combined-ca-bundle\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 
09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.196676 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-0\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.196872 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-1\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.196919 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ssh-key-openstack-cell1\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.199031 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-0\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.199096 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-1\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.200187 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-0\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.200350 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-inventory\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.200454 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-2\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.200498 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ceph\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.200578 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-1\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.200652 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-3\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" 
(UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.200747 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjcd4\" (UniqueName: \"kubernetes.io/projected/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-kube-api-access-jjcd4\") pod \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\" (UID: \"41f11d07-6bdd-455e-be27-f2b7a23c7f7a\") " Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.204357 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.207379 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ceph" (OuterVolumeSpecName: "ceph") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.236565 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-kube-api-access-jjcd4" (OuterVolumeSpecName: "kube-api-access-jjcd4") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "kube-api-access-jjcd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.242914 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.245510 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.262232 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-inventory" (OuterVolumeSpecName: "inventory") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.263288 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.263902 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.264136 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.273222 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.278025 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.281184 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.287029 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "41f11d07-6bdd-455e-be27-f2b7a23c7f7a" (UID: "41f11d07-6bdd-455e-be27-f2b7a23c7f7a"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.305862 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.305911 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.305925 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.305939 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.305951 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjcd4\" (UniqueName: \"kubernetes.io/projected/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-kube-api-access-jjcd4\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.305964 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.305980 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.305993 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.306005 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.306016 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.306028 4971 reconciler_common.go:293] "Volume detached for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.306038 4971 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.306048 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41f11d07-6bdd-455e-be27-f2b7a23c7f7a-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.580528 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" event={"ID":"41f11d07-6bdd-455e-be27-f2b7a23c7f7a","Type":"ContainerDied","Data":"58092781d703afe4e130cc0df93d49d168f0e724737bfc3534fa1317f7401179"} Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.580576 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58092781d703afe4e130cc0df93d49d168f0e724737bfc3534fa1317f7401179" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.580662 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-4ng6s" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.768043 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-cjr5v"] Mar 20 09:31:20 crc kubenswrapper[4971]: E0320 09:31:20.768912 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f11d07-6bdd-455e-be27-f2b7a23c7f7a" containerName="nova-cell1-openstack-openstack-cell1" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.768937 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f11d07-6bdd-455e-be27-f2b7a23c7f7a" containerName="nova-cell1-openstack-openstack-cell1" Mar 20 09:31:20 crc kubenswrapper[4971]: E0320 09:31:20.768977 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e47cf8-974a-4ba1-a399-855e87aaa553" containerName="oc" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.768987 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e47cf8-974a-4ba1-a399-855e87aaa553" containerName="oc" Mar 20 09:31:20 crc kubenswrapper[4971]: E0320 09:31:20.769008 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fd9f38-7215-427b-bc7e-6e1efd12170b" containerName="collect-profiles" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.769016 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fd9f38-7215-427b-bc7e-6e1efd12170b" containerName="collect-profiles" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.769241 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e47cf8-974a-4ba1-a399-855e87aaa553" containerName="oc" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.769270 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f11d07-6bdd-455e-be27-f2b7a23c7f7a" containerName="nova-cell1-openstack-openstack-cell1" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.769299 4971 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="04fd9f38-7215-427b-bc7e-6e1efd12170b" containerName="collect-profiles" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.770113 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.772871 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.773092 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.774411 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.775360 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.775708 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.779526 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-cjr5v"] Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.918513 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.918561 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.918583 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.918643 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-inventory\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.918869 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.918990 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceph\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " 
pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.919108 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:20 crc kubenswrapper[4971]: I0320 09:31:20.920456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9krp\" (UniqueName: \"kubernetes.io/projected/4fc44776-0838-4b1a-844c-a16ffd83af56-kube-api-access-x9krp\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.023001 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceph\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.023372 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.023696 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9krp\" (UniqueName: 
\"kubernetes.io/projected/4fc44776-0838-4b1a-844c-a16ffd83af56-kube-api-access-x9krp\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.024046 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.024225 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.024389 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.024582 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-inventory\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 
crc kubenswrapper[4971]: I0320 09:31:21.024927 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.026826 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.027846 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.028466 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceph\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.028626 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-inventory\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: 
\"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.028762 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.029026 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.036517 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.045384 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9krp\" (UniqueName: \"kubernetes.io/projected/4fc44776-0838-4b1a-844c-a16ffd83af56-kube-api-access-x9krp\") pod \"telemetry-openstack-openstack-cell1-cjr5v\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.095742 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.648980 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-cjr5v"] Mar 20 09:31:21 crc kubenswrapper[4971]: I0320 09:31:21.662125 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:31:22 crc kubenswrapper[4971]: I0320 09:31:22.601228 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" event={"ID":"4fc44776-0838-4b1a-844c-a16ffd83af56","Type":"ContainerStarted","Data":"ba20f12233759169018c873477fc3b92185267ec4d38c76c3335c6848d4a5034"} Mar 20 09:31:22 crc kubenswrapper[4971]: I0320 09:31:22.601552 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" event={"ID":"4fc44776-0838-4b1a-844c-a16ffd83af56","Type":"ContainerStarted","Data":"f008cc1efce853aaa7bf5e9496b5ffe58aba768546f25be3108ae78654399f82"} Mar 20 09:31:22 crc kubenswrapper[4971]: I0320 09:31:22.626470 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" podStartSLOduration=2.121393587 podStartE2EDuration="2.626440442s" podCreationTimestamp="2026-03-20 09:31:20 +0000 UTC" firstStartedPulling="2026-03-20 09:31:21.661925763 +0000 UTC m=+9703.641799901" lastFinishedPulling="2026-03-20 09:31:22.166972618 +0000 UTC m=+9704.146846756" observedRunningTime="2026-03-20 09:31:22.619092337 +0000 UTC m=+9704.598966475" watchObservedRunningTime="2026-03-20 09:31:22.626440442 +0000 UTC m=+9704.606314580" Mar 20 09:31:50 crc kubenswrapper[4971]: I0320 09:31:50.162888 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:31:50 crc kubenswrapper[4971]: I0320 09:31:50.163750 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.172154 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566652-77bgm"] Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.176675 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566652-77bgm" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.181005 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.181233 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.185302 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.186385 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566652-77bgm"] Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.273978 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbwl\" (UniqueName: \"kubernetes.io/projected/8d900009-430c-4456-9b46-62602708c68e-kube-api-access-8cbwl\") pod \"auto-csr-approver-29566652-77bgm\" (UID: \"8d900009-430c-4456-9b46-62602708c68e\") " 
pod="openshift-infra/auto-csr-approver-29566652-77bgm" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.376459 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbwl\" (UniqueName: \"kubernetes.io/projected/8d900009-430c-4456-9b46-62602708c68e-kube-api-access-8cbwl\") pod \"auto-csr-approver-29566652-77bgm\" (UID: \"8d900009-430c-4456-9b46-62602708c68e\") " pod="openshift-infra/auto-csr-approver-29566652-77bgm" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.396170 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cbwl\" (UniqueName: \"kubernetes.io/projected/8d900009-430c-4456-9b46-62602708c68e-kube-api-access-8cbwl\") pod \"auto-csr-approver-29566652-77bgm\" (UID: \"8d900009-430c-4456-9b46-62602708c68e\") " pod="openshift-infra/auto-csr-approver-29566652-77bgm" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.510936 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566652-77bgm" Mar 20 09:32:00 crc kubenswrapper[4971]: I0320 09:32:00.984804 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566652-77bgm"] Mar 20 09:32:00 crc kubenswrapper[4971]: W0320 09:32:00.990981 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d900009_430c_4456_9b46_62602708c68e.slice/crio-6bb5963804f6de0f9e87b4564be6d44c54c46debf2861f951cd9b5b1fa204b56 WatchSource:0}: Error finding container 6bb5963804f6de0f9e87b4564be6d44c54c46debf2861f951cd9b5b1fa204b56: Status 404 returned error can't find the container with id 6bb5963804f6de0f9e87b4564be6d44c54c46debf2861f951cd9b5b1fa204b56 Mar 20 09:32:01 crc kubenswrapper[4971]: I0320 09:32:01.025909 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566652-77bgm" 
event={"ID":"8d900009-430c-4456-9b46-62602708c68e","Type":"ContainerStarted","Data":"6bb5963804f6de0f9e87b4564be6d44c54c46debf2861f951cd9b5b1fa204b56"} Mar 20 09:32:03 crc kubenswrapper[4971]: I0320 09:32:03.047387 4971 generic.go:334] "Generic (PLEG): container finished" podID="8d900009-430c-4456-9b46-62602708c68e" containerID="90929e45bea32ad547f1be5f64c0a1288fd2c2df16a14f29522ab13f880bb3ce" exitCode=0 Mar 20 09:32:03 crc kubenswrapper[4971]: I0320 09:32:03.047439 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566652-77bgm" event={"ID":"8d900009-430c-4456-9b46-62602708c68e","Type":"ContainerDied","Data":"90929e45bea32ad547f1be5f64c0a1288fd2c2df16a14f29522ab13f880bb3ce"} Mar 20 09:32:04 crc kubenswrapper[4971]: I0320 09:32:04.414728 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566652-77bgm" Mar 20 09:32:04 crc kubenswrapper[4971]: I0320 09:32:04.565538 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cbwl\" (UniqueName: \"kubernetes.io/projected/8d900009-430c-4456-9b46-62602708c68e-kube-api-access-8cbwl\") pod \"8d900009-430c-4456-9b46-62602708c68e\" (UID: \"8d900009-430c-4456-9b46-62602708c68e\") " Mar 20 09:32:04 crc kubenswrapper[4971]: I0320 09:32:04.571624 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d900009-430c-4456-9b46-62602708c68e-kube-api-access-8cbwl" (OuterVolumeSpecName: "kube-api-access-8cbwl") pod "8d900009-430c-4456-9b46-62602708c68e" (UID: "8d900009-430c-4456-9b46-62602708c68e"). InnerVolumeSpecName "kube-api-access-8cbwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:32:04 crc kubenswrapper[4971]: I0320 09:32:04.668259 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cbwl\" (UniqueName: \"kubernetes.io/projected/8d900009-430c-4456-9b46-62602708c68e-kube-api-access-8cbwl\") on node \"crc\" DevicePath \"\"" Mar 20 09:32:05 crc kubenswrapper[4971]: I0320 09:32:05.066381 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566652-77bgm" event={"ID":"8d900009-430c-4456-9b46-62602708c68e","Type":"ContainerDied","Data":"6bb5963804f6de0f9e87b4564be6d44c54c46debf2861f951cd9b5b1fa204b56"} Mar 20 09:32:05 crc kubenswrapper[4971]: I0320 09:32:05.066415 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb5963804f6de0f9e87b4564be6d44c54c46debf2861f951cd9b5b1fa204b56" Mar 20 09:32:05 crc kubenswrapper[4971]: I0320 09:32:05.066477 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566652-77bgm" Mar 20 09:32:05 crc kubenswrapper[4971]: I0320 09:32:05.492661 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566646-9dj5r"] Mar 20 09:32:05 crc kubenswrapper[4971]: I0320 09:32:05.502978 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566646-9dj5r"] Mar 20 09:32:06 crc kubenswrapper[4971]: I0320 09:32:06.750344 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b0a8320-36ab-483a-8a0b-f6b875a46c33" path="/var/lib/kubelet/pods/6b0a8320-36ab-483a-8a0b-f6b875a46c33/volumes" Mar 20 09:32:14 crc kubenswrapper[4971]: I0320 09:32:14.784794 4971 scope.go:117] "RemoveContainer" containerID="d4b01ab11970c7840ed5c669de66d21e8fe5303fa67e50de75f76c0f9baa9704" Mar 20 09:32:20 crc kubenswrapper[4971]: I0320 09:32:20.162058 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:32:20 crc kubenswrapper[4971]: I0320 09:32:20.162597 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:32:20 crc kubenswrapper[4971]: I0320 09:32:20.162677 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:32:20 crc kubenswrapper[4971]: I0320 09:32:20.163549 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:32:20 crc kubenswrapper[4971]: I0320 09:32:20.163602 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" gracePeriod=600 Mar 20 09:32:20 crc kubenswrapper[4971]: E0320 09:32:20.295645 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:32:21 crc kubenswrapper[4971]: I0320 09:32:21.228914 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" exitCode=0 Mar 20 09:32:21 crc kubenswrapper[4971]: I0320 09:32:21.228961 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365"} Mar 20 09:32:21 crc kubenswrapper[4971]: I0320 09:32:21.229004 4971 scope.go:117] "RemoveContainer" containerID="1cd20bc54dc57a6ce36976ac71cd8d2b4d0b0a4c0ce164f579ec4e73e0a60f78" Mar 20 09:32:21 crc kubenswrapper[4971]: I0320 09:32:21.229790 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:32:21 crc kubenswrapper[4971]: E0320 09:32:21.230321 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:32:32 crc kubenswrapper[4971]: I0320 09:32:32.732558 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:32:32 crc kubenswrapper[4971]: E0320 09:32:32.733535 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:32:46 crc kubenswrapper[4971]: I0320 09:32:46.733655 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:32:46 crc kubenswrapper[4971]: E0320 09:32:46.734424 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:32:59 crc kubenswrapper[4971]: I0320 09:32:59.733005 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:32:59 crc kubenswrapper[4971]: E0320 09:32:59.733942 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:33:11 crc kubenswrapper[4971]: I0320 09:33:11.732663 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:33:11 crc kubenswrapper[4971]: E0320 09:33:11.734734 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:33:16 crc kubenswrapper[4971]: I0320 09:33:16.852856 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vfltj"] Mar 20 09:33:16 crc kubenswrapper[4971]: E0320 09:33:16.853840 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d900009-430c-4456-9b46-62602708c68e" containerName="oc" Mar 20 09:33:16 crc kubenswrapper[4971]: I0320 09:33:16.853856 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d900009-430c-4456-9b46-62602708c68e" containerName="oc" Mar 20 09:33:16 crc kubenswrapper[4971]: I0320 09:33:16.854118 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d900009-430c-4456-9b46-62602708c68e" containerName="oc" Mar 20 09:33:16 crc kubenswrapper[4971]: I0320 09:33:16.855499 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:16 crc kubenswrapper[4971]: I0320 09:33:16.906920 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vfltj"] Mar 20 09:33:16 crc kubenswrapper[4971]: I0320 09:33:16.943954 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-catalog-content\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:16 crc kubenswrapper[4971]: I0320 09:33:16.944028 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v929\" (UniqueName: \"kubernetes.io/projected/180d0a35-7932-470d-bdb8-0fd279fdf831-kube-api-access-6v929\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:16 crc kubenswrapper[4971]: I0320 09:33:16.944063 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-utilities\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.046309 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-catalog-content\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.046678 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6v929\" (UniqueName: \"kubernetes.io/projected/180d0a35-7932-470d-bdb8-0fd279fdf831-kube-api-access-6v929\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.046733 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-utilities\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.046879 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-catalog-content\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.047204 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-utilities\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.066955 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v929\" (UniqueName: \"kubernetes.io/projected/180d0a35-7932-470d-bdb8-0fd279fdf831-kube-api-access-6v929\") pod \"community-operators-vfltj\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.195585 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.766157 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vfltj"] Mar 20 09:33:17 crc kubenswrapper[4971]: I0320 09:33:17.842916 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfltj" event={"ID":"180d0a35-7932-470d-bdb8-0fd279fdf831","Type":"ContainerStarted","Data":"a44be300f46b0b861b7c506f2863b866c8a8da8fe3b896ca2b0fc31f73fe75ce"} Mar 20 09:33:18 crc kubenswrapper[4971]: I0320 09:33:18.854409 4971 generic.go:334] "Generic (PLEG): container finished" podID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerID="39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff" exitCode=0 Mar 20 09:33:18 crc kubenswrapper[4971]: I0320 09:33:18.854467 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfltj" event={"ID":"180d0a35-7932-470d-bdb8-0fd279fdf831","Type":"ContainerDied","Data":"39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff"} Mar 20 09:33:19 crc kubenswrapper[4971]: I0320 09:33:19.868071 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfltj" event={"ID":"180d0a35-7932-470d-bdb8-0fd279fdf831","Type":"ContainerStarted","Data":"d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466"} Mar 20 09:33:21 crc kubenswrapper[4971]: I0320 09:33:21.889691 4971 generic.go:334] "Generic (PLEG): container finished" podID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerID="d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466" exitCode=0 Mar 20 09:33:21 crc kubenswrapper[4971]: I0320 09:33:21.889721 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfltj" 
event={"ID":"180d0a35-7932-470d-bdb8-0fd279fdf831","Type":"ContainerDied","Data":"d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466"} Mar 20 09:33:22 crc kubenswrapper[4971]: I0320 09:33:22.903135 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfltj" event={"ID":"180d0a35-7932-470d-bdb8-0fd279fdf831","Type":"ContainerStarted","Data":"299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b"} Mar 20 09:33:22 crc kubenswrapper[4971]: I0320 09:33:22.929124 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vfltj" podStartSLOduration=3.471643411 podStartE2EDuration="6.929105128s" podCreationTimestamp="2026-03-20 09:33:16 +0000 UTC" firstStartedPulling="2026-03-20 09:33:18.856357578 +0000 UTC m=+9820.836231716" lastFinishedPulling="2026-03-20 09:33:22.313819295 +0000 UTC m=+9824.293693433" observedRunningTime="2026-03-20 09:33:22.928085541 +0000 UTC m=+9824.907959699" watchObservedRunningTime="2026-03-20 09:33:22.929105128 +0000 UTC m=+9824.908979276" Mar 20 09:33:24 crc kubenswrapper[4971]: I0320 09:33:24.733228 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:33:24 crc kubenswrapper[4971]: E0320 09:33:24.734254 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:33:27 crc kubenswrapper[4971]: I0320 09:33:27.196798 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:27 crc 
kubenswrapper[4971]: I0320 09:33:27.197398 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:27 crc kubenswrapper[4971]: I0320 09:33:27.248823 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:28 crc kubenswrapper[4971]: I0320 09:33:28.013677 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:28 crc kubenswrapper[4971]: I0320 09:33:28.065697 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vfltj"] Mar 20 09:33:29 crc kubenswrapper[4971]: I0320 09:33:29.987652 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vfltj" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerName="registry-server" containerID="cri-o://299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b" gracePeriod=2 Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.439487 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.546908 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-catalog-content\") pod \"180d0a35-7932-470d-bdb8-0fd279fdf831\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.547021 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-utilities\") pod \"180d0a35-7932-470d-bdb8-0fd279fdf831\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.547130 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v929\" (UniqueName: \"kubernetes.io/projected/180d0a35-7932-470d-bdb8-0fd279fdf831-kube-api-access-6v929\") pod \"180d0a35-7932-470d-bdb8-0fd279fdf831\" (UID: \"180d0a35-7932-470d-bdb8-0fd279fdf831\") " Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.548025 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-utilities" (OuterVolumeSpecName: "utilities") pod "180d0a35-7932-470d-bdb8-0fd279fdf831" (UID: "180d0a35-7932-470d-bdb8-0fd279fdf831"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.548400 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.553745 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180d0a35-7932-470d-bdb8-0fd279fdf831-kube-api-access-6v929" (OuterVolumeSpecName: "kube-api-access-6v929") pod "180d0a35-7932-470d-bdb8-0fd279fdf831" (UID: "180d0a35-7932-470d-bdb8-0fd279fdf831"). InnerVolumeSpecName "kube-api-access-6v929". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.623512 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "180d0a35-7932-470d-bdb8-0fd279fdf831" (UID: "180d0a35-7932-470d-bdb8-0fd279fdf831"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.650444 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/180d0a35-7932-470d-bdb8-0fd279fdf831-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.650483 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v929\" (UniqueName: \"kubernetes.io/projected/180d0a35-7932-470d-bdb8-0fd279fdf831-kube-api-access-6v929\") on node \"crc\" DevicePath \"\"" Mar 20 09:33:30 crc kubenswrapper[4971]: I0320 09:33:30.998038 4971 generic.go:334] "Generic (PLEG): container finished" podID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerID="299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b" exitCode=0 Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.000721 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfltj" event={"ID":"180d0a35-7932-470d-bdb8-0fd279fdf831","Type":"ContainerDied","Data":"299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b"} Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.000783 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfltj" event={"ID":"180d0a35-7932-470d-bdb8-0fd279fdf831","Type":"ContainerDied","Data":"a44be300f46b0b861b7c506f2863b866c8a8da8fe3b896ca2b0fc31f73fe75ce"} Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.000809 4971 scope.go:117] "RemoveContainer" containerID="299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.001041 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vfltj" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.021593 4971 scope.go:117] "RemoveContainer" containerID="d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.029497 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vfltj"] Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.040268 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vfltj"] Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.044515 4971 scope.go:117] "RemoveContainer" containerID="39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.091501 4971 scope.go:117] "RemoveContainer" containerID="299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b" Mar 20 09:33:31 crc kubenswrapper[4971]: E0320 09:33:31.092001 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b\": container with ID starting with 299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b not found: ID does not exist" containerID="299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.092041 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b"} err="failed to get container status \"299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b\": rpc error: code = NotFound desc = could not find container \"299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b\": container with ID starting with 299de98b9db36781dc9626e5d75ee61d67f92f7dd54c21b13fbbb7f4a7e1b07b not 
found: ID does not exist" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.092067 4971 scope.go:117] "RemoveContainer" containerID="d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466" Mar 20 09:33:31 crc kubenswrapper[4971]: E0320 09:33:31.092366 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466\": container with ID starting with d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466 not found: ID does not exist" containerID="d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.092410 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466"} err="failed to get container status \"d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466\": rpc error: code = NotFound desc = could not find container \"d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466\": container with ID starting with d7ccf826fb010e5b280ef7f7bb0b3a07b7921f975fc6aa53423587901e3c3466 not found: ID does not exist" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.092437 4971 scope.go:117] "RemoveContainer" containerID="39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff" Mar 20 09:33:31 crc kubenswrapper[4971]: E0320 09:33:31.092683 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff\": container with ID starting with 39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff not found: ID does not exist" containerID="39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff" Mar 20 09:33:31 crc kubenswrapper[4971]: I0320 09:33:31.092717 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff"} err="failed to get container status \"39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff\": rpc error: code = NotFound desc = could not find container \"39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff\": container with ID starting with 39dbe41398dbc178abdedc7550c58c872820baa2821678f654700efb9b921dff not found: ID does not exist" Mar 20 09:33:32 crc kubenswrapper[4971]: I0320 09:33:32.745326 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" path="/var/lib/kubelet/pods/180d0a35-7932-470d-bdb8-0fd279fdf831/volumes" Mar 20 09:33:36 crc kubenswrapper[4971]: I0320 09:33:36.732913 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:33:36 crc kubenswrapper[4971]: E0320 09:33:36.734002 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:33:48 crc kubenswrapper[4971]: I0320 09:33:48.742996 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:33:48 crc kubenswrapper[4971]: E0320 09:33:48.743806 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.152308 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566654-gknz7"] Mar 20 09:34:00 crc kubenswrapper[4971]: E0320 09:34:00.153452 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerName="extract-content" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.153471 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerName="extract-content" Mar 20 09:34:00 crc kubenswrapper[4971]: E0320 09:34:00.153503 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerName="registry-server" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.153512 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerName="registry-server" Mar 20 09:34:00 crc kubenswrapper[4971]: E0320 09:34:00.153525 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerName="extract-utilities" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.153535 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerName="extract-utilities" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.153850 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="180d0a35-7932-470d-bdb8-0fd279fdf831" containerName="registry-server" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.154839 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566654-gknz7" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.157549 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.157727 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.158217 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.163459 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566654-gknz7"] Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.343351 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kqwd\" (UniqueName: \"kubernetes.io/projected/02aa84e2-acb0-43df-ae96-cca603f5bbed-kube-api-access-2kqwd\") pod \"auto-csr-approver-29566654-gknz7\" (UID: \"02aa84e2-acb0-43df-ae96-cca603f5bbed\") " pod="openshift-infra/auto-csr-approver-29566654-gknz7" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.445561 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kqwd\" (UniqueName: \"kubernetes.io/projected/02aa84e2-acb0-43df-ae96-cca603f5bbed-kube-api-access-2kqwd\") pod \"auto-csr-approver-29566654-gknz7\" (UID: \"02aa84e2-acb0-43df-ae96-cca603f5bbed\") " pod="openshift-infra/auto-csr-approver-29566654-gknz7" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.470319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kqwd\" (UniqueName: \"kubernetes.io/projected/02aa84e2-acb0-43df-ae96-cca603f5bbed-kube-api-access-2kqwd\") pod \"auto-csr-approver-29566654-gknz7\" (UID: \"02aa84e2-acb0-43df-ae96-cca603f5bbed\") " 
pod="openshift-infra/auto-csr-approver-29566654-gknz7" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.480080 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566654-gknz7" Mar 20 09:34:00 crc kubenswrapper[4971]: I0320 09:34:00.914351 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566654-gknz7"] Mar 20 09:34:01 crc kubenswrapper[4971]: I0320 09:34:01.348499 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566654-gknz7" event={"ID":"02aa84e2-acb0-43df-ae96-cca603f5bbed","Type":"ContainerStarted","Data":"d500214c4292f1c522892932b22d3b686cdd77ed0b6c1b230dd711a4504e7c8f"} Mar 20 09:34:02 crc kubenswrapper[4971]: I0320 09:34:02.358226 4971 generic.go:334] "Generic (PLEG): container finished" podID="02aa84e2-acb0-43df-ae96-cca603f5bbed" containerID="b3277569a2bdf8b887b39f1f01a939b63f6f95d5e634297cf53c1fc2a27de2b4" exitCode=0 Mar 20 09:34:02 crc kubenswrapper[4971]: I0320 09:34:02.358297 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566654-gknz7" event={"ID":"02aa84e2-acb0-43df-ae96-cca603f5bbed","Type":"ContainerDied","Data":"b3277569a2bdf8b887b39f1f01a939b63f6f95d5e634297cf53c1fc2a27de2b4"} Mar 20 09:34:03 crc kubenswrapper[4971]: I0320 09:34:03.732598 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:34:03 crc kubenswrapper[4971]: E0320 09:34:03.733431 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" 
Mar 20 09:34:03 crc kubenswrapper[4971]: I0320 09:34:03.762096 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566654-gknz7" Mar 20 09:34:03 crc kubenswrapper[4971]: I0320 09:34:03.818229 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kqwd\" (UniqueName: \"kubernetes.io/projected/02aa84e2-acb0-43df-ae96-cca603f5bbed-kube-api-access-2kqwd\") pod \"02aa84e2-acb0-43df-ae96-cca603f5bbed\" (UID: \"02aa84e2-acb0-43df-ae96-cca603f5bbed\") " Mar 20 09:34:03 crc kubenswrapper[4971]: I0320 09:34:03.847702 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02aa84e2-acb0-43df-ae96-cca603f5bbed-kube-api-access-2kqwd" (OuterVolumeSpecName: "kube-api-access-2kqwd") pod "02aa84e2-acb0-43df-ae96-cca603f5bbed" (UID: "02aa84e2-acb0-43df-ae96-cca603f5bbed"). InnerVolumeSpecName "kube-api-access-2kqwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:34:03 crc kubenswrapper[4971]: I0320 09:34:03.920509 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kqwd\" (UniqueName: \"kubernetes.io/projected/02aa84e2-acb0-43df-ae96-cca603f5bbed-kube-api-access-2kqwd\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:04 crc kubenswrapper[4971]: I0320 09:34:04.380559 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566654-gknz7" event={"ID":"02aa84e2-acb0-43df-ae96-cca603f5bbed","Type":"ContainerDied","Data":"d500214c4292f1c522892932b22d3b686cdd77ed0b6c1b230dd711a4504e7c8f"} Mar 20 09:34:04 crc kubenswrapper[4971]: I0320 09:34:04.380924 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d500214c4292f1c522892932b22d3b686cdd77ed0b6c1b230dd711a4504e7c8f" Mar 20 09:34:04 crc kubenswrapper[4971]: I0320 09:34:04.380837 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566654-gknz7" Mar 20 09:34:04 crc kubenswrapper[4971]: I0320 09:34:04.839580 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566648-vqnnf"] Mar 20 09:34:04 crc kubenswrapper[4971]: I0320 09:34:04.849863 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566648-vqnnf"] Mar 20 09:34:06 crc kubenswrapper[4971]: I0320 09:34:06.747747 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44" path="/var/lib/kubelet/pods/2f7978eb-a8d6-41c0-9ca6-555bc5fb3d44/volumes" Mar 20 09:34:14 crc kubenswrapper[4971]: I0320 09:34:14.732739 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:34:14 crc kubenswrapper[4971]: E0320 09:34:14.733431 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:34:14 crc kubenswrapper[4971]: I0320 09:34:14.895798 4971 scope.go:117] "RemoveContainer" containerID="51fa867ad5bf53c3061d01a4f9be0ac8b7230ba30b8b9b9c9910d89f2a285d0f" Mar 20 09:34:25 crc kubenswrapper[4971]: I0320 09:34:25.735840 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:34:25 crc kubenswrapper[4971]: E0320 09:34:25.736723 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:34:40 crc kubenswrapper[4971]: I0320 09:34:40.733464 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:34:40 crc kubenswrapper[4971]: E0320 09:34:40.735057 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:34:52 crc kubenswrapper[4971]: I0320 09:34:52.732769 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:34:52 crc kubenswrapper[4971]: E0320 09:34:52.733507 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:35:03 crc kubenswrapper[4971]: I0320 09:35:03.732806 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:35:03 crc kubenswrapper[4971]: E0320 09:35:03.733538 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:35:18 crc kubenswrapper[4971]: I0320 09:35:18.741011 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:35:18 crc kubenswrapper[4971]: E0320 09:35:18.742773 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:35:33 crc kubenswrapper[4971]: I0320 09:35:33.732955 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:35:33 crc kubenswrapper[4971]: E0320 09:35:33.733914 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:35:49 crc kubenswrapper[4971]: I0320 09:35:49.732447 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:35:49 crc kubenswrapper[4971]: E0320 09:35:49.733234 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.173590 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566656-m67l8"] Mar 20 09:36:00 crc kubenswrapper[4971]: E0320 09:36:00.175306 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02aa84e2-acb0-43df-ae96-cca603f5bbed" containerName="oc" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.175321 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="02aa84e2-acb0-43df-ae96-cca603f5bbed" containerName="oc" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.175506 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="02aa84e2-acb0-43df-ae96-cca603f5bbed" containerName="oc" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.176225 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566656-m67l8" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.179101 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.179229 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.183047 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.211082 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566656-m67l8"] Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.219556 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt8jx\" (UniqueName: \"kubernetes.io/projected/95ce3374-468c-484b-ad4e-8504c953be9e-kube-api-access-mt8jx\") pod \"auto-csr-approver-29566656-m67l8\" (UID: \"95ce3374-468c-484b-ad4e-8504c953be9e\") " pod="openshift-infra/auto-csr-approver-29566656-m67l8" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.322242 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt8jx\" (UniqueName: \"kubernetes.io/projected/95ce3374-468c-484b-ad4e-8504c953be9e-kube-api-access-mt8jx\") pod \"auto-csr-approver-29566656-m67l8\" (UID: \"95ce3374-468c-484b-ad4e-8504c953be9e\") " pod="openshift-infra/auto-csr-approver-29566656-m67l8" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.361488 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt8jx\" (UniqueName: \"kubernetes.io/projected/95ce3374-468c-484b-ad4e-8504c953be9e-kube-api-access-mt8jx\") pod \"auto-csr-approver-29566656-m67l8\" (UID: \"95ce3374-468c-484b-ad4e-8504c953be9e\") " 
pod="openshift-infra/auto-csr-approver-29566656-m67l8" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.511468 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566656-m67l8" Mar 20 09:36:00 crc kubenswrapper[4971]: I0320 09:36:00.998655 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566656-m67l8"] Mar 20 09:36:01 crc kubenswrapper[4971]: I0320 09:36:01.618362 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566656-m67l8" event={"ID":"95ce3374-468c-484b-ad4e-8504c953be9e","Type":"ContainerStarted","Data":"ab04802238ac6083f1994ab6509ed78a58ed727a204d99875f3adb82f6db15b6"} Mar 20 09:36:02 crc kubenswrapper[4971]: I0320 09:36:02.732318 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:36:02 crc kubenswrapper[4971]: E0320 09:36:02.732896 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:36:03 crc kubenswrapper[4971]: I0320 09:36:03.635351 4971 generic.go:334] "Generic (PLEG): container finished" podID="95ce3374-468c-484b-ad4e-8504c953be9e" containerID="1942874c5f0b75db1f3528d2bfd046939bb803299a577c2d2198d3f942b5c31e" exitCode=0 Mar 20 09:36:03 crc kubenswrapper[4971]: I0320 09:36:03.635413 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566656-m67l8" event={"ID":"95ce3374-468c-484b-ad4e-8504c953be9e","Type":"ContainerDied","Data":"1942874c5f0b75db1f3528d2bfd046939bb803299a577c2d2198d3f942b5c31e"} 
Mar 20 09:36:05 crc kubenswrapper[4971]: I0320 09:36:05.039090 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566656-m67l8" Mar 20 09:36:05 crc kubenswrapper[4971]: I0320 09:36:05.118525 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt8jx\" (UniqueName: \"kubernetes.io/projected/95ce3374-468c-484b-ad4e-8504c953be9e-kube-api-access-mt8jx\") pod \"95ce3374-468c-484b-ad4e-8504c953be9e\" (UID: \"95ce3374-468c-484b-ad4e-8504c953be9e\") " Mar 20 09:36:05 crc kubenswrapper[4971]: I0320 09:36:05.125265 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ce3374-468c-484b-ad4e-8504c953be9e-kube-api-access-mt8jx" (OuterVolumeSpecName: "kube-api-access-mt8jx") pod "95ce3374-468c-484b-ad4e-8504c953be9e" (UID: "95ce3374-468c-484b-ad4e-8504c953be9e"). InnerVolumeSpecName "kube-api-access-mt8jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:36:05 crc kubenswrapper[4971]: I0320 09:36:05.220820 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt8jx\" (UniqueName: \"kubernetes.io/projected/95ce3374-468c-484b-ad4e-8504c953be9e-kube-api-access-mt8jx\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:05 crc kubenswrapper[4971]: I0320 09:36:05.655679 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566656-m67l8" event={"ID":"95ce3374-468c-484b-ad4e-8504c953be9e","Type":"ContainerDied","Data":"ab04802238ac6083f1994ab6509ed78a58ed727a204d99875f3adb82f6db15b6"} Mar 20 09:36:05 crc kubenswrapper[4971]: I0320 09:36:05.655718 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab04802238ac6083f1994ab6509ed78a58ed727a204d99875f3adb82f6db15b6" Mar 20 09:36:05 crc kubenswrapper[4971]: I0320 09:36:05.655755 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566656-m67l8" Mar 20 09:36:06 crc kubenswrapper[4971]: I0320 09:36:06.107850 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566650-st8vq"] Mar 20 09:36:06 crc kubenswrapper[4971]: I0320 09:36:06.122911 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566650-st8vq"] Mar 20 09:36:06 crc kubenswrapper[4971]: I0320 09:36:06.745755 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e47cf8-974a-4ba1-a399-855e87aaa553" path="/var/lib/kubelet/pods/93e47cf8-974a-4ba1-a399-855e87aaa553/volumes" Mar 20 09:36:11 crc kubenswrapper[4971]: I0320 09:36:11.852705 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fslfw"] Mar 20 09:36:11 crc kubenswrapper[4971]: E0320 09:36:11.853597 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ce3374-468c-484b-ad4e-8504c953be9e" containerName="oc" Mar 20 09:36:11 crc kubenswrapper[4971]: I0320 09:36:11.854182 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ce3374-468c-484b-ad4e-8504c953be9e" containerName="oc" Mar 20 09:36:11 crc kubenswrapper[4971]: I0320 09:36:11.854423 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ce3374-468c-484b-ad4e-8504c953be9e" containerName="oc" Mar 20 09:36:11 crc kubenswrapper[4971]: I0320 09:36:11.856079 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:11 crc kubenswrapper[4971]: I0320 09:36:11.868511 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fslfw"] Mar 20 09:36:11 crc kubenswrapper[4971]: I0320 09:36:11.951283 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-utilities\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:11 crc kubenswrapper[4971]: I0320 09:36:11.951469 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-catalog-content\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:11 crc kubenswrapper[4971]: I0320 09:36:11.951518 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svltx\" (UniqueName: \"kubernetes.io/projected/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-kube-api-access-svltx\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:12 crc kubenswrapper[4971]: I0320 09:36:12.054830 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-catalog-content\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:12 crc kubenswrapper[4971]: I0320 09:36:12.054900 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-svltx\" (UniqueName: \"kubernetes.io/projected/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-kube-api-access-svltx\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:12 crc kubenswrapper[4971]: I0320 09:36:12.055010 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-utilities\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:12 crc kubenswrapper[4971]: I0320 09:36:12.055304 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-catalog-content\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:12 crc kubenswrapper[4971]: I0320 09:36:12.055496 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-utilities\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:12 crc kubenswrapper[4971]: I0320 09:36:12.078708 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svltx\" (UniqueName: \"kubernetes.io/projected/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-kube-api-access-svltx\") pod \"certified-operators-fslfw\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:12 crc kubenswrapper[4971]: I0320 09:36:12.176697 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:12 crc kubenswrapper[4971]: I0320 09:36:12.720270 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fslfw"] Mar 20 09:36:13 crc kubenswrapper[4971]: I0320 09:36:13.726304 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerID="33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20" exitCode=0 Mar 20 09:36:13 crc kubenswrapper[4971]: I0320 09:36:13.726364 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fslfw" event={"ID":"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf","Type":"ContainerDied","Data":"33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20"} Mar 20 09:36:13 crc kubenswrapper[4971]: I0320 09:36:13.726546 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fslfw" event={"ID":"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf","Type":"ContainerStarted","Data":"b93312ab88af9c9d4953aea7a16b716f12e107d8faa46cd4dd35e8709c1442d4"} Mar 20 09:36:14 crc kubenswrapper[4971]: I0320 09:36:14.775218 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fslfw" event={"ID":"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf","Type":"ContainerStarted","Data":"e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247"} Mar 20 09:36:14 crc kubenswrapper[4971]: I0320 09:36:14.987341 4971 scope.go:117] "RemoveContainer" containerID="83fec366fe56e7bff356175bc43468143027d156d6abb72288755ec7eaf0b54c" Mar 20 09:36:16 crc kubenswrapper[4971]: I0320 09:36:16.761636 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerID="e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247" exitCode=0 Mar 20 09:36:16 crc kubenswrapper[4971]: I0320 09:36:16.761673 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fslfw" event={"ID":"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf","Type":"ContainerDied","Data":"e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247"} Mar 20 09:36:17 crc kubenswrapper[4971]: I0320 09:36:17.732261 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:36:17 crc kubenswrapper[4971]: E0320 09:36:17.733022 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:36:17 crc kubenswrapper[4971]: I0320 09:36:17.771734 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fslfw" event={"ID":"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf","Type":"ContainerStarted","Data":"c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a"} Mar 20 09:36:17 crc kubenswrapper[4971]: I0320 09:36:17.827771 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fslfw" podStartSLOduration=3.393680589 podStartE2EDuration="6.827745374s" podCreationTimestamp="2026-03-20 09:36:11 +0000 UTC" firstStartedPulling="2026-03-20 09:36:13.728212822 +0000 UTC m=+9995.708086960" lastFinishedPulling="2026-03-20 09:36:17.162277607 +0000 UTC m=+9999.142151745" observedRunningTime="2026-03-20 09:36:17.819567206 +0000 UTC m=+9999.799441344" watchObservedRunningTime="2026-03-20 09:36:17.827745374 +0000 UTC m=+9999.807619512" Mar 20 09:36:22 crc kubenswrapper[4971]: I0320 09:36:22.177422 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:22 crc kubenswrapper[4971]: I0320 09:36:22.178969 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:22 crc kubenswrapper[4971]: I0320 09:36:22.230961 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:22 crc kubenswrapper[4971]: I0320 09:36:22.870987 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:22 crc kubenswrapper[4971]: I0320 09:36:22.941781 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fslfw"] Mar 20 09:36:24 crc kubenswrapper[4971]: I0320 09:36:24.837113 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fslfw" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerName="registry-server" containerID="cri-o://c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a" gracePeriod=2 Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.407435 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.566139 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svltx\" (UniqueName: \"kubernetes.io/projected/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-kube-api-access-svltx\") pod \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.566235 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-catalog-content\") pod \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.566422 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-utilities\") pod \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\" (UID: \"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf\") " Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.567641 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-utilities" (OuterVolumeSpecName: "utilities") pod "b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" (UID: "b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.576352 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-kube-api-access-svltx" (OuterVolumeSpecName: "kube-api-access-svltx") pod "b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" (UID: "b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf"). InnerVolumeSpecName "kube-api-access-svltx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.621247 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" (UID: "b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.668901 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.668940 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svltx\" (UniqueName: \"kubernetes.io/projected/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-kube-api-access-svltx\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.668952 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.853931 4971 generic.go:334] "Generic (PLEG): container finished" podID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerID="c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a" exitCode=0 Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.854099 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fslfw" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.854127 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fslfw" event={"ID":"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf","Type":"ContainerDied","Data":"c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a"} Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.854763 4971 scope.go:117] "RemoveContainer" containerID="c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.854990 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fslfw" event={"ID":"b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf","Type":"ContainerDied","Data":"b93312ab88af9c9d4953aea7a16b716f12e107d8faa46cd4dd35e8709c1442d4"} Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.884088 4971 scope.go:117] "RemoveContainer" containerID="e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247" Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.903019 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fslfw"] Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.912936 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fslfw"] Mar 20 09:36:25 crc kubenswrapper[4971]: I0320 09:36:25.918008 4971 scope.go:117] "RemoveContainer" containerID="33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20" Mar 20 09:36:26 crc kubenswrapper[4971]: I0320 09:36:26.005175 4971 scope.go:117] "RemoveContainer" containerID="c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a" Mar 20 09:36:26 crc kubenswrapper[4971]: E0320 09:36:26.005716 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a\": container with ID starting with c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a not found: ID does not exist" containerID="c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a" Mar 20 09:36:26 crc kubenswrapper[4971]: I0320 09:36:26.005743 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a"} err="failed to get container status \"c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a\": rpc error: code = NotFound desc = could not find container \"c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a\": container with ID starting with c2f418dccdd09d7a128e37256bd803f3b2629ec6fe9fc9d9bec8fdde10e2224a not found: ID does not exist" Mar 20 09:36:26 crc kubenswrapper[4971]: I0320 09:36:26.005763 4971 scope.go:117] "RemoveContainer" containerID="e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247" Mar 20 09:36:26 crc kubenswrapper[4971]: E0320 09:36:26.006263 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247\": container with ID starting with e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247 not found: ID does not exist" containerID="e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247" Mar 20 09:36:26 crc kubenswrapper[4971]: I0320 09:36:26.006318 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247"} err="failed to get container status \"e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247\": rpc error: code = NotFound desc = could not find container \"e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247\": container with ID 
starting with e1308824aa27e082e0071bbe19a47fe73aea7073abea9f1dc69be8bd14742247 not found: ID does not exist" Mar 20 09:36:26 crc kubenswrapper[4971]: I0320 09:36:26.006353 4971 scope.go:117] "RemoveContainer" containerID="33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20" Mar 20 09:36:26 crc kubenswrapper[4971]: E0320 09:36:26.006921 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20\": container with ID starting with 33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20 not found: ID does not exist" containerID="33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20" Mar 20 09:36:26 crc kubenswrapper[4971]: I0320 09:36:26.006951 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20"} err="failed to get container status \"33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20\": rpc error: code = NotFound desc = could not find container \"33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20\": container with ID starting with 33075fa6abc6b523e83befd1571b80d21588b465ad6564beb04871fd23c37c20 not found: ID does not exist" Mar 20 09:36:26 crc kubenswrapper[4971]: I0320 09:36:26.750019 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" path="/var/lib/kubelet/pods/b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf/volumes" Mar 20 09:36:30 crc kubenswrapper[4971]: I0320 09:36:30.733031 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:36:30 crc kubenswrapper[4971]: E0320 09:36:30.734661 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:36:40 crc kubenswrapper[4971]: I0320 09:36:40.005185 4971 generic.go:334] "Generic (PLEG): container finished" podID="4fc44776-0838-4b1a-844c-a16ffd83af56" containerID="ba20f12233759169018c873477fc3b92185267ec4d38c76c3335c6848d4a5034" exitCode=0 Mar 20 09:36:40 crc kubenswrapper[4971]: I0320 09:36:40.005279 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" event={"ID":"4fc44776-0838-4b1a-844c-a16ffd83af56","Type":"ContainerDied","Data":"ba20f12233759169018c873477fc3b92185267ec4d38c76c3335c6848d4a5034"} Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.528948 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.715530 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-inventory\") pod \"4fc44776-0838-4b1a-844c-a16ffd83af56\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.715927 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-telemetry-combined-ca-bundle\") pod \"4fc44776-0838-4b1a-844c-a16ffd83af56\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.715978 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-0\") pod \"4fc44776-0838-4b1a-844c-a16ffd83af56\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.716006 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-2\") pod \"4fc44776-0838-4b1a-844c-a16ffd83af56\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.716070 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceph\") pod \"4fc44776-0838-4b1a-844c-a16ffd83af56\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.716090 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9krp\" (UniqueName: \"kubernetes.io/projected/4fc44776-0838-4b1a-844c-a16ffd83af56-kube-api-access-x9krp\") pod \"4fc44776-0838-4b1a-844c-a16ffd83af56\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.716258 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-1\") pod \"4fc44776-0838-4b1a-844c-a16ffd83af56\" (UID: \"4fc44776-0838-4b1a-844c-a16ffd83af56\") " Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.716305 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ssh-key-openstack-cell1\") pod \"4fc44776-0838-4b1a-844c-a16ffd83af56\" (UID: 
\"4fc44776-0838-4b1a-844c-a16ffd83af56\") " Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.721930 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceph" (OuterVolumeSpecName: "ceph") pod "4fc44776-0838-4b1a-844c-a16ffd83af56" (UID: "4fc44776-0838-4b1a-844c-a16ffd83af56"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.722967 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc44776-0838-4b1a-844c-a16ffd83af56-kube-api-access-x9krp" (OuterVolumeSpecName: "kube-api-access-x9krp") pod "4fc44776-0838-4b1a-844c-a16ffd83af56" (UID: "4fc44776-0838-4b1a-844c-a16ffd83af56"). InnerVolumeSpecName "kube-api-access-x9krp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.726783 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4fc44776-0838-4b1a-844c-a16ffd83af56" (UID: "4fc44776-0838-4b1a-844c-a16ffd83af56"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.746013 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4fc44776-0838-4b1a-844c-a16ffd83af56" (UID: "4fc44776-0838-4b1a-844c-a16ffd83af56"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.746781 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4fc44776-0838-4b1a-844c-a16ffd83af56" (UID: "4fc44776-0838-4b1a-844c-a16ffd83af56"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.747760 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-inventory" (OuterVolumeSpecName: "inventory") pod "4fc44776-0838-4b1a-844c-a16ffd83af56" (UID: "4fc44776-0838-4b1a-844c-a16ffd83af56"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.772963 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4fc44776-0838-4b1a-844c-a16ffd83af56" (UID: "4fc44776-0838-4b1a-844c-a16ffd83af56"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.777307 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "4fc44776-0838-4b1a-844c-a16ffd83af56" (UID: "4fc44776-0838-4b1a-844c-a16ffd83af56"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.819794 4971 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.819837 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.819852 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.819862 4971 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.819876 4971 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.819888 4971 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.819934 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc44776-0838-4b1a-844c-a16ffd83af56-ceph\") 
on node \"crc\" DevicePath \"\"" Mar 20 09:36:41 crc kubenswrapper[4971]: I0320 09:36:41.819946 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9krp\" (UniqueName: \"kubernetes.io/projected/4fc44776-0838-4b1a-844c-a16ffd83af56-kube-api-access-x9krp\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.027702 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" event={"ID":"4fc44776-0838-4b1a-844c-a16ffd83af56","Type":"ContainerDied","Data":"f008cc1efce853aaa7bf5e9496b5ffe58aba768546f25be3108ae78654399f82"} Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.027975 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f008cc1efce853aaa7bf5e9496b5ffe58aba768546f25be3108ae78654399f82" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.027831 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-cjr5v" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.168384 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-zdbn9"] Mar 20 09:36:42 crc kubenswrapper[4971]: E0320 09:36:42.168953 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerName="extract-content" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.168977 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerName="extract-content" Mar 20 09:36:42 crc kubenswrapper[4971]: E0320 09:36:42.168991 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc44776-0838-4b1a-844c-a16ffd83af56" containerName="telemetry-openstack-openstack-cell1" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.169000 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4fc44776-0838-4b1a-844c-a16ffd83af56" containerName="telemetry-openstack-openstack-cell1" Mar 20 09:36:42 crc kubenswrapper[4971]: E0320 09:36:42.169016 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerName="registry-server" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.169027 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerName="registry-server" Mar 20 09:36:42 crc kubenswrapper[4971]: E0320 09:36:42.169064 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerName="extract-utilities" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.169073 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerName="extract-utilities" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.169334 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc44776-0838-4b1a-844c-a16ffd83af56" containerName="telemetry-openstack-openstack-cell1" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.169365 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9cfbbf6-37b3-4959-bed8-9bf5bc4695cf" containerName="registry-server" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.170241 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.172806 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.174164 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.174388 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.174490 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.174509 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.181504 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-zdbn9"] Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.331459 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.331714 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hph2r\" (UniqueName: \"kubernetes.io/projected/da19c2aa-8814-45f3-8f80-f9aebff030fd-kube-api-access-hph2r\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.331745 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.331767 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.331788 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.331838 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.434982 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-hph2r\" (UniqueName: \"kubernetes.io/projected/da19c2aa-8814-45f3-8f80-f9aebff030fd-kube-api-access-hph2r\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.435059 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.435106 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.435140 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.435208 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.435297 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.732731 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:36:42 crc kubenswrapper[4971]: E0320 09:36:42.733429 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.801357 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.806302 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.806750 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.808175 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.809173 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hph2r\" (UniqueName: \"kubernetes.io/projected/da19c2aa-8814-45f3-8f80-f9aebff030fd-kube-api-access-hph2r\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.814266 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-zdbn9\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:42 crc kubenswrapper[4971]: I0320 09:36:42.819804 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:36:43 crc kubenswrapper[4971]: I0320 09:36:43.441627 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-zdbn9"] Mar 20 09:36:43 crc kubenswrapper[4971]: I0320 09:36:43.441858 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:36:44 crc kubenswrapper[4971]: I0320 09:36:44.060317 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" event={"ID":"da19c2aa-8814-45f3-8f80-f9aebff030fd","Type":"ContainerStarted","Data":"b69120051d116db3e3ec1dfb294df6fdb1d8fba6021422e8bdfe749c7562a446"} Mar 20 09:36:45 crc kubenswrapper[4971]: I0320 09:36:45.073355 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" event={"ID":"da19c2aa-8814-45f3-8f80-f9aebff030fd","Type":"ContainerStarted","Data":"7336abe89f53f1d3f0710f955c962a92bf225d2e723f738478f65e169ab50b94"} Mar 20 09:36:45 crc kubenswrapper[4971]: I0320 09:36:45.099688 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" podStartSLOduration=2.448737108 podStartE2EDuration="3.099665838s" podCreationTimestamp="2026-03-20 09:36:42 +0000 UTC" firstStartedPulling="2026-03-20 09:36:43.441571895 +0000 UTC m=+10025.421446033" lastFinishedPulling="2026-03-20 09:36:44.092500615 +0000 UTC m=+10026.072374763" observedRunningTime="2026-03-20 09:36:45.08809786 +0000 UTC m=+10027.067972018" watchObservedRunningTime="2026-03-20 09:36:45.099665838 +0000 UTC m=+10027.079539976" Mar 20 09:36:57 crc kubenswrapper[4971]: I0320 09:36:57.732195 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:36:57 crc kubenswrapper[4971]: E0320 09:36:57.734181 4971 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:37:11 crc kubenswrapper[4971]: I0320 09:37:11.732767 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:37:11 crc kubenswrapper[4971]: E0320 09:37:11.733862 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:37:26 crc kubenswrapper[4971]: I0320 09:37:26.732633 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:37:27 crc kubenswrapper[4971]: I0320 09:37:27.539414 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"c432cb1c18813ff52f5f6bb68ab9bae999208871fd3cdfc75264abedf55542dd"} Mar 20 09:37:35 crc kubenswrapper[4971]: I0320 09:37:35.621070 4971 generic.go:334] "Generic (PLEG): container finished" podID="da19c2aa-8814-45f3-8f80-f9aebff030fd" containerID="7336abe89f53f1d3f0710f955c962a92bf225d2e723f738478f65e169ab50b94" exitCode=0 Mar 20 09:37:35 crc kubenswrapper[4971]: I0320 09:37:35.621133 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" event={"ID":"da19c2aa-8814-45f3-8f80-f9aebff030fd","Type":"ContainerDied","Data":"7336abe89f53f1d3f0710f955c962a92bf225d2e723f738478f65e169ab50b94"} Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.117086 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.299234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ceph\") pod \"da19c2aa-8814-45f3-8f80-f9aebff030fd\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.299372 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-inventory\") pod \"da19c2aa-8814-45f3-8f80-f9aebff030fd\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.299406 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ssh-key-openstack-cell1\") pod \"da19c2aa-8814-45f3-8f80-f9aebff030fd\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.299580 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-combined-ca-bundle\") pod \"da19c2aa-8814-45f3-8f80-f9aebff030fd\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.299650 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-agent-neutron-config-0\") pod \"da19c2aa-8814-45f3-8f80-f9aebff030fd\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.299779 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hph2r\" (UniqueName: \"kubernetes.io/projected/da19c2aa-8814-45f3-8f80-f9aebff030fd-kube-api-access-hph2r\") pod \"da19c2aa-8814-45f3-8f80-f9aebff030fd\" (UID: \"da19c2aa-8814-45f3-8f80-f9aebff030fd\") " Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.309876 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ceph" (OuterVolumeSpecName: "ceph") pod "da19c2aa-8814-45f3-8f80-f9aebff030fd" (UID: "da19c2aa-8814-45f3-8f80-f9aebff030fd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.310043 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da19c2aa-8814-45f3-8f80-f9aebff030fd-kube-api-access-hph2r" (OuterVolumeSpecName: "kube-api-access-hph2r") pod "da19c2aa-8814-45f3-8f80-f9aebff030fd" (UID: "da19c2aa-8814-45f3-8f80-f9aebff030fd"). InnerVolumeSpecName "kube-api-access-hph2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.309901 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "da19c2aa-8814-45f3-8f80-f9aebff030fd" (UID: "da19c2aa-8814-45f3-8f80-f9aebff030fd"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.325440 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-inventory" (OuterVolumeSpecName: "inventory") pod "da19c2aa-8814-45f3-8f80-f9aebff030fd" (UID: "da19c2aa-8814-45f3-8f80-f9aebff030fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.325940 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "da19c2aa-8814-45f3-8f80-f9aebff030fd" (UID: "da19c2aa-8814-45f3-8f80-f9aebff030fd"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.331332 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "da19c2aa-8814-45f3-8f80-f9aebff030fd" (UID: "da19c2aa-8814-45f3-8f80-f9aebff030fd"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.402933 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.402994 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.403007 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hph2r\" (UniqueName: \"kubernetes.io/projected/da19c2aa-8814-45f3-8f80-f9aebff030fd-kube-api-access-hph2r\") on node \"crc\" DevicePath \"\"" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.403019 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.403031 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.403061 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/da19c2aa-8814-45f3-8f80-f9aebff030fd-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.649740 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" 
event={"ID":"da19c2aa-8814-45f3-8f80-f9aebff030fd","Type":"ContainerDied","Data":"b69120051d116db3e3ec1dfb294df6fdb1d8fba6021422e8bdfe749c7562a446"} Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.650103 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69120051d116db3e3ec1dfb294df6fdb1d8fba6021422e8bdfe749c7562a446" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.649788 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-zdbn9" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.769199 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh"] Mar 20 09:37:37 crc kubenswrapper[4971]: E0320 09:37:37.769855 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da19c2aa-8814-45f3-8f80-f9aebff030fd" containerName="neutron-sriov-openstack-openstack-cell1" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.769881 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="da19c2aa-8814-45f3-8f80-f9aebff030fd" containerName="neutron-sriov-openstack-openstack-cell1" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.770117 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="da19c2aa-8814-45f3-8f80-f9aebff030fd" containerName="neutron-sriov-openstack-openstack-cell1" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.771160 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.777712 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.778022 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.778326 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.778475 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.778689 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.787444 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh"] Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.923453 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.923542 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.923581 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.923821 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.923877 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:37 crc kubenswrapper[4971]: I0320 09:37:37.923920 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwv6k\" (UniqueName: \"kubernetes.io/projected/42809274-e4e2-41bf-bac0-163763e81df4-kube-api-access-cwv6k\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.025956 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.026019 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.026059 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwv6k\" (UniqueName: \"kubernetes.io/projected/42809274-e4e2-41bf-bac0-163763e81df4-kube-api-access-cwv6k\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.026090 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.026133 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.026154 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.030724 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.030871 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.031365 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.032176 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.033148 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.048445 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwv6k\" (UniqueName: \"kubernetes.io/projected/42809274-e4e2-41bf-bac0-163763e81df4-kube-api-access-cwv6k\") pod \"neutron-dhcp-openstack-openstack-cell1-jqdkh\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.096630 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.627132 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh"] Mar 20 09:37:38 crc kubenswrapper[4971]: W0320 09:37:38.628197 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42809274_e4e2_41bf_bac0_163763e81df4.slice/crio-2fd4a1204ae07278ff28c6ba6d6bb4c03026db6001c3faa126ed859f97ea36ad WatchSource:0}: Error finding container 2fd4a1204ae07278ff28c6ba6d6bb4c03026db6001c3faa126ed859f97ea36ad: Status 404 returned error can't find the container with id 2fd4a1204ae07278ff28c6ba6d6bb4c03026db6001c3faa126ed859f97ea36ad Mar 20 09:37:38 crc kubenswrapper[4971]: I0320 09:37:38.661321 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" event={"ID":"42809274-e4e2-41bf-bac0-163763e81df4","Type":"ContainerStarted","Data":"2fd4a1204ae07278ff28c6ba6d6bb4c03026db6001c3faa126ed859f97ea36ad"} Mar 20 09:37:39 crc kubenswrapper[4971]: I0320 09:37:39.294627 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:37:39 crc kubenswrapper[4971]: I0320 09:37:39.686770 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" event={"ID":"42809274-e4e2-41bf-bac0-163763e81df4","Type":"ContainerStarted","Data":"8e2385d8cd4258c421319a2d7bbae3e24348903e2615f99f397d8efeba012f05"} Mar 20 09:37:39 crc kubenswrapper[4971]: I0320 09:37:39.705373 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" podStartSLOduration=2.044415531 podStartE2EDuration="2.705341726s" podCreationTimestamp="2026-03-20 09:37:37 +0000 UTC" firstStartedPulling="2026-03-20 
09:37:38.63102663 +0000 UTC m=+10080.610900768" lastFinishedPulling="2026-03-20 09:37:39.291952815 +0000 UTC m=+10081.271826963" observedRunningTime="2026-03-20 09:37:39.704018201 +0000 UTC m=+10081.683892339" watchObservedRunningTime="2026-03-20 09:37:39.705341726 +0000 UTC m=+10081.685215854" Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.131118 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566658-wsndq"] Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.132941 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-wsndq" Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.134772 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.135356 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.135578 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.142658 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566658-wsndq"] Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.202774 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2dm\" (UniqueName: \"kubernetes.io/projected/8ddea0e1-b76c-4830-9489-4c0b2fa997f3-kube-api-access-sj2dm\") pod \"auto-csr-approver-29566658-wsndq\" (UID: \"8ddea0e1-b76c-4830-9489-4c0b2fa997f3\") " pod="openshift-infra/auto-csr-approver-29566658-wsndq" Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.304193 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj2dm\" (UniqueName: 
\"kubernetes.io/projected/8ddea0e1-b76c-4830-9489-4c0b2fa997f3-kube-api-access-sj2dm\") pod \"auto-csr-approver-29566658-wsndq\" (UID: \"8ddea0e1-b76c-4830-9489-4c0b2fa997f3\") " pod="openshift-infra/auto-csr-approver-29566658-wsndq" Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.329195 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2dm\" (UniqueName: \"kubernetes.io/projected/8ddea0e1-b76c-4830-9489-4c0b2fa997f3-kube-api-access-sj2dm\") pod \"auto-csr-approver-29566658-wsndq\" (UID: \"8ddea0e1-b76c-4830-9489-4c0b2fa997f3\") " pod="openshift-infra/auto-csr-approver-29566658-wsndq" Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.452096 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-wsndq" Mar 20 09:38:00 crc kubenswrapper[4971]: W0320 09:38:00.886256 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ddea0e1_b76c_4830_9489_4c0b2fa997f3.slice/crio-bb9aa0b1a935bb77bfd62c6eabd4cb72c7fd97ec5b81c04a1e4dc371c49d39e1 WatchSource:0}: Error finding container bb9aa0b1a935bb77bfd62c6eabd4cb72c7fd97ec5b81c04a1e4dc371c49d39e1: Status 404 returned error can't find the container with id bb9aa0b1a935bb77bfd62c6eabd4cb72c7fd97ec5b81c04a1e4dc371c49d39e1 Mar 20 09:38:00 crc kubenswrapper[4971]: I0320 09:38:00.886269 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566658-wsndq"] Mar 20 09:38:01 crc kubenswrapper[4971]: I0320 09:38:01.895582 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566658-wsndq" event={"ID":"8ddea0e1-b76c-4830-9489-4c0b2fa997f3","Type":"ContainerStarted","Data":"bb9aa0b1a935bb77bfd62c6eabd4cb72c7fd97ec5b81c04a1e4dc371c49d39e1"} Mar 20 09:38:02 crc kubenswrapper[4971]: I0320 09:38:02.907869 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="8ddea0e1-b76c-4830-9489-4c0b2fa997f3" containerID="cf9ce687aa3125da7ca444672c9b80e615ec911a2c440453b4e8f059c233b082" exitCode=0 Mar 20 09:38:02 crc kubenswrapper[4971]: I0320 09:38:02.908001 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566658-wsndq" event={"ID":"8ddea0e1-b76c-4830-9489-4c0b2fa997f3","Type":"ContainerDied","Data":"cf9ce687aa3125da7ca444672c9b80e615ec911a2c440453b4e8f059c233b082"} Mar 20 09:38:04 crc kubenswrapper[4971]: I0320 09:38:04.302407 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-wsndq" Mar 20 09:38:04 crc kubenswrapper[4971]: I0320 09:38:04.390308 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj2dm\" (UniqueName: \"kubernetes.io/projected/8ddea0e1-b76c-4830-9489-4c0b2fa997f3-kube-api-access-sj2dm\") pod \"8ddea0e1-b76c-4830-9489-4c0b2fa997f3\" (UID: \"8ddea0e1-b76c-4830-9489-4c0b2fa997f3\") " Mar 20 09:38:04 crc kubenswrapper[4971]: I0320 09:38:04.395193 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ddea0e1-b76c-4830-9489-4c0b2fa997f3-kube-api-access-sj2dm" (OuterVolumeSpecName: "kube-api-access-sj2dm") pod "8ddea0e1-b76c-4830-9489-4c0b2fa997f3" (UID: "8ddea0e1-b76c-4830-9489-4c0b2fa997f3"). InnerVolumeSpecName "kube-api-access-sj2dm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:38:04 crc kubenswrapper[4971]: I0320 09:38:04.493586 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj2dm\" (UniqueName: \"kubernetes.io/projected/8ddea0e1-b76c-4830-9489-4c0b2fa997f3-kube-api-access-sj2dm\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:04 crc kubenswrapper[4971]: I0320 09:38:04.934680 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566658-wsndq" event={"ID":"8ddea0e1-b76c-4830-9489-4c0b2fa997f3","Type":"ContainerDied","Data":"bb9aa0b1a935bb77bfd62c6eabd4cb72c7fd97ec5b81c04a1e4dc371c49d39e1"} Mar 20 09:38:04 crc kubenswrapper[4971]: I0320 09:38:04.935106 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb9aa0b1a935bb77bfd62c6eabd4cb72c7fd97ec5b81c04a1e4dc371c49d39e1" Mar 20 09:38:04 crc kubenswrapper[4971]: I0320 09:38:04.934753 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-wsndq" Mar 20 09:38:05 crc kubenswrapper[4971]: I0320 09:38:05.369158 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566652-77bgm"] Mar 20 09:38:05 crc kubenswrapper[4971]: I0320 09:38:05.378058 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566652-77bgm"] Mar 20 09:38:06 crc kubenswrapper[4971]: I0320 09:38:06.743902 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d900009-430c-4456-9b46-62602708c68e" path="/var/lib/kubelet/pods/8d900009-430c-4456-9b46-62602708c68e/volumes" Mar 20 09:38:15 crc kubenswrapper[4971]: I0320 09:38:15.093493 4971 scope.go:117] "RemoveContainer" containerID="90929e45bea32ad547f1be5f64c0a1288fd2c2df16a14f29522ab13f880bb3ce" Mar 20 09:38:41 crc kubenswrapper[4971]: I0320 09:38:41.270430 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="42809274-e4e2-41bf-bac0-163763e81df4" containerID="8e2385d8cd4258c421319a2d7bbae3e24348903e2615f99f397d8efeba012f05" exitCode=0 Mar 20 09:38:41 crc kubenswrapper[4971]: I0320 09:38:41.270522 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" event={"ID":"42809274-e4e2-41bf-bac0-163763e81df4","Type":"ContainerDied","Data":"8e2385d8cd4258c421319a2d7bbae3e24348903e2615f99f397d8efeba012f05"} Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.743175 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.863110 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-combined-ca-bundle\") pod \"42809274-e4e2-41bf-bac0-163763e81df4\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.863274 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-agent-neutron-config-0\") pod \"42809274-e4e2-41bf-bac0-163763e81df4\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.863303 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ceph\") pod \"42809274-e4e2-41bf-bac0-163763e81df4\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.863332 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwv6k\" (UniqueName: 
\"kubernetes.io/projected/42809274-e4e2-41bf-bac0-163763e81df4-kube-api-access-cwv6k\") pod \"42809274-e4e2-41bf-bac0-163763e81df4\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.863347 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ssh-key-openstack-cell1\") pod \"42809274-e4e2-41bf-bac0-163763e81df4\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.863501 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-inventory\") pod \"42809274-e4e2-41bf-bac0-163763e81df4\" (UID: \"42809274-e4e2-41bf-bac0-163763e81df4\") " Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.868812 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ceph" (OuterVolumeSpecName: "ceph") pod "42809274-e4e2-41bf-bac0-163763e81df4" (UID: "42809274-e4e2-41bf-bac0-163763e81df4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.868948 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42809274-e4e2-41bf-bac0-163763e81df4-kube-api-access-cwv6k" (OuterVolumeSpecName: "kube-api-access-cwv6k") pod "42809274-e4e2-41bf-bac0-163763e81df4" (UID: "42809274-e4e2-41bf-bac0-163763e81df4"). InnerVolumeSpecName "kube-api-access-cwv6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.873788 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "42809274-e4e2-41bf-bac0-163763e81df4" (UID: "42809274-e4e2-41bf-bac0-163763e81df4"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.892535 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-inventory" (OuterVolumeSpecName: "inventory") pod "42809274-e4e2-41bf-bac0-163763e81df4" (UID: "42809274-e4e2-41bf-bac0-163763e81df4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.892783 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "42809274-e4e2-41bf-bac0-163763e81df4" (UID: "42809274-e4e2-41bf-bac0-163763e81df4"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.908718 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "42809274-e4e2-41bf-bac0-163763e81df4" (UID: "42809274-e4e2-41bf-bac0-163763e81df4"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.967431 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.967466 4971 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.967478 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.967489 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwv6k\" (UniqueName: \"kubernetes.io/projected/42809274-e4e2-41bf-bac0-163763e81df4-kube-api-access-cwv6k\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.967499 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:42 crc kubenswrapper[4971]: I0320 09:38:42.967508 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42809274-e4e2-41bf-bac0-163763e81df4-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:43 crc kubenswrapper[4971]: I0320 09:38:43.288568 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" 
event={"ID":"42809274-e4e2-41bf-bac0-163763e81df4","Type":"ContainerDied","Data":"2fd4a1204ae07278ff28c6ba6d6bb4c03026db6001c3faa126ed859f97ea36ad"} Mar 20 09:38:43 crc kubenswrapper[4971]: I0320 09:38:43.288622 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fd4a1204ae07278ff28c6ba6d6bb4c03026db6001c3faa126ed859f97ea36ad" Mar 20 09:38:43 crc kubenswrapper[4971]: I0320 09:38:43.288661 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jqdkh" Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.129642 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.130334 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="77ccca0d-9032-4c04-9074-3711401b473c" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" gracePeriod=30 Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.570272 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.570520 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://cc0310d6f80ae78b6a488ab8385070f8bac2cd555c6aa72a5afb7fb64cbc727b" gracePeriod=30 Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.668364 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.668729 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" 
containerName="nova-api-log" containerID="cri-o://2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c" gracePeriod=30 Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.668786 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-api" containerID="cri-o://069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca" gracePeriod=30 Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.682047 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.682420 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="29b84ad8-d074-4c35-a0dd-a0b491ac3e45" containerName="nova-scheduler-scheduler" containerID="cri-o://7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239" gracePeriod=30 Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.693850 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.694073 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-log" containerID="cri-o://f03c934f8e3cab9668e02a35a1ff1b598209096c238dc5e2b9041ebfa5a60f51" gracePeriod=30 Mar 20 09:38:53 crc kubenswrapper[4971]: I0320 09:38:53.694126 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-metadata" containerID="cri-o://a754c0876c0bb879fece283bb1842c594d8bf635ee792749cff6fcae64503153" gracePeriod=30 Mar 20 09:38:53 crc kubenswrapper[4971]: E0320 09:38:53.905128 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:53 crc kubenswrapper[4971]: E0320 09:38:53.906599 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:53 crc kubenswrapper[4971]: E0320 09:38:53.907630 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:53 crc kubenswrapper[4971]: E0320 09:38:53.907666 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="77ccca0d-9032-4c04-9074-3711401b473c" containerName="nova-cell0-conductor-conductor" Mar 20 09:38:54 crc kubenswrapper[4971]: E0320 09:38:54.273329 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cc0310d6f80ae78b6a488ab8385070f8bac2cd555c6aa72a5afb7fb64cbc727b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:54 crc kubenswrapper[4971]: E0320 09:38:54.274639 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cc0310d6f80ae78b6a488ab8385070f8bac2cd555c6aa72a5afb7fb64cbc727b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:54 crc kubenswrapper[4971]: E0320 09:38:54.276763 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cc0310d6f80ae78b6a488ab8385070f8bac2cd555c6aa72a5afb7fb64cbc727b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:54 crc kubenswrapper[4971]: E0320 09:38:54.276861 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" containerName="nova-cell1-conductor-conductor" Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.396389 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4e0393d-4241-455b-a857-3f4e8536576b" containerID="2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c" exitCode=143 Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.396496 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4e0393d-4241-455b-a857-3f4e8536576b","Type":"ContainerDied","Data":"2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c"} Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.399233 4971 generic.go:334] "Generic (PLEG): container finished" podID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerID="f03c934f8e3cab9668e02a35a1ff1b598209096c238dc5e2b9041ebfa5a60f51" exitCode=143 Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.399269 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead","Type":"ContainerDied","Data":"f03c934f8e3cab9668e02a35a1ff1b598209096c238dc5e2b9041ebfa5a60f51"} Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.874355 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.933570 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-config-data\") pod \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.933690 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ztnj\" (UniqueName: \"kubernetes.io/projected/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-kube-api-access-8ztnj\") pod \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.933830 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-combined-ca-bundle\") pod \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\" (UID: \"29b84ad8-d074-4c35-a0dd-a0b491ac3e45\") " Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.939124 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-kube-api-access-8ztnj" (OuterVolumeSpecName: "kube-api-access-8ztnj") pod "29b84ad8-d074-4c35-a0dd-a0b491ac3e45" (UID: "29b84ad8-d074-4c35-a0dd-a0b491ac3e45"). InnerVolumeSpecName "kube-api-access-8ztnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.966996 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29b84ad8-d074-4c35-a0dd-a0b491ac3e45" (UID: "29b84ad8-d074-4c35-a0dd-a0b491ac3e45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:54 crc kubenswrapper[4971]: I0320 09:38:54.976358 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-config-data" (OuterVolumeSpecName: "config-data") pod "29b84ad8-d074-4c35-a0dd-a0b491ac3e45" (UID: "29b84ad8-d074-4c35-a0dd-a0b491ac3e45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.037827 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.037862 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ztnj\" (UniqueName: \"kubernetes.io/projected/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-kube-api-access-8ztnj\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.037876 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b84ad8-d074-4c35-a0dd-a0b491ac3e45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.413119 4971 generic.go:334] "Generic (PLEG): container finished" podID="8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" containerID="cc0310d6f80ae78b6a488ab8385070f8bac2cd555c6aa72a5afb7fb64cbc727b" 
exitCode=0 Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.413187 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d","Type":"ContainerDied","Data":"cc0310d6f80ae78b6a488ab8385070f8bac2cd555c6aa72a5afb7fb64cbc727b"} Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.415657 4971 generic.go:334] "Generic (PLEG): container finished" podID="29b84ad8-d074-4c35-a0dd-a0b491ac3e45" containerID="7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239" exitCode=0 Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.415705 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29b84ad8-d074-4c35-a0dd-a0b491ac3e45","Type":"ContainerDied","Data":"7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239"} Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.415735 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29b84ad8-d074-4c35-a0dd-a0b491ac3e45","Type":"ContainerDied","Data":"38c6b76be7923f31036e49e7fdd4ecf2bd38ab11f323745804bb47da0cd8800c"} Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.415758 4971 scope.go:117] "RemoveContainer" containerID="7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.415907 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.449907 4971 scope.go:117] "RemoveContainer" containerID="7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239" Mar 20 09:38:55 crc kubenswrapper[4971]: E0320 09:38:55.450814 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239\": container with ID starting with 7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239 not found: ID does not exist" containerID="7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.450857 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239"} err="failed to get container status \"7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239\": rpc error: code = NotFound desc = could not find container \"7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239\": container with ID starting with 7e8913c8e78092443fc5bdec82211995ed9e16db7ae63352cf5840096b976239 not found: ID does not exist" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.454545 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.489899 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.509119 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:38:55 crc kubenswrapper[4971]: E0320 09:38:55.509634 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b84ad8-d074-4c35-a0dd-a0b491ac3e45" containerName="nova-scheduler-scheduler" Mar 20 09:38:55 crc 
kubenswrapper[4971]: I0320 09:38:55.509652 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b84ad8-d074-4c35-a0dd-a0b491ac3e45" containerName="nova-scheduler-scheduler" Mar 20 09:38:55 crc kubenswrapper[4971]: E0320 09:38:55.509673 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddea0e1-b76c-4830-9489-4c0b2fa997f3" containerName="oc" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.509708 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddea0e1-b76c-4830-9489-4c0b2fa997f3" containerName="oc" Mar 20 09:38:55 crc kubenswrapper[4971]: E0320 09:38:55.509743 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42809274-e4e2-41bf-bac0-163763e81df4" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.509750 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="42809274-e4e2-41bf-bac0-163763e81df4" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.509957 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ddea0e1-b76c-4830-9489-4c0b2fa997f3" containerName="oc" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.509981 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="42809274-e4e2-41bf-bac0-163763e81df4" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.509993 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b84ad8-d074-4c35-a0dd-a0b491ac3e45" containerName="nova-scheduler-scheduler" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.510834 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.515026 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.527178 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.583918 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.655222 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67vvl\" (UniqueName: \"kubernetes.io/projected/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-kube-api-access-67vvl\") pod \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.655438 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-combined-ca-bundle\") pod \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.655572 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-config-data\") pod \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\" (UID: \"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d\") " Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.655884 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51e9be1-1904-4886-967b-87f6e898d67d-config-data\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " 
pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.655980 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k85s\" (UniqueName: \"kubernetes.io/projected/f51e9be1-1904-4886-967b-87f6e898d67d-kube-api-access-6k85s\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.656134 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51e9be1-1904-4886-967b-87f6e898d67d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.658343 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-kube-api-access-67vvl" (OuterVolumeSpecName: "kube-api-access-67vvl") pod "8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" (UID: "8d18a1e5-a77f-4c5f-a2f3-be899e4c702d"). InnerVolumeSpecName "kube-api-access-67vvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.678312 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-config-data" (OuterVolumeSpecName: "config-data") pod "8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" (UID: "8d18a1e5-a77f-4c5f-a2f3-be899e4c702d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.685942 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" (UID: "8d18a1e5-a77f-4c5f-a2f3-be899e4c702d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.758156 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k85s\" (UniqueName: \"kubernetes.io/projected/f51e9be1-1904-4886-967b-87f6e898d67d-kube-api-access-6k85s\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.758291 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51e9be1-1904-4886-967b-87f6e898d67d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.758350 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51e9be1-1904-4886-967b-87f6e898d67d-config-data\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.758464 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67vvl\" (UniqueName: \"kubernetes.io/projected/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-kube-api-access-67vvl\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.758481 4971 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.758494 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.762521 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51e9be1-1904-4886-967b-87f6e898d67d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.762762 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51e9be1-1904-4886-967b-87f6e898d67d-config-data\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.773916 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k85s\" (UniqueName: \"kubernetes.io/projected/f51e9be1-1904-4886-967b-87f6e898d67d-kube-api-access-6k85s\") pod \"nova-scheduler-0\" (UID: \"f51e9be1-1904-4886-967b-87f6e898d67d\") " pod="openstack/nova-scheduler-0" Mar 20 09:38:55 crc kubenswrapper[4971]: I0320 09:38:55.832733 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.324100 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:38:56 crc kubenswrapper[4971]: W0320 09:38:56.330389 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf51e9be1_1904_4886_967b_87f6e898d67d.slice/crio-5fe44937916905927d6b4b2a5254d323ed76225e3aae3f41b903db5f4e246319 WatchSource:0}: Error finding container 5fe44937916905927d6b4b2a5254d323ed76225e3aae3f41b903db5f4e246319: Status 404 returned error can't find the container with id 5fe44937916905927d6b4b2a5254d323ed76225e3aae3f41b903db5f4e246319 Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.427560 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f51e9be1-1904-4886-967b-87f6e898d67d","Type":"ContainerStarted","Data":"5fe44937916905927d6b4b2a5254d323ed76225e3aae3f41b903db5f4e246319"} Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.430772 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d18a1e5-a77f-4c5f-a2f3-be899e4c702d","Type":"ContainerDied","Data":"fcaa96fec80dd83aa17cf6fc13ebed7bdbca6d37cd81b45cacadbd631166ca68"} Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.430819 4971 scope.go:117] "RemoveContainer" containerID="cc0310d6f80ae78b6a488ab8385070f8bac2cd555c6aa72a5afb7fb64cbc727b" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.430971 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.475375 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.485658 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.494129 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:38:56 crc kubenswrapper[4971]: E0320 09:38:56.494546 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" containerName="nova-cell1-conductor-conductor" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.494562 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" containerName="nova-cell1-conductor-conductor" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.494799 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" containerName="nova-cell1-conductor-conductor" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.495491 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.498561 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.512301 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.585564 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e45f2f-b018-4874-b9cc-a933274eee0c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.585909 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e45f2f-b018-4874-b9cc-a933274eee0c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.585980 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2zpm\" (UniqueName: \"kubernetes.io/projected/b0e45f2f-b018-4874-b9cc-a933274eee0c-kube-api-access-w2zpm\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.688031 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e45f2f-b018-4874-b9cc-a933274eee0c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc 
kubenswrapper[4971]: I0320 09:38:56.688122 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e45f2f-b018-4874-b9cc-a933274eee0c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.688233 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zpm\" (UniqueName: \"kubernetes.io/projected/b0e45f2f-b018-4874-b9cc-a933274eee0c-kube-api-access-w2zpm\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.692250 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e45f2f-b018-4874-b9cc-a933274eee0c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.692492 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e45f2f-b018-4874-b9cc-a933274eee0c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.713222 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2zpm\" (UniqueName: \"kubernetes.io/projected/b0e45f2f-b018-4874-b9cc-a933274eee0c-kube-api-access-w2zpm\") pod \"nova-cell1-conductor-0\" (UID: \"b0e45f2f-b018-4874-b9cc-a933274eee0c\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.743501 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="29b84ad8-d074-4c35-a0dd-a0b491ac3e45" path="/var/lib/kubelet/pods/29b84ad8-d074-4c35-a0dd-a0b491ac3e45/volumes" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.744040 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d18a1e5-a77f-4c5f-a2f3-be899e4c702d" path="/var/lib/kubelet/pods/8d18a1e5-a77f-4c5f-a2f3-be899e4c702d/volumes" Mar 20 09:38:56 crc kubenswrapper[4971]: I0320 09:38:56.829454 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:57 crc kubenswrapper[4971]: W0320 09:38:57.323143 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0e45f2f_b018_4874_b9cc_a933274eee0c.slice/crio-34134181bb741ba4d7b96a8c38b4c7daf6789ae519feb4e3ca298d4b4fcc770e WatchSource:0}: Error finding container 34134181bb741ba4d7b96a8c38b4c7daf6789ae519feb4e3ca298d4b4fcc770e: Status 404 returned error can't find the container with id 34134181bb741ba4d7b96a8c38b4c7daf6789ae519feb4e3ca298d4b4fcc770e Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.324351 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.452948 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.456841 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4e0393d-4241-455b-a857-3f4e8536576b" containerID="069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca" exitCode=0 Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.456880 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4e0393d-4241-455b-a857-3f4e8536576b","Type":"ContainerDied","Data":"069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca"} Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.456922 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4e0393d-4241-455b-a857-3f4e8536576b","Type":"ContainerDied","Data":"342a103ce3d6aba9d44d3af04671f35bed74ea7e36d9c75e07bc8831e3f50a45"} Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.456942 4971 scope.go:117] "RemoveContainer" containerID="069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.460552 4971 generic.go:334] "Generic (PLEG): container finished" podID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerID="a754c0876c0bb879fece283bb1842c594d8bf635ee792749cff6fcae64503153" exitCode=0 Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.460703 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead","Type":"ContainerDied","Data":"a754c0876c0bb879fece283bb1842c594d8bf635ee792749cff6fcae64503153"} Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.460744 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead","Type":"ContainerDied","Data":"f9add17e4401ee21eb0cdc536e7ebf45c08ccfca2850ddac53740419a8a589fa"} Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.460761 4971 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9add17e4401ee21eb0cdc536e7ebf45c08ccfca2850ddac53740419a8a589fa" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.462297 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.463397 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b0e45f2f-b018-4874-b9cc-a933274eee0c","Type":"ContainerStarted","Data":"34134181bb741ba4d7b96a8c38b4c7daf6789ae519feb4e3ca298d4b4fcc770e"} Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.468458 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f51e9be1-1904-4886-967b-87f6e898d67d","Type":"ContainerStarted","Data":"e669051613d04e6657344ba4bc2cc0138c5819186f36a144db4683eabdb687b5"} Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.508081 4971 scope.go:117] "RemoveContainer" containerID="2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.538077 4971 scope.go:117] "RemoveContainer" containerID="069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca" Mar 20 09:38:57 crc kubenswrapper[4971]: E0320 09:38:57.540461 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca\": container with ID starting with 069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca not found: ID does not exist" containerID="069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.540506 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca"} 
err="failed to get container status \"069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca\": rpc error: code = NotFound desc = could not find container \"069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca\": container with ID starting with 069c4523fd249ea051101504fc8505a865bd50f12636610910eacda6ead5b5ca not found: ID does not exist" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.540535 4971 scope.go:117] "RemoveContainer" containerID="2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c" Mar 20 09:38:57 crc kubenswrapper[4971]: E0320 09:38:57.542528 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c\": container with ID starting with 2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c not found: ID does not exist" containerID="2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.542559 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c"} err="failed to get container status \"2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c\": rpc error: code = NotFound desc = could not find container \"2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c\": container with ID starting with 2427238a074f49768d7eca5f2b457978f1303ed5f92dd92cae8208ee17726a0c not found: ID does not exist" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.549181 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.549160764 podStartE2EDuration="2.549160764s" podCreationTimestamp="2026-03-20 09:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 09:38:57.53283505 +0000 UTC m=+10159.512709188" watchObservedRunningTime="2026-03-20 09:38:57.549160764 +0000 UTC m=+10159.529034902" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.610751 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-config-data\") pod \"f4e0393d-4241-455b-a857-3f4e8536576b\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.610878 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-combined-ca-bundle\") pod \"f4e0393d-4241-455b-a857-3f4e8536576b\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.610924 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-config-data\") pod \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.611013 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-combined-ca-bundle\") pod \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.611053 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbxwz\" (UniqueName: \"kubernetes.io/projected/f4e0393d-4241-455b-a857-3f4e8536576b-kube-api-access-lbxwz\") pod \"f4e0393d-4241-455b-a857-3f4e8536576b\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 
09:38:57.611070 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b8q7\" (UniqueName: \"kubernetes.io/projected/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-kube-api-access-8b8q7\") pod \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.611123 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-logs\") pod \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\" (UID: \"ecdcaee4-8a8d-4f0f-94de-fe2082e55ead\") " Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.611163 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e0393d-4241-455b-a857-3f4e8536576b-logs\") pod \"f4e0393d-4241-455b-a857-3f4e8536576b\" (UID: \"f4e0393d-4241-455b-a857-3f4e8536576b\") " Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.613303 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e0393d-4241-455b-a857-3f4e8536576b-logs" (OuterVolumeSpecName: "logs") pod "f4e0393d-4241-455b-a857-3f4e8536576b" (UID: "f4e0393d-4241-455b-a857-3f4e8536576b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.614403 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-logs" (OuterVolumeSpecName: "logs") pod "ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" (UID: "ecdcaee4-8a8d-4f0f-94de-fe2082e55ead"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.619419 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e0393d-4241-455b-a857-3f4e8536576b-kube-api-access-lbxwz" (OuterVolumeSpecName: "kube-api-access-lbxwz") pod "f4e0393d-4241-455b-a857-3f4e8536576b" (UID: "f4e0393d-4241-455b-a857-3f4e8536576b"). InnerVolumeSpecName "kube-api-access-lbxwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.640523 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-kube-api-access-8b8q7" (OuterVolumeSpecName: "kube-api-access-8b8q7") pod "ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" (UID: "ecdcaee4-8a8d-4f0f-94de-fe2082e55ead"). InnerVolumeSpecName "kube-api-access-8b8q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.644959 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-config-data" (OuterVolumeSpecName: "config-data") pod "ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" (UID: "ecdcaee4-8a8d-4f0f-94de-fe2082e55ead"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.648471 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" (UID: "ecdcaee4-8a8d-4f0f-94de-fe2082e55ead"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.658701 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4e0393d-4241-455b-a857-3f4e8536576b" (UID: "f4e0393d-4241-455b-a857-3f4e8536576b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.662807 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-config-data" (OuterVolumeSpecName: "config-data") pod "f4e0393d-4241-455b-a857-3f4e8536576b" (UID: "f4e0393d-4241-455b-a857-3f4e8536576b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.715220 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.715265 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b8q7\" (UniqueName: \"kubernetes.io/projected/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-kube-api-access-8b8q7\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.715280 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbxwz\" (UniqueName: \"kubernetes.io/projected/f4e0393d-4241-455b-a857-3f4e8536576b-kube-api-access-lbxwz\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.715294 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-logs\") on node \"crc\" 
DevicePath \"\"" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.715306 4971 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e0393d-4241-455b-a857-3f4e8536576b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.715318 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.715327 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e0393d-4241-455b-a857-3f4e8536576b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:57 crc kubenswrapper[4971]: I0320 09:38:57.715336 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.499160 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b0e45f2f-b018-4874-b9cc-a933274eee0c","Type":"ContainerStarted","Data":"3d01f9e689e2a6ee73d7de0ce47e3ddefa50c7a7ac61c6b549e895fb0b4a61f8"} Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.499667 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.507081 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.507172 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.537756 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.537720262 podStartE2EDuration="2.537720262s" podCreationTimestamp="2026-03-20 09:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:38:58.534068195 +0000 UTC m=+10160.513942333" watchObservedRunningTime="2026-03-20 09:38:58.537720262 +0000 UTC m=+10160.517594440" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.582598 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.595944 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.611576 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 09:38:58 crc kubenswrapper[4971]: E0320 09:38:58.612211 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-log" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.612277 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-log" Mar 20 09:38:58 crc kubenswrapper[4971]: E0320 09:38:58.612344 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-metadata" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.612400 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-metadata" Mar 20 09:38:58 crc kubenswrapper[4971]: E0320 09:38:58.612474 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-api" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.612522 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-api" Mar 20 09:38:58 crc kubenswrapper[4971]: E0320 09:38:58.612579 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-log" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.612645 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-log" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.612885 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-metadata" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.612956 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-log" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.613021 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" containerName="nova-metadata-log" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.613087 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" containerName="nova-api-api" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.614247 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.623890 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.625824 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.655804 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.675038 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.717985 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.719816 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.726363 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.754714 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdcaee4-8a8d-4f0f-94de-fe2082e55ead" path="/var/lib/kubelet/pods/ecdcaee4-8a8d-4f0f-94de-fe2082e55ead/volumes" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.760764 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e0393d-4241-455b-a857-3f4e8536576b" path="/var/lib/kubelet/pods/f4e0393d-4241-455b-a857-3f4e8536576b/volumes" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.761767 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.763092 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnh7c\" 
(UniqueName: \"kubernetes.io/projected/a288064d-50aa-42c6-8944-3a6c3b9d6c77-kube-api-access-mnh7c\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.763183 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a288064d-50aa-42c6-8944-3a6c3b9d6c77-config-data\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.763316 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a288064d-50aa-42c6-8944-3a6c3b9d6c77-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.763357 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a288064d-50aa-42c6-8944-3a6c3b9d6c77-logs\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.866934 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnh7c\" (UniqueName: \"kubernetes.io/projected/a288064d-50aa-42c6-8944-3a6c3b9d6c77-kube-api-access-mnh7c\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.867055 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a288064d-50aa-42c6-8944-3a6c3b9d6c77-config-data\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" 
Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.867096 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073652bb-5847-4af9-9bca-9d350eacf6fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.867176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a288064d-50aa-42c6-8944-3a6c3b9d6c77-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.867216 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a288064d-50aa-42c6-8944-3a6c3b9d6c77-logs\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.867273 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073652bb-5847-4af9-9bca-9d350eacf6fc-config-data\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.867318 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073652bb-5847-4af9-9bca-9d350eacf6fc-logs\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.867346 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-b44rn\" (UniqueName: \"kubernetes.io/projected/073652bb-5847-4af9-9bca-9d350eacf6fc-kube-api-access-b44rn\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.868550 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a288064d-50aa-42c6-8944-3a6c3b9d6c77-logs\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.878327 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a288064d-50aa-42c6-8944-3a6c3b9d6c77-config-data\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.879277 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a288064d-50aa-42c6-8944-3a6c3b9d6c77-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.882893 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnh7c\" (UniqueName: \"kubernetes.io/projected/a288064d-50aa-42c6-8944-3a6c3b9d6c77-kube-api-access-mnh7c\") pod \"nova-api-0\" (UID: \"a288064d-50aa-42c6-8944-3a6c3b9d6c77\") " pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: E0320 09:38:58.909301 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652 is running failed: container process not found" 
containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:58 crc kubenswrapper[4971]: E0320 09:38:58.909994 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652 is running failed: container process not found" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:58 crc kubenswrapper[4971]: E0320 09:38:58.910327 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652 is running failed: container process not found" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 09:38:58 crc kubenswrapper[4971]: E0320 09:38:58.910376 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="77ccca0d-9032-4c04-9074-3711401b473c" containerName="nova-cell0-conductor-conductor" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.939715 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.968981 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073652bb-5847-4af9-9bca-9d350eacf6fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.969091 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073652bb-5847-4af9-9bca-9d350eacf6fc-config-data\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.969121 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073652bb-5847-4af9-9bca-9d350eacf6fc-logs\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.969151 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b44rn\" (UniqueName: \"kubernetes.io/projected/073652bb-5847-4af9-9bca-9d350eacf6fc-kube-api-access-b44rn\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.969811 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073652bb-5847-4af9-9bca-9d350eacf6fc-logs\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.973768 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/073652bb-5847-4af9-9bca-9d350eacf6fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.974004 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073652bb-5847-4af9-9bca-9d350eacf6fc-config-data\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:58 crc kubenswrapper[4971]: I0320 09:38:58.988570 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b44rn\" (UniqueName: \"kubernetes.io/projected/073652bb-5847-4af9-9bca-9d350eacf6fc-kube-api-access-b44rn\") pod \"nova-metadata-0\" (UID: \"073652bb-5847-4af9-9bca-9d350eacf6fc\") " pod="openstack/nova-metadata-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.048383 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.077532 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.186994 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-combined-ca-bundle\") pod \"77ccca0d-9032-4c04-9074-3711401b473c\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.187070 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-config-data\") pod \"77ccca0d-9032-4c04-9074-3711401b473c\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.187207 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltrf8\" (UniqueName: \"kubernetes.io/projected/77ccca0d-9032-4c04-9074-3711401b473c-kube-api-access-ltrf8\") pod \"77ccca0d-9032-4c04-9074-3711401b473c\" (UID: \"77ccca0d-9032-4c04-9074-3711401b473c\") " Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.194978 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ccca0d-9032-4c04-9074-3711401b473c-kube-api-access-ltrf8" (OuterVolumeSpecName: "kube-api-access-ltrf8") pod "77ccca0d-9032-4c04-9074-3711401b473c" (UID: "77ccca0d-9032-4c04-9074-3711401b473c"). InnerVolumeSpecName "kube-api-access-ltrf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.222984 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-config-data" (OuterVolumeSpecName: "config-data") pod "77ccca0d-9032-4c04-9074-3711401b473c" (UID: "77ccca0d-9032-4c04-9074-3711401b473c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.233170 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77ccca0d-9032-4c04-9074-3711401b473c" (UID: "77ccca0d-9032-4c04-9074-3711401b473c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.289916 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.289943 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ccca0d-9032-4c04-9074-3711401b473c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.289955 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltrf8\" (UniqueName: \"kubernetes.io/projected/77ccca0d-9032-4c04-9074-3711401b473c-kube-api-access-ltrf8\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.486843 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.516956 4971 generic.go:334] "Generic (PLEG): container finished" podID="77ccca0d-9032-4c04-9074-3711401b473c" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" exitCode=0 Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.516994 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"77ccca0d-9032-4c04-9074-3711401b473c","Type":"ContainerDied","Data":"c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652"} Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.517019 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.517043 4971 scope.go:117] "RemoveContainer" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.517031 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"77ccca0d-9032-4c04-9074-3711401b473c","Type":"ContainerDied","Data":"783512d5f25ffdd9c564022e0035cd1fab58f972cc8982fae7745ad3a00d65e9"} Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.519838 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a288064d-50aa-42c6-8944-3a6c3b9d6c77","Type":"ContainerStarted","Data":"d6666e4236952426eb8257b103d09892bcdd535589d4800a6671912de5bc7b02"} Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.552143 4971 scope.go:117] "RemoveContainer" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" Mar 20 09:38:59 crc kubenswrapper[4971]: E0320 09:38:59.553828 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652\": container with ID starting with c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652 not found: ID does not exist" containerID="c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.553869 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652"} 
err="failed to get container status \"c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652\": rpc error: code = NotFound desc = could not find container \"c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652\": container with ID starting with c1701b1eb7140b5fe38ba24029a2325e7c0489059e8e39e989e69d7bbff07652 not found: ID does not exist" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.566644 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.577664 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.585591 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:38:59 crc kubenswrapper[4971]: E0320 09:38:59.586186 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ccca0d-9032-4c04-9074-3711401b473c" containerName="nova-cell0-conductor-conductor" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.586211 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ccca0d-9032-4c04-9074-3711401b473c" containerName="nova-cell0-conductor-conductor" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.586465 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ccca0d-9032-4c04-9074-3711401b473c" containerName="nova-cell0-conductor-conductor" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.587374 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.591504 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.594065 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.624286 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.698369 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7b8\" (UniqueName: \"kubernetes.io/projected/be09f583-3216-4725-abc2-bb13ba7585cd-kube-api-access-bg7b8\") pod \"nova-cell0-conductor-0\" (UID: \"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.698829 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be09f583-3216-4725-abc2-bb13ba7585cd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.698919 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be09f583-3216-4725-abc2-bb13ba7585cd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.801060 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7b8\" (UniqueName: 
\"kubernetes.io/projected/be09f583-3216-4725-abc2-bb13ba7585cd-kube-api-access-bg7b8\") pod \"nova-cell0-conductor-0\" (UID: \"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.801176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be09f583-3216-4725-abc2-bb13ba7585cd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.801239 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be09f583-3216-4725-abc2-bb13ba7585cd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.807364 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be09f583-3216-4725-abc2-bb13ba7585cd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.809687 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be09f583-3216-4725-abc2-bb13ba7585cd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.821714 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7b8\" (UniqueName: \"kubernetes.io/projected/be09f583-3216-4725-abc2-bb13ba7585cd-kube-api-access-bg7b8\") pod \"nova-cell0-conductor-0\" (UID: 
\"be09f583-3216-4725-abc2-bb13ba7585cd\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:38:59 crc kubenswrapper[4971]: I0320 09:38:59.910041 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.180966 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.529066 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"be09f583-3216-4725-abc2-bb13ba7585cd","Type":"ContainerStarted","Data":"39d22b9f339c3775ffcefa717ab898acb88f4a7be32e96f259b7601bf7cdf577"} Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.529106 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"be09f583-3216-4725-abc2-bb13ba7585cd","Type":"ContainerStarted","Data":"d8de6c155896e7a5d2325aec70c37e0434b6e658a1ca5bae558cd465ac4e81b2"} Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.529167 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.532406 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a288064d-50aa-42c6-8944-3a6c3b9d6c77","Type":"ContainerStarted","Data":"8e6e5119d60bd3965893db3d3ecfdbca4205f94df97375b0304cec46c0cf619a"} Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.532453 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a288064d-50aa-42c6-8944-3a6c3b9d6c77","Type":"ContainerStarted","Data":"2d78cc571ec35731f0a4abf1ca36ef15d89d86619cf9fee06d76c7e5d3fd667d"} Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.536268 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"073652bb-5847-4af9-9bca-9d350eacf6fc","Type":"ContainerStarted","Data":"74016f9e7446cc13a0dd36c8d806a9e0d7dd7f9472733bbde9acc29fbbc088fc"} Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.536295 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"073652bb-5847-4af9-9bca-9d350eacf6fc","Type":"ContainerStarted","Data":"1dfa383926d4e851c081b912e6fa98f6525cfce7368f4ae98f4ec940c5f90cf0"} Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.536307 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"073652bb-5847-4af9-9bca-9d350eacf6fc","Type":"ContainerStarted","Data":"c8d4c08edd5951fabc8baab7427596fe38dc6dc24a0f1ee1deefb65db606c944"} Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.556770 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.556744781 podStartE2EDuration="1.556744781s" podCreationTimestamp="2026-03-20 09:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:39:00.544213928 +0000 UTC m=+10162.524088066" watchObservedRunningTime="2026-03-20 09:39:00.556744781 +0000 UTC m=+10162.536618919" Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.569519 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.569475729 podStartE2EDuration="2.569475729s" podCreationTimestamp="2026-03-20 09:38:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:39:00.565649957 +0000 UTC m=+10162.545524095" watchObservedRunningTime="2026-03-20 09:39:00.569475729 +0000 UTC m=+10162.549349897" Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.612783 4971 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.612764569 podStartE2EDuration="2.612764569s" podCreationTimestamp="2026-03-20 09:38:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:39:00.592042358 +0000 UTC m=+10162.571916516" watchObservedRunningTime="2026-03-20 09:39:00.612764569 +0000 UTC m=+10162.592638707" Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.742691 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ccca0d-9032-4c04-9074-3711401b473c" path="/var/lib/kubelet/pods/77ccca0d-9032-4c04-9074-3711401b473c/volumes" Mar 20 09:39:00 crc kubenswrapper[4971]: I0320 09:39:00.832844 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 09:39:05 crc kubenswrapper[4971]: I0320 09:39:05.833217 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 09:39:05 crc kubenswrapper[4971]: I0320 09:39:05.874126 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 09:39:06 crc kubenswrapper[4971]: I0320 09:39:06.624210 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 09:39:06 crc kubenswrapper[4971]: I0320 09:39:06.867914 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 09:39:08 crc kubenswrapper[4971]: I0320 09:39:08.940224 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 09:39:08 crc kubenswrapper[4971]: I0320 09:39:08.940573 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 09:39:09 crc kubenswrapper[4971]: I0320 09:39:09.049852 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 09:39:09 crc kubenswrapper[4971]: I0320 09:39:09.049911 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 09:39:09 crc kubenswrapper[4971]: I0320 09:39:09.936820 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 09:39:10 crc kubenswrapper[4971]: I0320 09:39:10.022763 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a288064d-50aa-42c6-8944-3a6c3b9d6c77" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.99:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:39:10 crc kubenswrapper[4971]: I0320 09:39:10.022776 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a288064d-50aa-42c6-8944-3a6c3b9d6c77" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.99:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:39:10 crc kubenswrapper[4971]: I0320 09:39:10.132746 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="073652bb-5847-4af9-9bca-9d350eacf6fc" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.0.100:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:39:10 crc kubenswrapper[4971]: I0320 09:39:10.132746 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="073652bb-5847-4af9-9bca-9d350eacf6fc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.100:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:39:15 crc kubenswrapper[4971]: I0320 09:39:15.176860 4971 scope.go:117] "RemoveContainer" 
containerID="a754c0876c0bb879fece283bb1842c594d8bf635ee792749cff6fcae64503153" Mar 20 09:39:15 crc kubenswrapper[4971]: I0320 09:39:15.206931 4971 scope.go:117] "RemoveContainer" containerID="f03c934f8e3cab9668e02a35a1ff1b598209096c238dc5e2b9041ebfa5a60f51" Mar 20 09:39:16 crc kubenswrapper[4971]: I0320 09:39:16.939818 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 09:39:16 crc kubenswrapper[4971]: I0320 09:39:16.940209 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 09:39:17 crc kubenswrapper[4971]: I0320 09:39:17.049361 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 09:39:17 crc kubenswrapper[4971]: I0320 09:39:17.049407 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 09:39:18 crc kubenswrapper[4971]: I0320 09:39:18.944085 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 09:39:18 crc kubenswrapper[4971]: I0320 09:39:18.944527 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 09:39:18 crc kubenswrapper[4971]: I0320 09:39:18.947052 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 09:39:18 crc kubenswrapper[4971]: I0320 09:39:18.947341 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 09:39:19 crc kubenswrapper[4971]: I0320 09:39:19.055356 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 09:39:19 crc kubenswrapper[4971]: I0320 09:39:19.056913 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 09:39:19 crc kubenswrapper[4971]: I0320 09:39:19.066859 4971 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 09:39:19 crc kubenswrapper[4971]: I0320 09:39:19.730712 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.655539 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m5hfj"] Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.657939 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.669903 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5hfj"] Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.735150 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-catalog-content\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.735305 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2jh6\" (UniqueName: \"kubernetes.io/projected/2c66777a-99c1-456c-80c8-42affde92cef-kube-api-access-f2jh6\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.735378 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-utilities\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" 
Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.836944 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2jh6\" (UniqueName: \"kubernetes.io/projected/2c66777a-99c1-456c-80c8-42affde92cef-kube-api-access-f2jh6\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.837043 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-utilities\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.837162 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-catalog-content\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.837682 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-utilities\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.837768 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-catalog-content\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.859805 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2jh6\" (UniqueName: \"kubernetes.io/projected/2c66777a-99c1-456c-80c8-42affde92cef-kube-api-access-f2jh6\") pod \"redhat-operators-m5hfj\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.880029 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f"] Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.881697 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.888551 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.888748 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.889933 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rwstj" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.890201 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.890353 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.890639 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.891376 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 
09:39:20.894566 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f"] Mar 20 09:39:20 crc kubenswrapper[4971]: I0320 09:39:20.978668 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.040770 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.040835 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.040872 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddzf\" (UniqueName: \"kubernetes.io/projected/a8a4d9ea-5399-4aa0-9c74-5c6013261268-kube-api-access-xddzf\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.040898 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.040921 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.040942 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.040963 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.040982 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" 
(UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.041009 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.041031 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.041098 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.041165 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.041186 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143443 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143784 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ceph\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143825 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143860 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddzf\" (UniqueName: \"kubernetes.io/projected/a8a4d9ea-5399-4aa0-9c74-5c6013261268-kube-api-access-xddzf\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143892 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143918 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: 
\"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143939 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143959 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.143977 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.144003 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.144027 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.144085 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.147432 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.147620 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 
09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.149209 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.155059 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.156318 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.156316 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.156505 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.156734 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.167567 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.176862 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.177217 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.183559 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.184030 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddzf\" (UniqueName: \"kubernetes.io/projected/a8a4d9ea-5399-4aa0-9c74-5c6013261268-kube-api-access-xddzf\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.236618 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.500200 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5hfj"] Mar 20 09:39:21 crc kubenswrapper[4971]: W0320 09:39:21.504925 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c66777a_99c1_456c_80c8_42affde92cef.slice/crio-ac9040417c6a4749e52a5e8e4e2d696e8b6a71d7636a03151098eb685cacaa50 WatchSource:0}: Error finding container ac9040417c6a4749e52a5e8e4e2d696e8b6a71d7636a03151098eb685cacaa50: Status 404 returned error can't find the container with id ac9040417c6a4749e52a5e8e4e2d696e8b6a71d7636a03151098eb685cacaa50 Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.750758 4971 generic.go:334] "Generic (PLEG): container finished" podID="2c66777a-99c1-456c-80c8-42affde92cef" containerID="82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc" exitCode=0 Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.750811 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5hfj" event={"ID":"2c66777a-99c1-456c-80c8-42affde92cef","Type":"ContainerDied","Data":"82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc"} Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.751093 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5hfj" event={"ID":"2c66777a-99c1-456c-80c8-42affde92cef","Type":"ContainerStarted","Data":"ac9040417c6a4749e52a5e8e4e2d696e8b6a71d7636a03151098eb685cacaa50"} Mar 20 09:39:21 crc kubenswrapper[4971]: W0320 09:39:21.805738 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8a4d9ea_5399_4aa0_9c74_5c6013261268.slice/crio-7cc2eee953545964187cbcb274b4e6fa6c6e856369a5a1d9ada54d4f6310ab8c WatchSource:0}: Error finding container 7cc2eee953545964187cbcb274b4e6fa6c6e856369a5a1d9ada54d4f6310ab8c: Status 404 returned error can't find the container with id 7cc2eee953545964187cbcb274b4e6fa6c6e856369a5a1d9ada54d4f6310ab8c Mar 20 09:39:21 crc kubenswrapper[4971]: I0320 09:39:21.809491 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f"] Mar 20 09:39:22 crc kubenswrapper[4971]: I0320 09:39:22.766966 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5hfj" event={"ID":"2c66777a-99c1-456c-80c8-42affde92cef","Type":"ContainerStarted","Data":"e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060"} Mar 20 09:39:22 crc kubenswrapper[4971]: I0320 09:39:22.769393 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" event={"ID":"a8a4d9ea-5399-4aa0-9c74-5c6013261268","Type":"ContainerStarted","Data":"7ca4a5f5f7f2ee4c7842680235af9222e174210a1d3732ea7fc46f512b34bd6c"} Mar 20 09:39:22 crc kubenswrapper[4971]: I0320 09:39:22.769440 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" event={"ID":"a8a4d9ea-5399-4aa0-9c74-5c6013261268","Type":"ContainerStarted","Data":"7cc2eee953545964187cbcb274b4e6fa6c6e856369a5a1d9ada54d4f6310ab8c"} Mar 20 09:39:22 crc kubenswrapper[4971]: I0320 09:39:22.809015 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" podStartSLOduration=2.334177162 podStartE2EDuration="2.808994324s" podCreationTimestamp="2026-03-20 09:39:20 +0000 UTC" 
firstStartedPulling="2026-03-20 09:39:21.808220511 +0000 UTC m=+10183.788094649" lastFinishedPulling="2026-03-20 09:39:22.283037673 +0000 UTC m=+10184.262911811" observedRunningTime="2026-03-20 09:39:22.80245471 +0000 UTC m=+10184.782328848" watchObservedRunningTime="2026-03-20 09:39:22.808994324 +0000 UTC m=+10184.788868462" Mar 20 09:39:28 crc kubenswrapper[4971]: I0320 09:39:28.824690 4971 generic.go:334] "Generic (PLEG): container finished" podID="2c66777a-99c1-456c-80c8-42affde92cef" containerID="e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060" exitCode=0 Mar 20 09:39:28 crc kubenswrapper[4971]: I0320 09:39:28.824788 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5hfj" event={"ID":"2c66777a-99c1-456c-80c8-42affde92cef","Type":"ContainerDied","Data":"e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060"} Mar 20 09:39:29 crc kubenswrapper[4971]: I0320 09:39:29.844250 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5hfj" event={"ID":"2c66777a-99c1-456c-80c8-42affde92cef","Type":"ContainerStarted","Data":"bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f"} Mar 20 09:39:29 crc kubenswrapper[4971]: I0320 09:39:29.865110 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m5hfj" podStartSLOduration=2.094783772 podStartE2EDuration="9.865089477s" podCreationTimestamp="2026-03-20 09:39:20 +0000 UTC" firstStartedPulling="2026-03-20 09:39:21.753419985 +0000 UTC m=+10183.733294113" lastFinishedPulling="2026-03-20 09:39:29.52372568 +0000 UTC m=+10191.503599818" observedRunningTime="2026-03-20 09:39:29.861213164 +0000 UTC m=+10191.841087312" watchObservedRunningTime="2026-03-20 09:39:29.865089477 +0000 UTC m=+10191.844963605" Mar 20 09:39:30 crc kubenswrapper[4971]: I0320 09:39:30.979580 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:30 crc kubenswrapper[4971]: I0320 09:39:30.979839 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:32 crc kubenswrapper[4971]: I0320 09:39:32.027189 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m5hfj" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="registry-server" probeResult="failure" output=< Mar 20 09:39:32 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 09:39:32 crc kubenswrapper[4971]: > Mar 20 09:39:41 crc kubenswrapper[4971]: I0320 09:39:41.034873 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:41 crc kubenswrapper[4971]: I0320 09:39:41.086208 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:41 crc kubenswrapper[4971]: I0320 09:39:41.275296 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5hfj"] Mar 20 09:39:42 crc kubenswrapper[4971]: I0320 09:39:42.972305 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m5hfj" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="registry-server" containerID="cri-o://bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f" gracePeriod=2 Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.494494 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.696123 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-catalog-content\") pod \"2c66777a-99c1-456c-80c8-42affde92cef\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.706819 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-utilities\") pod \"2c66777a-99c1-456c-80c8-42affde92cef\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.706894 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2jh6\" (UniqueName: \"kubernetes.io/projected/2c66777a-99c1-456c-80c8-42affde92cef-kube-api-access-f2jh6\") pod \"2c66777a-99c1-456c-80c8-42affde92cef\" (UID: \"2c66777a-99c1-456c-80c8-42affde92cef\") " Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.708290 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-utilities" (OuterVolumeSpecName: "utilities") pod "2c66777a-99c1-456c-80c8-42affde92cef" (UID: "2c66777a-99c1-456c-80c8-42affde92cef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.713429 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c66777a-99c1-456c-80c8-42affde92cef-kube-api-access-f2jh6" (OuterVolumeSpecName: "kube-api-access-f2jh6") pod "2c66777a-99c1-456c-80c8-42affde92cef" (UID: "2c66777a-99c1-456c-80c8-42affde92cef"). InnerVolumeSpecName "kube-api-access-f2jh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.810711 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.810765 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2jh6\" (UniqueName: \"kubernetes.io/projected/2c66777a-99c1-456c-80c8-42affde92cef-kube-api-access-f2jh6\") on node \"crc\" DevicePath \"\"" Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.826977 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c66777a-99c1-456c-80c8-42affde92cef" (UID: "2c66777a-99c1-456c-80c8-42affde92cef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.912235 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c66777a-99c1-456c-80c8-42affde92cef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.984522 4971 generic.go:334] "Generic (PLEG): container finished" podID="2c66777a-99c1-456c-80c8-42affde92cef" containerID="bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f" exitCode=0 Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.984572 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5hfj" event={"ID":"2c66777a-99c1-456c-80c8-42affde92cef","Type":"ContainerDied","Data":"bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f"} Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.984605 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-m5hfj" event={"ID":"2c66777a-99c1-456c-80c8-42affde92cef","Type":"ContainerDied","Data":"ac9040417c6a4749e52a5e8e4e2d696e8b6a71d7636a03151098eb685cacaa50"} Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.984635 4971 scope.go:117] "RemoveContainer" containerID="bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f" Mar 20 09:39:43 crc kubenswrapper[4971]: I0320 09:39:43.984640 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5hfj" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.036219 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5hfj"] Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.039530 4971 scope.go:117] "RemoveContainer" containerID="e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.046520 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m5hfj"] Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.070794 4971 scope.go:117] "RemoveContainer" containerID="82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.121650 4971 scope.go:117] "RemoveContainer" containerID="bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f" Mar 20 09:39:44 crc kubenswrapper[4971]: E0320 09:39:44.122063 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f\": container with ID starting with bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f not found: ID does not exist" containerID="bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.122116 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f"} err="failed to get container status \"bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f\": rpc error: code = NotFound desc = could not find container \"bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f\": container with ID starting with bd33c33f032b94de47f381c81de7cc4387b6aa523486f4e05f16e00d81d3c22f not found: ID does not exist" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.122141 4971 scope.go:117] "RemoveContainer" containerID="e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060" Mar 20 09:39:44 crc kubenswrapper[4971]: E0320 09:39:44.122463 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060\": container with ID starting with e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060 not found: ID does not exist" containerID="e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.122540 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060"} err="failed to get container status \"e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060\": rpc error: code = NotFound desc = could not find container \"e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060\": container with ID starting with e22ee15be614847bb26097ee941b7e8becb1ec09a81c541a72b3ad5402d5e060 not found: ID does not exist" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.122584 4971 scope.go:117] "RemoveContainer" containerID="82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc" Mar 20 09:39:44 crc kubenswrapper[4971]: E0320 
09:39:44.123209 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc\": container with ID starting with 82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc not found: ID does not exist" containerID="82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.123239 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc"} err="failed to get container status \"82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc\": rpc error: code = NotFound desc = could not find container \"82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc\": container with ID starting with 82c2fee36d840ef93345a176677922774b629fe170c4a7cf5569669ed6ad64dc not found: ID does not exist" Mar 20 09:39:44 crc kubenswrapper[4971]: I0320 09:39:44.743466 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c66777a-99c1-456c-80c8-42affde92cef" path="/var/lib/kubelet/pods/2c66777a-99c1-456c-80c8-42affde92cef/volumes" Mar 20 09:39:50 crc kubenswrapper[4971]: I0320 09:39:50.162208 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:39:50 crc kubenswrapper[4971]: I0320 09:39:50.162747 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.147502 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566660-h97zl"] Mar 20 09:40:00 crc kubenswrapper[4971]: E0320 09:40:00.148694 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="extract-utilities" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.148712 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="extract-utilities" Mar 20 09:40:00 crc kubenswrapper[4971]: E0320 09:40:00.148741 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="extract-content" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.148749 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="extract-content" Mar 20 09:40:00 crc kubenswrapper[4971]: E0320 09:40:00.148786 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="registry-server" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.148794 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="registry-server" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.149017 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c66777a-99c1-456c-80c8-42affde92cef" containerName="registry-server" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.149880 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-h97zl" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.152488 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.152487 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.154478 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.181037 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mrtq\" (UniqueName: \"kubernetes.io/projected/8a6069cb-9293-4b44-92a9-f300b0343fff-kube-api-access-5mrtq\") pod \"auto-csr-approver-29566660-h97zl\" (UID: \"8a6069cb-9293-4b44-92a9-f300b0343fff\") " pod="openshift-infra/auto-csr-approver-29566660-h97zl" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.188706 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566660-h97zl"] Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.284086 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mrtq\" (UniqueName: \"kubernetes.io/projected/8a6069cb-9293-4b44-92a9-f300b0343fff-kube-api-access-5mrtq\") pod \"auto-csr-approver-29566660-h97zl\" (UID: \"8a6069cb-9293-4b44-92a9-f300b0343fff\") " pod="openshift-infra/auto-csr-approver-29566660-h97zl" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.301667 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mrtq\" (UniqueName: \"kubernetes.io/projected/8a6069cb-9293-4b44-92a9-f300b0343fff-kube-api-access-5mrtq\") pod \"auto-csr-approver-29566660-h97zl\" (UID: \"8a6069cb-9293-4b44-92a9-f300b0343fff\") " 
pod="openshift-infra/auto-csr-approver-29566660-h97zl" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.505822 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-h97zl" Mar 20 09:40:00 crc kubenswrapper[4971]: I0320 09:40:00.956850 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566660-h97zl"] Mar 20 09:40:01 crc kubenswrapper[4971]: I0320 09:40:01.150721 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566660-h97zl" event={"ID":"8a6069cb-9293-4b44-92a9-f300b0343fff","Type":"ContainerStarted","Data":"844b8d917ea5a5e6d3c3ae7a268487f0ceab5cda2bd5bc9b519d0fef74895406"} Mar 20 09:40:03 crc kubenswrapper[4971]: I0320 09:40:03.177756 4971 generic.go:334] "Generic (PLEG): container finished" podID="8a6069cb-9293-4b44-92a9-f300b0343fff" containerID="d5758dff12c9d3ec1f99c9316590f65df7ebbc05dfb95685e7901ce5b54f3467" exitCode=0 Mar 20 09:40:03 crc kubenswrapper[4971]: I0320 09:40:03.178181 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566660-h97zl" event={"ID":"8a6069cb-9293-4b44-92a9-f300b0343fff","Type":"ContainerDied","Data":"d5758dff12c9d3ec1f99c9316590f65df7ebbc05dfb95685e7901ce5b54f3467"} Mar 20 09:40:04 crc kubenswrapper[4971]: I0320 09:40:04.515633 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-h97zl" Mar 20 09:40:04 crc kubenswrapper[4971]: I0320 09:40:04.667989 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mrtq\" (UniqueName: \"kubernetes.io/projected/8a6069cb-9293-4b44-92a9-f300b0343fff-kube-api-access-5mrtq\") pod \"8a6069cb-9293-4b44-92a9-f300b0343fff\" (UID: \"8a6069cb-9293-4b44-92a9-f300b0343fff\") " Mar 20 09:40:04 crc kubenswrapper[4971]: I0320 09:40:04.675970 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6069cb-9293-4b44-92a9-f300b0343fff-kube-api-access-5mrtq" (OuterVolumeSpecName: "kube-api-access-5mrtq") pod "8a6069cb-9293-4b44-92a9-f300b0343fff" (UID: "8a6069cb-9293-4b44-92a9-f300b0343fff"). InnerVolumeSpecName "kube-api-access-5mrtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:40:04 crc kubenswrapper[4971]: I0320 09:40:04.770692 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mrtq\" (UniqueName: \"kubernetes.io/projected/8a6069cb-9293-4b44-92a9-f300b0343fff-kube-api-access-5mrtq\") on node \"crc\" DevicePath \"\"" Mar 20 09:40:05 crc kubenswrapper[4971]: I0320 09:40:05.204101 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566660-h97zl" event={"ID":"8a6069cb-9293-4b44-92a9-f300b0343fff","Type":"ContainerDied","Data":"844b8d917ea5a5e6d3c3ae7a268487f0ceab5cda2bd5bc9b519d0fef74895406"} Mar 20 09:40:05 crc kubenswrapper[4971]: I0320 09:40:05.204397 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844b8d917ea5a5e6d3c3ae7a268487f0ceab5cda2bd5bc9b519d0fef74895406" Mar 20 09:40:05 crc kubenswrapper[4971]: I0320 09:40:05.204191 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-h97zl" Mar 20 09:40:05 crc kubenswrapper[4971]: I0320 09:40:05.615152 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566654-gknz7"] Mar 20 09:40:05 crc kubenswrapper[4971]: I0320 09:40:05.626857 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566654-gknz7"] Mar 20 09:40:06 crc kubenswrapper[4971]: I0320 09:40:06.743587 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02aa84e2-acb0-43df-ae96-cca603f5bbed" path="/var/lib/kubelet/pods/02aa84e2-acb0-43df-ae96-cca603f5bbed/volumes" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.718325 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vznc6"] Mar 20 09:40:12 crc kubenswrapper[4971]: E0320 09:40:12.720391 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6069cb-9293-4b44-92a9-f300b0343fff" containerName="oc" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.720407 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6069cb-9293-4b44-92a9-f300b0343fff" containerName="oc" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.720696 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6069cb-9293-4b44-92a9-f300b0343fff" containerName="oc" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.723964 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.754888 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vznc6"] Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.851327 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-utilities\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.851502 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwssv\" (UniqueName: \"kubernetes.io/projected/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-kube-api-access-wwssv\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.851593 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-catalog-content\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.953213 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-utilities\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.953328 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wwssv\" (UniqueName: \"kubernetes.io/projected/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-kube-api-access-wwssv\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.953379 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-catalog-content\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.953734 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-utilities\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.954006 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-catalog-content\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:12 crc kubenswrapper[4971]: I0320 09:40:12.978826 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwssv\" (UniqueName: \"kubernetes.io/projected/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-kube-api-access-wwssv\") pod \"redhat-marketplace-vznc6\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:13 crc kubenswrapper[4971]: I0320 09:40:13.048766 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:13 crc kubenswrapper[4971]: I0320 09:40:13.611735 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vznc6"] Mar 20 09:40:14 crc kubenswrapper[4971]: I0320 09:40:14.318347 4971 generic.go:334] "Generic (PLEG): container finished" podID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerID="2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a" exitCode=0 Mar 20 09:40:14 crc kubenswrapper[4971]: I0320 09:40:14.318394 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vznc6" event={"ID":"eee9c0c2-42c9-44bf-8d90-3cfed668f18a","Type":"ContainerDied","Data":"2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a"} Mar 20 09:40:14 crc kubenswrapper[4971]: I0320 09:40:14.318790 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vznc6" event={"ID":"eee9c0c2-42c9-44bf-8d90-3cfed668f18a","Type":"ContainerStarted","Data":"3e90c78328cba57cfdcedcf26ea5b683af3f4cc725137e35b7f6f0233404bcb1"} Mar 20 09:40:15 crc kubenswrapper[4971]: I0320 09:40:15.328790 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vznc6" event={"ID":"eee9c0c2-42c9-44bf-8d90-3cfed668f18a","Type":"ContainerStarted","Data":"b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc"} Mar 20 09:40:15 crc kubenswrapper[4971]: I0320 09:40:15.344866 4971 scope.go:117] "RemoveContainer" containerID="b3277569a2bdf8b887b39f1f01a939b63f6f95d5e634297cf53c1fc2a27de2b4" Mar 20 09:40:16 crc kubenswrapper[4971]: I0320 09:40:16.339137 4971 generic.go:334] "Generic (PLEG): container finished" podID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerID="b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc" exitCode=0 Mar 20 09:40:16 crc kubenswrapper[4971]: I0320 09:40:16.339175 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vznc6" event={"ID":"eee9c0c2-42c9-44bf-8d90-3cfed668f18a","Type":"ContainerDied","Data":"b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc"} Mar 20 09:40:17 crc kubenswrapper[4971]: I0320 09:40:17.349210 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vznc6" event={"ID":"eee9c0c2-42c9-44bf-8d90-3cfed668f18a","Type":"ContainerStarted","Data":"f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a"} Mar 20 09:40:17 crc kubenswrapper[4971]: I0320 09:40:17.376338 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vznc6" podStartSLOduration=2.898348934 podStartE2EDuration="5.376317755s" podCreationTimestamp="2026-03-20 09:40:12 +0000 UTC" firstStartedPulling="2026-03-20 09:40:14.324592594 +0000 UTC m=+10236.304466722" lastFinishedPulling="2026-03-20 09:40:16.802561415 +0000 UTC m=+10238.782435543" observedRunningTime="2026-03-20 09:40:17.372078513 +0000 UTC m=+10239.351952661" watchObservedRunningTime="2026-03-20 09:40:17.376317755 +0000 UTC m=+10239.356191903" Mar 20 09:40:20 crc kubenswrapper[4971]: I0320 09:40:20.162326 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:40:20 crc kubenswrapper[4971]: I0320 09:40:20.162906 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:40:23 crc kubenswrapper[4971]: I0320 
09:40:23.049556 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:23 crc kubenswrapper[4971]: I0320 09:40:23.050058 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:23 crc kubenswrapper[4971]: I0320 09:40:23.129628 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:23 crc kubenswrapper[4971]: I0320 09:40:23.937545 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:23 crc kubenswrapper[4971]: I0320 09:40:23.988150 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vznc6"] Mar 20 09:40:25 crc kubenswrapper[4971]: I0320 09:40:25.438866 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vznc6" podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerName="registry-server" containerID="cri-o://f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a" gracePeriod=2 Mar 20 09:40:25 crc kubenswrapper[4971]: I0320 09:40:25.956016 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.115884 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwssv\" (UniqueName: \"kubernetes.io/projected/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-kube-api-access-wwssv\") pod \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.115944 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-utilities\") pod \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.116002 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-catalog-content\") pod \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\" (UID: \"eee9c0c2-42c9-44bf-8d90-3cfed668f18a\") " Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.116962 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-utilities" (OuterVolumeSpecName: "utilities") pod "eee9c0c2-42c9-44bf-8d90-3cfed668f18a" (UID: "eee9c0c2-42c9-44bf-8d90-3cfed668f18a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.124861 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-kube-api-access-wwssv" (OuterVolumeSpecName: "kube-api-access-wwssv") pod "eee9c0c2-42c9-44bf-8d90-3cfed668f18a" (UID: "eee9c0c2-42c9-44bf-8d90-3cfed668f18a"). InnerVolumeSpecName "kube-api-access-wwssv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.144703 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eee9c0c2-42c9-44bf-8d90-3cfed668f18a" (UID: "eee9c0c2-42c9-44bf-8d90-3cfed668f18a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.218823 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwssv\" (UniqueName: \"kubernetes.io/projected/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-kube-api-access-wwssv\") on node \"crc\" DevicePath \"\"" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.218863 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.218877 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee9c0c2-42c9-44bf-8d90-3cfed668f18a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.449091 4971 generic.go:334] "Generic (PLEG): container finished" podID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerID="f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a" exitCode=0 Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.449162 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vznc6" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.449147 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vznc6" event={"ID":"eee9c0c2-42c9-44bf-8d90-3cfed668f18a","Type":"ContainerDied","Data":"f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a"} Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.449311 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vznc6" event={"ID":"eee9c0c2-42c9-44bf-8d90-3cfed668f18a","Type":"ContainerDied","Data":"3e90c78328cba57cfdcedcf26ea5b683af3f4cc725137e35b7f6f0233404bcb1"} Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.449342 4971 scope.go:117] "RemoveContainer" containerID="f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.489823 4971 scope.go:117] "RemoveContainer" containerID="b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.493419 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vznc6"] Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.514148 4971 scope.go:117] "RemoveContainer" containerID="2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.514474 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vznc6"] Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.560933 4971 scope.go:117] "RemoveContainer" containerID="f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a" Mar 20 09:40:26 crc kubenswrapper[4971]: E0320 09:40:26.561458 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a\": container with ID starting with f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a not found: ID does not exist" containerID="f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.561489 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a"} err="failed to get container status \"f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a\": rpc error: code = NotFound desc = could not find container \"f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a\": container with ID starting with f14a384b13310a66475b53500f3ca3cb490652788e79c95aaaea2b28d187dd4a not found: ID does not exist" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.561511 4971 scope.go:117] "RemoveContainer" containerID="b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc" Mar 20 09:40:26 crc kubenswrapper[4971]: E0320 09:40:26.561875 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc\": container with ID starting with b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc not found: ID does not exist" containerID="b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.561920 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc"} err="failed to get container status \"b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc\": rpc error: code = NotFound desc = could not find container \"b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc\": container with ID 
starting with b5a9b2288c3803cb9b5fe5727d52954a97cdd838ef53fec01ae851515632dabc not found: ID does not exist" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.561947 4971 scope.go:117] "RemoveContainer" containerID="2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a" Mar 20 09:40:26 crc kubenswrapper[4971]: E0320 09:40:26.562296 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a\": container with ID starting with 2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a not found: ID does not exist" containerID="2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.562317 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a"} err="failed to get container status \"2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a\": rpc error: code = NotFound desc = could not find container \"2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a\": container with ID starting with 2308059adca50ab0938c03b2689089ea4cf34766f3b702d0db86b25a4b878c8a not found: ID does not exist" Mar 20 09:40:26 crc kubenswrapper[4971]: I0320 09:40:26.743636 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" path="/var/lib/kubelet/pods/eee9c0c2-42c9-44bf-8d90-3cfed668f18a/volumes" Mar 20 09:40:50 crc kubenswrapper[4971]: I0320 09:40:50.161930 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:40:50 crc kubenswrapper[4971]: I0320 
09:40:50.162504 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:40:50 crc kubenswrapper[4971]: I0320 09:40:50.162547 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:40:50 crc kubenswrapper[4971]: I0320 09:40:50.163259 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c432cb1c18813ff52f5f6bb68ab9bae999208871fd3cdfc75264abedf55542dd"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:40:50 crc kubenswrapper[4971]: I0320 09:40:50.163306 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://c432cb1c18813ff52f5f6bb68ab9bae999208871fd3cdfc75264abedf55542dd" gracePeriod=600 Mar 20 09:40:50 crc kubenswrapper[4971]: I0320 09:40:50.672873 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="c432cb1c18813ff52f5f6bb68ab9bae999208871fd3cdfc75264abedf55542dd" exitCode=0 Mar 20 09:40:50 crc kubenswrapper[4971]: I0320 09:40:50.672931 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"c432cb1c18813ff52f5f6bb68ab9bae999208871fd3cdfc75264abedf55542dd"} Mar 20 09:40:50 crc 
kubenswrapper[4971]: I0320 09:40:50.673526 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38"} Mar 20 09:40:50 crc kubenswrapper[4971]: I0320 09:40:50.673558 4971 scope.go:117] "RemoveContainer" containerID="bf941729054eb427a73262a724a392ea7406ec0eee565011ce7a80f6bb777365" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.150066 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566662-v9rbc"] Mar 20 09:42:00 crc kubenswrapper[4971]: E0320 09:42:00.151244 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerName="registry-server" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.151261 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerName="registry-server" Mar 20 09:42:00 crc kubenswrapper[4971]: E0320 09:42:00.151280 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerName="extract-utilities" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.151288 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerName="extract-utilities" Mar 20 09:42:00 crc kubenswrapper[4971]: E0320 09:42:00.151330 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerName="extract-content" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.151337 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerName="extract-content" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.151589 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eee9c0c2-42c9-44bf-8d90-3cfed668f18a" containerName="registry-server" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.152460 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.155425 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.155574 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.155715 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.164989 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566662-v9rbc"] Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.260951 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbv8k\" (UniqueName: \"kubernetes.io/projected/1c0be1fa-ddc6-47a4-b358-45671538ecbd-kube-api-access-rbv8k\") pod \"auto-csr-approver-29566662-v9rbc\" (UID: \"1c0be1fa-ddc6-47a4-b358-45671538ecbd\") " pod="openshift-infra/auto-csr-approver-29566662-v9rbc" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.363196 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbv8k\" (UniqueName: \"kubernetes.io/projected/1c0be1fa-ddc6-47a4-b358-45671538ecbd-kube-api-access-rbv8k\") pod \"auto-csr-approver-29566662-v9rbc\" (UID: \"1c0be1fa-ddc6-47a4-b358-45671538ecbd\") " pod="openshift-infra/auto-csr-approver-29566662-v9rbc" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.383521 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbv8k\" 
(UniqueName: \"kubernetes.io/projected/1c0be1fa-ddc6-47a4-b358-45671538ecbd-kube-api-access-rbv8k\") pod \"auto-csr-approver-29566662-v9rbc\" (UID: \"1c0be1fa-ddc6-47a4-b358-45671538ecbd\") " pod="openshift-infra/auto-csr-approver-29566662-v9rbc" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.475396 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.976809 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:42:00 crc kubenswrapper[4971]: I0320 09:42:00.988131 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566662-v9rbc"] Mar 20 09:42:01 crc kubenswrapper[4971]: I0320 09:42:01.396489 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" event={"ID":"1c0be1fa-ddc6-47a4-b358-45671538ecbd","Type":"ContainerStarted","Data":"f75ea1fbd5ccdcf91c86db8e764f29ed8804712cb356a4b9235a20f77ec4fc41"} Mar 20 09:42:02 crc kubenswrapper[4971]: I0320 09:42:02.405802 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" event={"ID":"1c0be1fa-ddc6-47a4-b358-45671538ecbd","Type":"ContainerStarted","Data":"a1daf1d03cb40f26949804760911a619b828d397af9a24fd7dd2e9019fe377da"} Mar 20 09:42:02 crc kubenswrapper[4971]: I0320 09:42:02.409000 4971 generic.go:334] "Generic (PLEG): container finished" podID="a8a4d9ea-5399-4aa0-9c74-5c6013261268" containerID="7ca4a5f5f7f2ee4c7842680235af9222e174210a1d3732ea7fc46f512b34bd6c" exitCode=0 Mar 20 09:42:02 crc kubenswrapper[4971]: I0320 09:42:02.409143 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" 
event={"ID":"a8a4d9ea-5399-4aa0-9c74-5c6013261268","Type":"ContainerDied","Data":"7ca4a5f5f7f2ee4c7842680235af9222e174210a1d3732ea7fc46f512b34bd6c"} Mar 20 09:42:02 crc kubenswrapper[4971]: I0320 09:42:02.440150 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" podStartSLOduration=1.6096422000000001 podStartE2EDuration="2.440130969s" podCreationTimestamp="2026-03-20 09:42:00 +0000 UTC" firstStartedPulling="2026-03-20 09:42:00.976541994 +0000 UTC m=+10342.956416132" lastFinishedPulling="2026-03-20 09:42:01.807030762 +0000 UTC m=+10343.786904901" observedRunningTime="2026-03-20 09:42:02.421050582 +0000 UTC m=+10344.400924720" watchObservedRunningTime="2026-03-20 09:42:02.440130969 +0000 UTC m=+10344.420005107" Mar 20 09:42:03 crc kubenswrapper[4971]: I0320 09:42:03.423284 4971 generic.go:334] "Generic (PLEG): container finished" podID="1c0be1fa-ddc6-47a4-b358-45671538ecbd" containerID="a1daf1d03cb40f26949804760911a619b828d397af9a24fd7dd2e9019fe377da" exitCode=0 Mar 20 09:42:03 crc kubenswrapper[4971]: I0320 09:42:03.423333 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" event={"ID":"1c0be1fa-ddc6-47a4-b358-45671538ecbd","Type":"ContainerDied","Data":"a1daf1d03cb40f26949804760911a619b828d397af9a24fd7dd2e9019fe377da"} Mar 20 09:42:03 crc kubenswrapper[4971]: I0320 09:42:03.895863 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.033596 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-1\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.033979 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ssh-key-openstack-cell1\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034013 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-3\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034036 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-1\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034084 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-2\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 
crc kubenswrapper[4971]: I0320 09:42:04.034106 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ceph\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034127 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xddzf\" (UniqueName: \"kubernetes.io/projected/a8a4d9ea-5399-4aa0-9c74-5c6013261268-kube-api-access-xddzf\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034149 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-inventory\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034225 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-0\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034647 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-combined-ca-bundle\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034684 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-0\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034744 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-1\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.034771 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-0\") pod \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\" (UID: \"a8a4d9ea-5399-4aa0-9c74-5c6013261268\") " Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.041993 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a4d9ea-5399-4aa0-9c74-5c6013261268-kube-api-access-xddzf" (OuterVolumeSpecName: "kube-api-access-xddzf") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "kube-api-access-xddzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.043211 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ceph" (OuterVolumeSpecName: "ceph") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.070794 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.075253 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.080291 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.080881 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.086863 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.089450 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.089551 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.091628 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.092395 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-inventory" (OuterVolumeSpecName: "inventory") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.113003 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.119086 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a8a4d9ea-5399-4aa0-9c74-5c6013261268" (UID: "a8a4d9ea-5399-4aa0-9c74-5c6013261268"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148677 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148716 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148732 4971 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148741 4971 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148750 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148759 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148770 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148780 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148788 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148796 4971 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-ceph\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148804 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xddzf\" (UniqueName: \"kubernetes.io/projected/a8a4d9ea-5399-4aa0-9c74-5c6013261268-kube-api-access-xddzf\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148813 4971 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.148821 4971 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a8a4d9ea-5399-4aa0-9c74-5c6013261268-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.437893 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.439725 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f" event={"ID":"a8a4d9ea-5399-4aa0-9c74-5c6013261268","Type":"ContainerDied","Data":"7cc2eee953545964187cbcb274b4e6fa6c6e856369a5a1d9ada54d4f6310ab8c"} Mar 20 09:42:04 crc kubenswrapper[4971]: I0320 09:42:04.439806 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cc2eee953545964187cbcb274b4e6fa6c6e856369a5a1d9ada54d4f6310ab8c" Mar 20 09:42:05 crc kubenswrapper[4971]: I0320 09:42:05.446880 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" event={"ID":"1c0be1fa-ddc6-47a4-b358-45671538ecbd","Type":"ContainerDied","Data":"f75ea1fbd5ccdcf91c86db8e764f29ed8804712cb356a4b9235a20f77ec4fc41"} Mar 20 09:42:05 crc kubenswrapper[4971]: I0320 09:42:05.447198 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f75ea1fbd5ccdcf91c86db8e764f29ed8804712cb356a4b9235a20f77ec4fc41" Mar 20 09:42:05 crc kubenswrapper[4971]: I0320 09:42:05.508014 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" Mar 20 09:42:05 crc kubenswrapper[4971]: I0320 09:42:05.576804 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbv8k\" (UniqueName: \"kubernetes.io/projected/1c0be1fa-ddc6-47a4-b358-45671538ecbd-kube-api-access-rbv8k\") pod \"1c0be1fa-ddc6-47a4-b358-45671538ecbd\" (UID: \"1c0be1fa-ddc6-47a4-b358-45671538ecbd\") " Mar 20 09:42:05 crc kubenswrapper[4971]: I0320 09:42:05.585840 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0be1fa-ddc6-47a4-b358-45671538ecbd-kube-api-access-rbv8k" (OuterVolumeSpecName: "kube-api-access-rbv8k") pod "1c0be1fa-ddc6-47a4-b358-45671538ecbd" (UID: "1c0be1fa-ddc6-47a4-b358-45671538ecbd"). InnerVolumeSpecName "kube-api-access-rbv8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:42:05 crc kubenswrapper[4971]: I0320 09:42:05.679662 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbv8k\" (UniqueName: \"kubernetes.io/projected/1c0be1fa-ddc6-47a4-b358-45671538ecbd-kube-api-access-rbv8k\") on node \"crc\" DevicePath \"\"" Mar 20 09:42:06 crc kubenswrapper[4971]: I0320 09:42:06.454994 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-v9rbc" Mar 20 09:42:06 crc kubenswrapper[4971]: I0320 09:42:06.571670 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566656-m67l8"] Mar 20 09:42:06 crc kubenswrapper[4971]: I0320 09:42:06.583126 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566656-m67l8"] Mar 20 09:42:06 crc kubenswrapper[4971]: I0320 09:42:06.743976 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ce3374-468c-484b-ad4e-8504c953be9e" path="/var/lib/kubelet/pods/95ce3374-468c-484b-ad4e-8504c953be9e/volumes" Mar 20 09:42:15 crc kubenswrapper[4971]: I0320 09:42:15.488375 4971 scope.go:117] "RemoveContainer" containerID="1942874c5f0b75db1f3528d2bfd046939bb803299a577c2d2198d3f942b5c31e" Mar 20 09:42:31 crc kubenswrapper[4971]: E0320 09:42:31.968698 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.119:60102->38.102.83.119:38499: write tcp 38.102.83.119:60102->38.102.83.119:38499: write: broken pipe Mar 20 09:42:50 crc kubenswrapper[4971]: I0320 09:42:50.162772 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:42:50 crc kubenswrapper[4971]: I0320 09:42:50.163293 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:43:20 crc kubenswrapper[4971]: I0320 09:43:20.162912 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:43:20 crc kubenswrapper[4971]: I0320 09:43:20.163450 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:43:35 crc kubenswrapper[4971]: I0320 09:43:35.730373 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 20 09:43:35 crc kubenswrapper[4971]: I0320 09:43:35.731787 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12" containerName="adoption" containerID="cri-o://1789c6a8f27abcd140c930426134a0b6bab5c49b07acc356ef45a8cb0e9bbd8c" gracePeriod=30 Mar 20 09:43:50 crc kubenswrapper[4971]: I0320 09:43:50.162288 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:43:50 crc kubenswrapper[4971]: I0320 09:43:50.162775 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:43:50 crc kubenswrapper[4971]: I0320 09:43:50.162819 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:43:50 crc kubenswrapper[4971]: I0320 09:43:50.163668 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:43:50 crc kubenswrapper[4971]: I0320 09:43:50.163722 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" gracePeriod=600 Mar 20 09:43:50 crc kubenswrapper[4971]: E0320 09:43:50.282777 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:43:51 crc kubenswrapper[4971]: I0320 09:43:51.136290 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" exitCode=0 Mar 20 09:43:51 crc kubenswrapper[4971]: I0320 09:43:51.136326 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38"} Mar 20 09:43:51 crc 
kubenswrapper[4971]: I0320 09:43:51.136370 4971 scope.go:117] "RemoveContainer" containerID="c432cb1c18813ff52f5f6bb68ab9bae999208871fd3cdfc75264abedf55542dd" Mar 20 09:43:51 crc kubenswrapper[4971]: I0320 09:43:51.137110 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:43:51 crc kubenswrapper[4971]: E0320 09:43:51.137480 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.158076 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566664-hwmxc"] Mar 20 09:44:00 crc kubenswrapper[4971]: E0320 09:44:00.159168 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0be1fa-ddc6-47a4-b358-45671538ecbd" containerName="oc" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.159183 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0be1fa-ddc6-47a4-b358-45671538ecbd" containerName="oc" Mar 20 09:44:00 crc kubenswrapper[4971]: E0320 09:44:00.159221 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a4d9ea-5399-4aa0-9c74-5c6013261268" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.159229 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a4d9ea-5399-4aa0-9c74-5c6013261268" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.159437 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1c0be1fa-ddc6-47a4-b358-45671538ecbd" containerName="oc" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.159471 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a4d9ea-5399-4aa0-9c74-5c6013261268" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.160280 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566664-hwmxc" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.162737 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.163676 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.164960 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.182983 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566664-hwmxc"] Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.266346 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56rsj\" (UniqueName: \"kubernetes.io/projected/fb38638e-107a-4d81-9804-451c2402ea1d-kube-api-access-56rsj\") pod \"auto-csr-approver-29566664-hwmxc\" (UID: \"fb38638e-107a-4d81-9804-451c2402ea1d\") " pod="openshift-infra/auto-csr-approver-29566664-hwmxc" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.369168 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56rsj\" (UniqueName: \"kubernetes.io/projected/fb38638e-107a-4d81-9804-451c2402ea1d-kube-api-access-56rsj\") pod \"auto-csr-approver-29566664-hwmxc\" (UID: 
\"fb38638e-107a-4d81-9804-451c2402ea1d\") " pod="openshift-infra/auto-csr-approver-29566664-hwmxc" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.502354 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rsj\" (UniqueName: \"kubernetes.io/projected/fb38638e-107a-4d81-9804-451c2402ea1d-kube-api-access-56rsj\") pod \"auto-csr-approver-29566664-hwmxc\" (UID: \"fb38638e-107a-4d81-9804-451c2402ea1d\") " pod="openshift-infra/auto-csr-approver-29566664-hwmxc" Mar 20 09:44:00 crc kubenswrapper[4971]: I0320 09:44:00.783956 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566664-hwmxc" Mar 20 09:44:01 crc kubenswrapper[4971]: I0320 09:44:01.249151 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566664-hwmxc"] Mar 20 09:44:02 crc kubenswrapper[4971]: I0320 09:44:02.254836 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566664-hwmxc" event={"ID":"fb38638e-107a-4d81-9804-451c2402ea1d","Type":"ContainerStarted","Data":"b53d5fddf58e29f65a866b4f2af130f43239533632b746318a0cacd9c1599c74"} Mar 20 09:44:02 crc kubenswrapper[4971]: I0320 09:44:02.733956 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:44:02 crc kubenswrapper[4971]: E0320 09:44:02.734897 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:44:03 crc kubenswrapper[4971]: I0320 09:44:03.266302 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="fb38638e-107a-4d81-9804-451c2402ea1d" containerID="3466e07d2dd74abd1975bbfe928575bff23ae1bc04bff2d739812a6c18ac6a43" exitCode=0 Mar 20 09:44:03 crc kubenswrapper[4971]: I0320 09:44:03.266838 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566664-hwmxc" event={"ID":"fb38638e-107a-4d81-9804-451c2402ea1d","Type":"ContainerDied","Data":"3466e07d2dd74abd1975bbfe928575bff23ae1bc04bff2d739812a6c18ac6a43"} Mar 20 09:44:04 crc kubenswrapper[4971]: I0320 09:44:04.612172 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566664-hwmxc" Mar 20 09:44:04 crc kubenswrapper[4971]: I0320 09:44:04.763210 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56rsj\" (UniqueName: \"kubernetes.io/projected/fb38638e-107a-4d81-9804-451c2402ea1d-kube-api-access-56rsj\") pod \"fb38638e-107a-4d81-9804-451c2402ea1d\" (UID: \"fb38638e-107a-4d81-9804-451c2402ea1d\") " Mar 20 09:44:04 crc kubenswrapper[4971]: I0320 09:44:04.768704 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb38638e-107a-4d81-9804-451c2402ea1d-kube-api-access-56rsj" (OuterVolumeSpecName: "kube-api-access-56rsj") pod "fb38638e-107a-4d81-9804-451c2402ea1d" (UID: "fb38638e-107a-4d81-9804-451c2402ea1d"). InnerVolumeSpecName "kube-api-access-56rsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:44:04 crc kubenswrapper[4971]: I0320 09:44:04.866168 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56rsj\" (UniqueName: \"kubernetes.io/projected/fb38638e-107a-4d81-9804-451c2402ea1d-kube-api-access-56rsj\") on node \"crc\" DevicePath \"\"" Mar 20 09:44:05 crc kubenswrapper[4971]: I0320 09:44:05.288972 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566664-hwmxc" event={"ID":"fb38638e-107a-4d81-9804-451c2402ea1d","Type":"ContainerDied","Data":"b53d5fddf58e29f65a866b4f2af130f43239533632b746318a0cacd9c1599c74"} Mar 20 09:44:05 crc kubenswrapper[4971]: I0320 09:44:05.289027 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b53d5fddf58e29f65a866b4f2af130f43239533632b746318a0cacd9c1599c74" Mar 20 09:44:05 crc kubenswrapper[4971]: I0320 09:44:05.289066 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566664-hwmxc" Mar 20 09:44:05 crc kubenswrapper[4971]: I0320 09:44:05.689129 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566658-wsndq"] Mar 20 09:44:05 crc kubenswrapper[4971]: I0320 09:44:05.704919 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566658-wsndq"] Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.305136 4971 generic.go:334] "Generic (PLEG): container finished" podID="e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12" containerID="1789c6a8f27abcd140c930426134a0b6bab5c49b07acc356ef45a8cb0e9bbd8c" exitCode=137 Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.305173 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12","Type":"ContainerDied","Data":"1789c6a8f27abcd140c930426134a0b6bab5c49b07acc356ef45a8cb0e9bbd8c"} Mar 20 
09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.305484 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12","Type":"ContainerDied","Data":"d1361c475a265b7fd97a29c9aff58896821a47fe3df136e06084025426a40e75"} Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.305499 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1361c475a265b7fd97a29c9aff58896821a47fe3df136e06084025426a40e75" Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.306285 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.499384 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cmgd\" (UniqueName: \"kubernetes.io/projected/e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12-kube-api-access-2cmgd\") pod \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") " Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.500514 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\") pod \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\" (UID: \"e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12\") " Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.505527 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12-kube-api-access-2cmgd" (OuterVolumeSpecName: "kube-api-access-2cmgd") pod "e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12" (UID: "e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12"). InnerVolumeSpecName "kube-api-access-2cmgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.520504 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f" (OuterVolumeSpecName: "mariadb-data") pod "e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12" (UID: "e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12"). InnerVolumeSpecName "pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.603415 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cmgd\" (UniqueName: \"kubernetes.io/projected/e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12-kube-api-access-2cmgd\") on node \"crc\" DevicePath \"\"" Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.603476 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\") on node \"crc\" " Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.627779 4971 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.627947 4971 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f") on node "crc" Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.705934 4971 reconciler_common.go:293] "Volume detached for volume \"pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-737c1c9e-6ada-4804-b275-4270f95b7a2f\") on node \"crc\" DevicePath \"\"" Mar 20 09:44:06 crc kubenswrapper[4971]: I0320 09:44:06.744233 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ddea0e1-b76c-4830-9489-4c0b2fa997f3" path="/var/lib/kubelet/pods/8ddea0e1-b76c-4830-9489-4c0b2fa997f3/volumes" Mar 20 09:44:07 crc kubenswrapper[4971]: I0320 09:44:07.314955 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 20 09:44:07 crc kubenswrapper[4971]: I0320 09:44:07.341534 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 20 09:44:07 crc kubenswrapper[4971]: I0320 09:44:07.350263 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Mar 20 09:44:07 crc kubenswrapper[4971]: I0320 09:44:07.919663 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 20 09:44:07 crc kubenswrapper[4971]: I0320 09:44:07.919900 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="a20e6b72-7aa2-4468-b8e3-e13011048d27" containerName="adoption" containerID="cri-o://8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7" gracePeriod=30 Mar 20 09:44:08 crc kubenswrapper[4971]: I0320 09:44:08.748886 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12" path="/var/lib/kubelet/pods/e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12/volumes" Mar 20 09:44:15 crc kubenswrapper[4971]: I0320 09:44:15.576644 4971 scope.go:117] "RemoveContainer" containerID="cf9ce687aa3125da7ca444672c9b80e615ec911a2c440453b4e8f059c233b082" Mar 20 09:44:15 crc kubenswrapper[4971]: I0320 09:44:15.628513 4971 scope.go:117] "RemoveContainer" containerID="1789c6a8f27abcd140c930426134a0b6bab5c49b07acc356ef45a8cb0e9bbd8c" Mar 20 09:44:16 crc kubenswrapper[4971]: I0320 09:44:16.732502 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:44:16 crc kubenswrapper[4971]: E0320 09:44:16.733256 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:44:31 crc kubenswrapper[4971]: I0320 09:44:31.732758 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:44:31 crc kubenswrapper[4971]: E0320 09:44:31.733831 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.464148 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.590073 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr7mg\" (UniqueName: \"kubernetes.io/projected/a20e6b72-7aa2-4468-b8e3-e13011048d27-kube-api-access-dr7mg\") pod \"a20e6b72-7aa2-4468-b8e3-e13011048d27\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.590165 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a20e6b72-7aa2-4468-b8e3-e13011048d27-ovn-data-cert\") pod \"a20e6b72-7aa2-4468-b8e3-e13011048d27\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.591013 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\") pod \"a20e6b72-7aa2-4468-b8e3-e13011048d27\" (UID: \"a20e6b72-7aa2-4468-b8e3-e13011048d27\") " Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.595880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20e6b72-7aa2-4468-b8e3-e13011048d27-kube-api-access-dr7mg" (OuterVolumeSpecName: "kube-api-access-dr7mg") pod "a20e6b72-7aa2-4468-b8e3-e13011048d27" (UID: "a20e6b72-7aa2-4468-b8e3-e13011048d27"). InnerVolumeSpecName "kube-api-access-dr7mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.597796 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20e6b72-7aa2-4468-b8e3-e13011048d27-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "a20e6b72-7aa2-4468-b8e3-e13011048d27" (UID: "a20e6b72-7aa2-4468-b8e3-e13011048d27"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.606747 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a" (OuterVolumeSpecName: "ovn-data") pod "a20e6b72-7aa2-4468-b8e3-e13011048d27" (UID: "a20e6b72-7aa2-4468-b8e3-e13011048d27"). InnerVolumeSpecName "pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.670287 4971 generic.go:334] "Generic (PLEG): container finished" podID="a20e6b72-7aa2-4468-b8e3-e13011048d27" containerID="8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7" exitCode=137 Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.670334 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.670342 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a20e6b72-7aa2-4468-b8e3-e13011048d27","Type":"ContainerDied","Data":"8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7"} Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.670370 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a20e6b72-7aa2-4468-b8e3-e13011048d27","Type":"ContainerDied","Data":"e855ba31df1a2ff8e58e2e04c16bdf015d243ff41bb4e0cd8057a25fefd79739"} Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.670389 4971 scope.go:117] "RemoveContainer" containerID="8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.694684 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\") on node \"crc\" " Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.695061 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr7mg\" (UniqueName: \"kubernetes.io/projected/a20e6b72-7aa2-4468-b8e3-e13011048d27-kube-api-access-dr7mg\") on node \"crc\" DevicePath \"\"" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.695077 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a20e6b72-7aa2-4468-b8e3-e13011048d27-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.705325 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.714954 4971 scope.go:117] "RemoveContainer" containerID="8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7" Mar 20 09:44:38 crc kubenswrapper[4971]: E0320 09:44:38.715739 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7\": container with ID starting with 8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7 not found: ID does not exist" containerID="8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.715787 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7"} err="failed to get container status \"8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7\": rpc error: code = NotFound desc = could not find container \"8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7\": container with ID starting with 
8091166a7604c7770d00088c604be07ff7c8c445eb3df721c3dfb714fd24daf7 not found: ID does not exist" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.716849 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.720136 4971 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.720279 4971 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a") on node "crc" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.745291 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20e6b72-7aa2-4468-b8e3-e13011048d27" path="/var/lib/kubelet/pods/a20e6b72-7aa2-4468-b8e3-e13011048d27/volumes" Mar 20 09:44:38 crc kubenswrapper[4971]: I0320 09:44:38.797046 4971 reconciler_common.go:293] "Volume detached for volume \"pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8277eb03-155c-41fa-8914-8f82b3ebad1a\") on node \"crc\" DevicePath \"\"" Mar 20 09:44:45 crc kubenswrapper[4971]: I0320 09:44:45.732284 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:44:45 crc kubenswrapper[4971]: E0320 09:44:45.733429 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:44:59 crc 
kubenswrapper[4971]: I0320 09:44:59.732862 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:44:59 crc kubenswrapper[4971]: E0320 09:44:59.734138 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.172277 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v"] Mar 20 09:45:00 crc kubenswrapper[4971]: E0320 09:45:00.173143 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb38638e-107a-4d81-9804-451c2402ea1d" containerName="oc" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.173167 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb38638e-107a-4d81-9804-451c2402ea1d" containerName="oc" Mar 20 09:45:00 crc kubenswrapper[4971]: E0320 09:45:00.173184 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12" containerName="adoption" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.173207 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12" containerName="adoption" Mar 20 09:45:00 crc kubenswrapper[4971]: E0320 09:45:00.173232 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20e6b72-7aa2-4468-b8e3-e13011048d27" containerName="adoption" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.173242 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20e6b72-7aa2-4468-b8e3-e13011048d27" containerName="adoption" Mar 20 09:45:00 crc 
kubenswrapper[4971]: I0320 09:45:00.173473 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a20e6b72-7aa2-4468-b8e3-e13011048d27" containerName="adoption" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.173499 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb38638e-107a-4d81-9804-451c2402ea1d" containerName="oc" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.173520 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fe2ad3-64b7-41e6-9785-7f3fefbc4b12" containerName="adoption" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.174450 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.177227 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.177480 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.187032 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v"] Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.350009 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a505c99-5204-41cd-8c77-b658d64abca6-secret-volume\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.350103 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljqrm\" 
(UniqueName: \"kubernetes.io/projected/6a505c99-5204-41cd-8c77-b658d64abca6-kube-api-access-ljqrm\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.350312 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a505c99-5204-41cd-8c77-b658d64abca6-config-volume\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.451976 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljqrm\" (UniqueName: \"kubernetes.io/projected/6a505c99-5204-41cd-8c77-b658d64abca6-kube-api-access-ljqrm\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.452143 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a505c99-5204-41cd-8c77-b658d64abca6-config-volume\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.452240 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a505c99-5204-41cd-8c77-b658d64abca6-secret-volume\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc 
kubenswrapper[4971]: I0320 09:45:00.453079 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a505c99-5204-41cd-8c77-b658d64abca6-config-volume\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.459709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a505c99-5204-41cd-8c77-b658d64abca6-secret-volume\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.471916 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljqrm\" (UniqueName: \"kubernetes.io/projected/6a505c99-5204-41cd-8c77-b658d64abca6-kube-api-access-ljqrm\") pod \"collect-profiles-29566665-fvf5v\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.511565 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:00 crc kubenswrapper[4971]: I0320 09:45:00.950926 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v"] Mar 20 09:45:00 crc kubenswrapper[4971]: W0320 09:45:00.952148 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a505c99_5204_41cd_8c77_b658d64abca6.slice/crio-60fe198ea9dc9b1b892f5cac3b243038b0f92321475fc77411c31700712684a4 WatchSource:0}: Error finding container 60fe198ea9dc9b1b892f5cac3b243038b0f92321475fc77411c31700712684a4: Status 404 returned error can't find the container with id 60fe198ea9dc9b1b892f5cac3b243038b0f92321475fc77411c31700712684a4 Mar 20 09:45:01 crc kubenswrapper[4971]: I0320 09:45:01.251591 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" event={"ID":"6a505c99-5204-41cd-8c77-b658d64abca6","Type":"ContainerStarted","Data":"fd9c3400f2ab2bc7cd4dd33f9cbfeb17375b1874b74f68b9e374104ef7de8616"} Mar 20 09:45:01 crc kubenswrapper[4971]: I0320 09:45:01.251958 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" event={"ID":"6a505c99-5204-41cd-8c77-b658d64abca6","Type":"ContainerStarted","Data":"60fe198ea9dc9b1b892f5cac3b243038b0f92321475fc77411c31700712684a4"} Mar 20 09:45:01 crc kubenswrapper[4971]: I0320 09:45:01.265411 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" podStartSLOduration=1.265387161 podStartE2EDuration="1.265387161s" podCreationTimestamp="2026-03-20 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
09:45:01.263332537 +0000 UTC m=+10523.243206665" watchObservedRunningTime="2026-03-20 09:45:01.265387161 +0000 UTC m=+10523.245261299" Mar 20 09:45:02 crc kubenswrapper[4971]: I0320 09:45:02.279403 4971 generic.go:334] "Generic (PLEG): container finished" podID="6a505c99-5204-41cd-8c77-b658d64abca6" containerID="fd9c3400f2ab2bc7cd4dd33f9cbfeb17375b1874b74f68b9e374104ef7de8616" exitCode=0 Mar 20 09:45:02 crc kubenswrapper[4971]: I0320 09:45:02.279544 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" event={"ID":"6a505c99-5204-41cd-8c77-b658d64abca6","Type":"ContainerDied","Data":"fd9c3400f2ab2bc7cd4dd33f9cbfeb17375b1874b74f68b9e374104ef7de8616"} Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.641420 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.817797 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a505c99-5204-41cd-8c77-b658d64abca6-secret-volume\") pod \"6a505c99-5204-41cd-8c77-b658d64abca6\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.817901 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljqrm\" (UniqueName: \"kubernetes.io/projected/6a505c99-5204-41cd-8c77-b658d64abca6-kube-api-access-ljqrm\") pod \"6a505c99-5204-41cd-8c77-b658d64abca6\" (UID: \"6a505c99-5204-41cd-8c77-b658d64abca6\") " Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.818367 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a505c99-5204-41cd-8c77-b658d64abca6-config-volume\") pod \"6a505c99-5204-41cd-8c77-b658d64abca6\" (UID: 
\"6a505c99-5204-41cd-8c77-b658d64abca6\") " Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.821905 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a505c99-5204-41cd-8c77-b658d64abca6-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a505c99-5204-41cd-8c77-b658d64abca6" (UID: "6a505c99-5204-41cd-8c77-b658d64abca6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.828767 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a505c99-5204-41cd-8c77-b658d64abca6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a505c99-5204-41cd-8c77-b658d64abca6" (UID: "6a505c99-5204-41cd-8c77-b658d64abca6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.828856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a505c99-5204-41cd-8c77-b658d64abca6-kube-api-access-ljqrm" (OuterVolumeSpecName: "kube-api-access-ljqrm") pod "6a505c99-5204-41cd-8c77-b658d64abca6" (UID: "6a505c99-5204-41cd-8c77-b658d64abca6"). InnerVolumeSpecName "kube-api-access-ljqrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.922069 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a505c99-5204-41cd-8c77-b658d64abca6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.922127 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a505c99-5204-41cd-8c77-b658d64abca6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:45:03 crc kubenswrapper[4971]: I0320 09:45:03.922139 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljqrm\" (UniqueName: \"kubernetes.io/projected/6a505c99-5204-41cd-8c77-b658d64abca6-kube-api-access-ljqrm\") on node \"crc\" DevicePath \"\"" Mar 20 09:45:04 crc kubenswrapper[4971]: I0320 09:45:04.307816 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" event={"ID":"6a505c99-5204-41cd-8c77-b658d64abca6","Type":"ContainerDied","Data":"60fe198ea9dc9b1b892f5cac3b243038b0f92321475fc77411c31700712684a4"} Mar 20 09:45:04 crc kubenswrapper[4971]: I0320 09:45:04.308268 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60fe198ea9dc9b1b892f5cac3b243038b0f92321475fc77411c31700712684a4" Mar 20 09:45:04 crc kubenswrapper[4971]: I0320 09:45:04.307868 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-fvf5v" Mar 20 09:45:04 crc kubenswrapper[4971]: I0320 09:45:04.356205 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l"] Mar 20 09:45:04 crc kubenswrapper[4971]: I0320 09:45:04.368067 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-dkp2l"] Mar 20 09:45:04 crc kubenswrapper[4971]: I0320 09:45:04.745940 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48" path="/var/lib/kubelet/pods/9f1bfb4f-58e7-4a18-86ee-ee822cd3fa48/volumes" Mar 20 09:45:13 crc kubenswrapper[4971]: I0320 09:45:13.732813 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:45:13 crc kubenswrapper[4971]: E0320 09:45:13.733799 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:45:15 crc kubenswrapper[4971]: I0320 09:45:15.690929 4971 scope.go:117] "RemoveContainer" containerID="efeda7f0dd06c042e0940b665b8dca14c55e437b58331b2f8b6860f19a6aa187" Mar 20 09:45:26 crc kubenswrapper[4971]: I0320 09:45:26.732955 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:45:26 crc kubenswrapper[4971]: E0320 09:45:26.733586 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:45:40 crc kubenswrapper[4971]: I0320 09:45:40.732987 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:45:40 crc kubenswrapper[4971]: E0320 09:45:40.734146 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.224959 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qb2q8"] Mar 20 09:45:45 crc kubenswrapper[4971]: E0320 09:45:45.226864 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a505c99-5204-41cd-8c77-b658d64abca6" containerName="collect-profiles" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.226879 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a505c99-5204-41cd-8c77-b658d64abca6" containerName="collect-profiles" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.227077 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a505c99-5204-41cd-8c77-b658d64abca6" containerName="collect-profiles" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.228428 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.249026 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qb2q8"] Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.288824 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-utilities\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.288922 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-catalog-content\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.289053 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfr7f\" (UniqueName: \"kubernetes.io/projected/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-kube-api-access-rfr7f\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.390777 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-catalog-content\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.390859 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rfr7f\" (UniqueName: \"kubernetes.io/projected/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-kube-api-access-rfr7f\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.390971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-utilities\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.391314 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-catalog-content\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.391460 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-utilities\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.708009 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfr7f\" (UniqueName: \"kubernetes.io/projected/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-kube-api-access-rfr7f\") pod \"community-operators-qb2q8\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:45 crc kubenswrapper[4971]: I0320 09:45:45.861896 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:46 crc kubenswrapper[4971]: I0320 09:45:46.315246 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qb2q8"] Mar 20 09:45:46 crc kubenswrapper[4971]: W0320 09:45:46.318256 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3faf5cf5_d1d7_41e8_9cef_2ddf3b9363e9.slice/crio-bd4587737578490af6abd53c36c5f5ce9cc96097df9721d7d92612f49a894bc5 WatchSource:0}: Error finding container bd4587737578490af6abd53c36c5f5ce9cc96097df9721d7d92612f49a894bc5: Status 404 returned error can't find the container with id bd4587737578490af6abd53c36c5f5ce9cc96097df9721d7d92612f49a894bc5 Mar 20 09:45:46 crc kubenswrapper[4971]: I0320 09:45:46.729361 4971 generic.go:334] "Generic (PLEG): container finished" podID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerID="63890e274c0e89894b37bd66a249387d69fef2312225c718f378d987c507dd95" exitCode=0 Mar 20 09:45:46 crc kubenswrapper[4971]: I0320 09:45:46.729476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb2q8" event={"ID":"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9","Type":"ContainerDied","Data":"63890e274c0e89894b37bd66a249387d69fef2312225c718f378d987c507dd95"} Mar 20 09:45:46 crc kubenswrapper[4971]: I0320 09:45:46.729989 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb2q8" event={"ID":"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9","Type":"ContainerStarted","Data":"bd4587737578490af6abd53c36c5f5ce9cc96097df9721d7d92612f49a894bc5"} Mar 20 09:45:48 crc kubenswrapper[4971]: I0320 09:45:48.749908 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb2q8" 
event={"ID":"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9","Type":"ContainerStarted","Data":"b7bac01a27f0cda8b4dbbd1206e3f5e12f05852d7b0e979ba1d9942319a0214b"} Mar 20 09:45:49 crc kubenswrapper[4971]: E0320 09:45:49.088782 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3faf5cf5_d1d7_41e8_9cef_2ddf3b9363e9.slice/crio-conmon-b7bac01a27f0cda8b4dbbd1206e3f5e12f05852d7b0e979ba1d9942319a0214b.scope\": RecentStats: unable to find data in memory cache]" Mar 20 09:45:49 crc kubenswrapper[4971]: I0320 09:45:49.764273 4971 generic.go:334] "Generic (PLEG): container finished" podID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerID="b7bac01a27f0cda8b4dbbd1206e3f5e12f05852d7b0e979ba1d9942319a0214b" exitCode=0 Mar 20 09:45:49 crc kubenswrapper[4971]: I0320 09:45:49.764335 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb2q8" event={"ID":"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9","Type":"ContainerDied","Data":"b7bac01a27f0cda8b4dbbd1206e3f5e12f05852d7b0e979ba1d9942319a0214b"} Mar 20 09:45:50 crc kubenswrapper[4971]: I0320 09:45:50.791599 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb2q8" event={"ID":"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9","Type":"ContainerStarted","Data":"48bef4ef730065f1b540153ea0f80a8d702cd37cca30aa6221213e05d9186e4a"} Mar 20 09:45:50 crc kubenswrapper[4971]: I0320 09:45:50.815510 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qb2q8" podStartSLOduration=2.34436386 podStartE2EDuration="5.815489456s" podCreationTimestamp="2026-03-20 09:45:45 +0000 UTC" firstStartedPulling="2026-03-20 09:45:46.731096102 +0000 UTC m=+10568.710970240" lastFinishedPulling="2026-03-20 09:45:50.202221698 +0000 UTC m=+10572.182095836" observedRunningTime="2026-03-20 09:45:50.812071886 
+0000 UTC m=+10572.791946044" watchObservedRunningTime="2026-03-20 09:45:50.815489456 +0000 UTC m=+10572.795363594" Mar 20 09:45:51 crc kubenswrapper[4971]: I0320 09:45:51.732112 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:45:51 crc kubenswrapper[4971]: E0320 09:45:51.732354 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:45:55 crc kubenswrapper[4971]: I0320 09:45:55.863290 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:55 crc kubenswrapper[4971]: I0320 09:45:55.863822 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:55 crc kubenswrapper[4971]: I0320 09:45:55.924857 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:45:56 crc kubenswrapper[4971]: I0320 09:45:56.964340 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.181672 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566666-k7qbz"] Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.183290 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566666-k7qbz" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.190977 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.191546 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.191209 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.208086 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566666-k7qbz"] Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.214295 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcj2b\" (UniqueName: \"kubernetes.io/projected/aa5b1e46-fa49-41f5-a19f-531729f9ac25-kube-api-access-tcj2b\") pod \"auto-csr-approver-29566666-k7qbz\" (UID: \"aa5b1e46-fa49-41f5-a19f-531729f9ac25\") " pod="openshift-infra/auto-csr-approver-29566666-k7qbz" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.323672 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcj2b\" (UniqueName: \"kubernetes.io/projected/aa5b1e46-fa49-41f5-a19f-531729f9ac25-kube-api-access-tcj2b\") pod \"auto-csr-approver-29566666-k7qbz\" (UID: \"aa5b1e46-fa49-41f5-a19f-531729f9ac25\") " pod="openshift-infra/auto-csr-approver-29566666-k7qbz" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.343150 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcj2b\" (UniqueName: \"kubernetes.io/projected/aa5b1e46-fa49-41f5-a19f-531729f9ac25-kube-api-access-tcj2b\") pod \"auto-csr-approver-29566666-k7qbz\" (UID: \"aa5b1e46-fa49-41f5-a19f-531729f9ac25\") " 
pod="openshift-infra/auto-csr-approver-29566666-k7qbz" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.434321 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qb2q8"] Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.434689 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qb2q8" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerName="registry-server" containerID="cri-o://48bef4ef730065f1b540153ea0f80a8d702cd37cca30aa6221213e05d9186e4a" gracePeriod=2 Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.511321 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566666-k7qbz" Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.905381 4971 generic.go:334] "Generic (PLEG): container finished" podID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerID="48bef4ef730065f1b540153ea0f80a8d702cd37cca30aa6221213e05d9186e4a" exitCode=0 Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.905450 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb2q8" event={"ID":"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9","Type":"ContainerDied","Data":"48bef4ef730065f1b540153ea0f80a8d702cd37cca30aa6221213e05d9186e4a"} Mar 20 09:46:00 crc kubenswrapper[4971]: I0320 09:46:00.966186 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566666-k7qbz"] Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.040236 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.141898 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfr7f\" (UniqueName: \"kubernetes.io/projected/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-kube-api-access-rfr7f\") pod \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.142220 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-catalog-content\") pod \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.142319 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-utilities\") pod \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\" (UID: \"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9\") " Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.142911 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-utilities" (OuterVolumeSpecName: "utilities") pod "3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" (UID: "3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.149010 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-kube-api-access-rfr7f" (OuterVolumeSpecName: "kube-api-access-rfr7f") pod "3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" (UID: "3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9"). InnerVolumeSpecName "kube-api-access-rfr7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.192287 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" (UID: "3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.244907 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.245250 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.245336 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfr7f\" (UniqueName: \"kubernetes.io/projected/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9-kube-api-access-rfr7f\") on node \"crc\" DevicePath \"\"" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.922386 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb2q8" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.922421 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb2q8" event={"ID":"3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9","Type":"ContainerDied","Data":"bd4587737578490af6abd53c36c5f5ce9cc96097df9721d7d92612f49a894bc5"} Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.922891 4971 scope.go:117] "RemoveContainer" containerID="48bef4ef730065f1b540153ea0f80a8d702cd37cca30aa6221213e05d9186e4a" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.928063 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566666-k7qbz" event={"ID":"aa5b1e46-fa49-41f5-a19f-531729f9ac25","Type":"ContainerStarted","Data":"338bfb4a9614def699a9adfa070d3627a92bfddd133ff39933e0c6874baeb092"} Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.965801 4971 scope.go:117] "RemoveContainer" containerID="b7bac01a27f0cda8b4dbbd1206e3f5e12f05852d7b0e979ba1d9942319a0214b" Mar 20 09:46:01 crc kubenswrapper[4971]: I0320 09:46:01.975797 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qb2q8"] Mar 20 09:46:02 crc kubenswrapper[4971]: I0320 09:46:02.007213 4971 scope.go:117] "RemoveContainer" containerID="63890e274c0e89894b37bd66a249387d69fef2312225c718f378d987c507dd95" Mar 20 09:46:02 crc kubenswrapper[4971]: I0320 09:46:02.009598 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qb2q8"] Mar 20 09:46:02 crc kubenswrapper[4971]: I0320 09:46:02.742929 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" path="/var/lib/kubelet/pods/3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9/volumes" Mar 20 09:46:02 crc kubenswrapper[4971]: I0320 09:46:02.945919 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="aa5b1e46-fa49-41f5-a19f-531729f9ac25" containerID="624bed76579217498a7500124693922fb7d79933066283dff9464952feca6d33" exitCode=0 Mar 20 09:46:02 crc kubenswrapper[4971]: I0320 09:46:02.945967 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566666-k7qbz" event={"ID":"aa5b1e46-fa49-41f5-a19f-531729f9ac25","Type":"ContainerDied","Data":"624bed76579217498a7500124693922fb7d79933066283dff9464952feca6d33"} Mar 20 09:46:04 crc kubenswrapper[4971]: I0320 09:46:04.340766 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566666-k7qbz" Mar 20 09:46:04 crc kubenswrapper[4971]: I0320 09:46:04.413376 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcj2b\" (UniqueName: \"kubernetes.io/projected/aa5b1e46-fa49-41f5-a19f-531729f9ac25-kube-api-access-tcj2b\") pod \"aa5b1e46-fa49-41f5-a19f-531729f9ac25\" (UID: \"aa5b1e46-fa49-41f5-a19f-531729f9ac25\") " Mar 20 09:46:04 crc kubenswrapper[4971]: I0320 09:46:04.418953 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5b1e46-fa49-41f5-a19f-531729f9ac25-kube-api-access-tcj2b" (OuterVolumeSpecName: "kube-api-access-tcj2b") pod "aa5b1e46-fa49-41f5-a19f-531729f9ac25" (UID: "aa5b1e46-fa49-41f5-a19f-531729f9ac25"). InnerVolumeSpecName "kube-api-access-tcj2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:46:04 crc kubenswrapper[4971]: I0320 09:46:04.516085 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcj2b\" (UniqueName: \"kubernetes.io/projected/aa5b1e46-fa49-41f5-a19f-531729f9ac25-kube-api-access-tcj2b\") on node \"crc\" DevicePath \"\"" Mar 20 09:46:04 crc kubenswrapper[4971]: I0320 09:46:04.984135 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566666-k7qbz" event={"ID":"aa5b1e46-fa49-41f5-a19f-531729f9ac25","Type":"ContainerDied","Data":"338bfb4a9614def699a9adfa070d3627a92bfddd133ff39933e0c6874baeb092"} Mar 20 09:46:04 crc kubenswrapper[4971]: I0320 09:46:04.984681 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="338bfb4a9614def699a9adfa070d3627a92bfddd133ff39933e0c6874baeb092" Mar 20 09:46:04 crc kubenswrapper[4971]: I0320 09:46:04.984787 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566666-k7qbz" Mar 20 09:46:05 crc kubenswrapper[4971]: I0320 09:46:05.442555 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566660-h97zl"] Mar 20 09:46:05 crc kubenswrapper[4971]: I0320 09:46:05.453075 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566660-h97zl"] Mar 20 09:46:06 crc kubenswrapper[4971]: I0320 09:46:06.733310 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:46:06 crc kubenswrapper[4971]: E0320 09:46:06.733678 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:46:06 crc kubenswrapper[4971]: I0320 09:46:06.744644 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6069cb-9293-4b44-92a9-f300b0343fff" path="/var/lib/kubelet/pods/8a6069cb-9293-4b44-92a9-f300b0343fff/volumes" Mar 20 09:46:15 crc kubenswrapper[4971]: I0320 09:46:15.833777 4971 scope.go:117] "RemoveContainer" containerID="d5758dff12c9d3ec1f99c9316590f65df7ebbc05dfb95685e7901ce5b54f3467" Mar 20 09:46:18 crc kubenswrapper[4971]: I0320 09:46:18.739412 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:46:18 crc kubenswrapper[4971]: E0320 09:46:18.741308 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:46:30 crc kubenswrapper[4971]: I0320 09:46:30.732159 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:46:30 crc kubenswrapper[4971]: E0320 09:46:30.733039 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:46:44 crc kubenswrapper[4971]: I0320 09:46:44.732258 4971 scope.go:117] "RemoveContainer" 
containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:46:44 crc kubenswrapper[4971]: E0320 09:46:44.732984 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.353375 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z956l"] Mar 20 09:46:45 crc kubenswrapper[4971]: E0320 09:46:45.354366 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerName="registry-server" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.354510 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerName="registry-server" Mar 20 09:46:45 crc kubenswrapper[4971]: E0320 09:46:45.354700 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerName="extract-content" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.354825 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerName="extract-content" Mar 20 09:46:45 crc kubenswrapper[4971]: E0320 09:46:45.354917 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerName="extract-utilities" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.355001 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerName="extract-utilities" Mar 20 09:46:45 crc kubenswrapper[4971]: E0320 09:46:45.355107 
4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5b1e46-fa49-41f5-a19f-531729f9ac25" containerName="oc" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.355190 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5b1e46-fa49-41f5-a19f-531729f9ac25" containerName="oc" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.355513 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3faf5cf5-d1d7-41e8-9cef-2ddf3b9363e9" containerName="registry-server" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.355679 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5b1e46-fa49-41f5-a19f-531729f9ac25" containerName="oc" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.357563 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.379542 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z956l"] Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.438763 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-catalog-content\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.438900 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-utilities\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.439156 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpkjg\" (UniqueName: \"kubernetes.io/projected/72ec649a-b70e-425b-8563-a78bab061e51-kube-api-access-vpkjg\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.540711 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpkjg\" (UniqueName: \"kubernetes.io/projected/72ec649a-b70e-425b-8563-a78bab061e51-kube-api-access-vpkjg\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.540799 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-catalog-content\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.540906 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-utilities\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.541439 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-catalog-content\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.541545 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-utilities\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.566435 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpkjg\" (UniqueName: \"kubernetes.io/projected/72ec649a-b70e-425b-8563-a78bab061e51-kube-api-access-vpkjg\") pod \"certified-operators-z956l\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:45 crc kubenswrapper[4971]: I0320 09:46:45.680579 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:46 crc kubenswrapper[4971]: I0320 09:46:46.901348 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z956l"] Mar 20 09:46:47 crc kubenswrapper[4971]: I0320 09:46:47.544925 4971 generic.go:334] "Generic (PLEG): container finished" podID="72ec649a-b70e-425b-8563-a78bab061e51" containerID="97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e" exitCode=0 Mar 20 09:46:47 crc kubenswrapper[4971]: I0320 09:46:47.544978 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z956l" event={"ID":"72ec649a-b70e-425b-8563-a78bab061e51","Type":"ContainerDied","Data":"97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e"} Mar 20 09:46:47 crc kubenswrapper[4971]: I0320 09:46:47.545215 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z956l" event={"ID":"72ec649a-b70e-425b-8563-a78bab061e51","Type":"ContainerStarted","Data":"e02bbaeda4ab458f2750ded735c05ff6f55801c73d8eaf1131b8e2e0a97450ad"} Mar 20 09:46:48 crc 
kubenswrapper[4971]: I0320 09:46:48.556670 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z956l" event={"ID":"72ec649a-b70e-425b-8563-a78bab061e51","Type":"ContainerStarted","Data":"2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513"} Mar 20 09:46:50 crc kubenswrapper[4971]: I0320 09:46:50.575563 4971 generic.go:334] "Generic (PLEG): container finished" podID="72ec649a-b70e-425b-8563-a78bab061e51" containerID="2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513" exitCode=0 Mar 20 09:46:50 crc kubenswrapper[4971]: I0320 09:46:50.575850 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z956l" event={"ID":"72ec649a-b70e-425b-8563-a78bab061e51","Type":"ContainerDied","Data":"2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513"} Mar 20 09:46:51 crc kubenswrapper[4971]: I0320 09:46:51.588166 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z956l" event={"ID":"72ec649a-b70e-425b-8563-a78bab061e51","Type":"ContainerStarted","Data":"ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78"} Mar 20 09:46:51 crc kubenswrapper[4971]: I0320 09:46:51.610709 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z956l" podStartSLOduration=3.072528258 podStartE2EDuration="6.610678305s" podCreationTimestamp="2026-03-20 09:46:45 +0000 UTC" firstStartedPulling="2026-03-20 09:46:47.548163015 +0000 UTC m=+10629.528037153" lastFinishedPulling="2026-03-20 09:46:51.086313062 +0000 UTC m=+10633.066187200" observedRunningTime="2026-03-20 09:46:51.607560703 +0000 UTC m=+10633.587434841" watchObservedRunningTime="2026-03-20 09:46:51.610678305 +0000 UTC m=+10633.590552463" Mar 20 09:46:55 crc kubenswrapper[4971]: I0320 09:46:55.681433 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:55 crc kubenswrapper[4971]: I0320 09:46:55.682087 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:55 crc kubenswrapper[4971]: I0320 09:46:55.731192 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:56 crc kubenswrapper[4971]: I0320 09:46:56.677744 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:56 crc kubenswrapper[4971]: I0320 09:46:56.725763 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z956l"] Mar 20 09:46:58 crc kubenswrapper[4971]: I0320 09:46:58.654195 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z956l" podUID="72ec649a-b70e-425b-8563-a78bab061e51" containerName="registry-server" containerID="cri-o://ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78" gracePeriod=2 Mar 20 09:46:58 crc kubenswrapper[4971]: I0320 09:46:58.738940 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:46:58 crc kubenswrapper[4971]: E0320 09:46:58.739227 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.128024 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.225763 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-catalog-content\") pod \"72ec649a-b70e-425b-8563-a78bab061e51\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.225951 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-utilities\") pod \"72ec649a-b70e-425b-8563-a78bab061e51\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.226179 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpkjg\" (UniqueName: \"kubernetes.io/projected/72ec649a-b70e-425b-8563-a78bab061e51-kube-api-access-vpkjg\") pod \"72ec649a-b70e-425b-8563-a78bab061e51\" (UID: \"72ec649a-b70e-425b-8563-a78bab061e51\") " Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.226861 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-utilities" (OuterVolumeSpecName: "utilities") pod "72ec649a-b70e-425b-8563-a78bab061e51" (UID: "72ec649a-b70e-425b-8563-a78bab061e51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.236227 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ec649a-b70e-425b-8563-a78bab061e51-kube-api-access-vpkjg" (OuterVolumeSpecName: "kube-api-access-vpkjg") pod "72ec649a-b70e-425b-8563-a78bab061e51" (UID: "72ec649a-b70e-425b-8563-a78bab061e51"). InnerVolumeSpecName "kube-api-access-vpkjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.285140 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72ec649a-b70e-425b-8563-a78bab061e51" (UID: "72ec649a-b70e-425b-8563-a78bab061e51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.329327 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpkjg\" (UniqueName: \"kubernetes.io/projected/72ec649a-b70e-425b-8563-a78bab061e51-kube-api-access-vpkjg\") on node \"crc\" DevicePath \"\"" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.329388 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.329403 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72ec649a-b70e-425b-8563-a78bab061e51-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.666395 4971 generic.go:334] "Generic (PLEG): container finished" podID="72ec649a-b70e-425b-8563-a78bab061e51" containerID="ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78" exitCode=0 Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.666427 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z956l" event={"ID":"72ec649a-b70e-425b-8563-a78bab061e51","Type":"ContainerDied","Data":"ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78"} Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.666466 4971 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z956l" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.666476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z956l" event={"ID":"72ec649a-b70e-425b-8563-a78bab061e51","Type":"ContainerDied","Data":"e02bbaeda4ab458f2750ded735c05ff6f55801c73d8eaf1131b8e2e0a97450ad"} Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.666495 4971 scope.go:117] "RemoveContainer" containerID="ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.701669 4971 scope.go:117] "RemoveContainer" containerID="2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.713134 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z956l"] Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.727531 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z956l"] Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.732522 4971 scope.go:117] "RemoveContainer" containerID="97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.768809 4971 scope.go:117] "RemoveContainer" containerID="ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78" Mar 20 09:46:59 crc kubenswrapper[4971]: E0320 09:46:59.769211 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78\": container with ID starting with ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78 not found: ID does not exist" containerID="ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.769259 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78"} err="failed to get container status \"ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78\": rpc error: code = NotFound desc = could not find container \"ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78\": container with ID starting with ebcd6dc7f5d5e4a8c8167cde714cb5ccc734afd00904376c5fca2cb3dd12ab78 not found: ID does not exist" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.769288 4971 scope.go:117] "RemoveContainer" containerID="2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513" Mar 20 09:46:59 crc kubenswrapper[4971]: E0320 09:46:59.769599 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513\": container with ID starting with 2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513 not found: ID does not exist" containerID="2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.769695 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513"} err="failed to get container status \"2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513\": rpc error: code = NotFound desc = could not find container \"2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513\": container with ID starting with 2dec44381b8998402459e9e2429cbfb2b9a94b225174b5cfe54ac55c8c035513 not found: ID does not exist" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.769717 4971 scope.go:117] "RemoveContainer" containerID="97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e" Mar 20 09:46:59 crc kubenswrapper[4971]: E0320 
09:46:59.770117 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e\": container with ID starting with 97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e not found: ID does not exist" containerID="97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e" Mar 20 09:46:59 crc kubenswrapper[4971]: I0320 09:46:59.770139 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e"} err="failed to get container status \"97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e\": rpc error: code = NotFound desc = could not find container \"97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e\": container with ID starting with 97abbaac77ebc9b14280c67bfefb42d8965f1696dfe350015ee5874d3a5d283e not found: ID does not exist" Mar 20 09:47:00 crc kubenswrapper[4971]: I0320 09:47:00.742476 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ec649a-b70e-425b-8563-a78bab061e51" path="/var/lib/kubelet/pods/72ec649a-b70e-425b-8563-a78bab061e51/volumes" Mar 20 09:47:13 crc kubenswrapper[4971]: I0320 09:47:13.732504 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:47:13 crc kubenswrapper[4971]: E0320 09:47:13.733493 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:47:26 crc kubenswrapper[4971]: I0320 09:47:26.733354 
4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:47:26 crc kubenswrapper[4971]: E0320 09:47:26.734071 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:47:37 crc kubenswrapper[4971]: I0320 09:47:37.732155 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:47:37 crc kubenswrapper[4971]: E0320 09:47:37.733214 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:47:49 crc kubenswrapper[4971]: I0320 09:47:49.732133 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:47:49 crc kubenswrapper[4971]: E0320 09:47:49.732818 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 
09:48:00.163574 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566668-jrfwq"] Mar 20 09:48:00 crc kubenswrapper[4971]: E0320 09:48:00.164784 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ec649a-b70e-425b-8563-a78bab061e51" containerName="extract-content" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.164804 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ec649a-b70e-425b-8563-a78bab061e51" containerName="extract-content" Mar 20 09:48:00 crc kubenswrapper[4971]: E0320 09:48:00.164829 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ec649a-b70e-425b-8563-a78bab061e51" containerName="extract-utilities" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.164840 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ec649a-b70e-425b-8563-a78bab061e51" containerName="extract-utilities" Mar 20 09:48:00 crc kubenswrapper[4971]: E0320 09:48:00.164868 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ec649a-b70e-425b-8563-a78bab061e51" containerName="registry-server" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.164877 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ec649a-b70e-425b-8563-a78bab061e51" containerName="registry-server" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.165177 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ec649a-b70e-425b-8563-a78bab061e51" containerName="registry-server" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.166220 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.169744 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.169997 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.170211 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.192102 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566668-jrfwq"] Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.201351 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spp4q\" (UniqueName: \"kubernetes.io/projected/808f215e-d651-47b4-8503-8b9837a272f7-kube-api-access-spp4q\") pod \"auto-csr-approver-29566668-jrfwq\" (UID: \"808f215e-d651-47b4-8503-8b9837a272f7\") " pod="openshift-infra/auto-csr-approver-29566668-jrfwq" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.303367 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spp4q\" (UniqueName: \"kubernetes.io/projected/808f215e-d651-47b4-8503-8b9837a272f7-kube-api-access-spp4q\") pod \"auto-csr-approver-29566668-jrfwq\" (UID: \"808f215e-d651-47b4-8503-8b9837a272f7\") " pod="openshift-infra/auto-csr-approver-29566668-jrfwq" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.322314 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spp4q\" (UniqueName: \"kubernetes.io/projected/808f215e-d651-47b4-8503-8b9837a272f7-kube-api-access-spp4q\") pod \"auto-csr-approver-29566668-jrfwq\" (UID: \"808f215e-d651-47b4-8503-8b9837a272f7\") " 
pod="openshift-infra/auto-csr-approver-29566668-jrfwq" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.495868 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.941405 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566668-jrfwq"] Mar 20 09:48:00 crc kubenswrapper[4971]: I0320 09:48:00.948064 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:48:01 crc kubenswrapper[4971]: I0320 09:48:01.349301 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" event={"ID":"808f215e-d651-47b4-8503-8b9837a272f7","Type":"ContainerStarted","Data":"b9b3c0b9d9e12eb8fddf904c5a91314a1b5aa9bf48a581e027723d5a6f2cf72d"} Mar 20 09:48:01 crc kubenswrapper[4971]: I0320 09:48:01.731649 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:48:01 crc kubenswrapper[4971]: E0320 09:48:01.732211 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:48:02 crc kubenswrapper[4971]: I0320 09:48:02.359134 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" event={"ID":"808f215e-d651-47b4-8503-8b9837a272f7","Type":"ContainerStarted","Data":"d1c4c2396b15a93a5e38d5cc1d3894aae38eb23beae11db304d1bf5e145ea361"} Mar 20 09:48:02 crc kubenswrapper[4971]: I0320 09:48:02.380189 4971 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" podStartSLOduration=1.37845598 podStartE2EDuration="2.380168463s" podCreationTimestamp="2026-03-20 09:48:00 +0000 UTC" firstStartedPulling="2026-03-20 09:48:00.947826779 +0000 UTC m=+10702.927700917" lastFinishedPulling="2026-03-20 09:48:01.949539242 +0000 UTC m=+10703.929413400" observedRunningTime="2026-03-20 09:48:02.376671621 +0000 UTC m=+10704.356545759" watchObservedRunningTime="2026-03-20 09:48:02.380168463 +0000 UTC m=+10704.360042591" Mar 20 09:48:03 crc kubenswrapper[4971]: I0320 09:48:03.369848 4971 generic.go:334] "Generic (PLEG): container finished" podID="808f215e-d651-47b4-8503-8b9837a272f7" containerID="d1c4c2396b15a93a5e38d5cc1d3894aae38eb23beae11db304d1bf5e145ea361" exitCode=0 Mar 20 09:48:03 crc kubenswrapper[4971]: I0320 09:48:03.369949 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" event={"ID":"808f215e-d651-47b4-8503-8b9837a272f7","Type":"ContainerDied","Data":"d1c4c2396b15a93a5e38d5cc1d3894aae38eb23beae11db304d1bf5e145ea361"} Mar 20 09:48:04 crc kubenswrapper[4971]: I0320 09:48:04.792277 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" Mar 20 09:48:04 crc kubenswrapper[4971]: I0320 09:48:04.805511 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spp4q\" (UniqueName: \"kubernetes.io/projected/808f215e-d651-47b4-8503-8b9837a272f7-kube-api-access-spp4q\") pod \"808f215e-d651-47b4-8503-8b9837a272f7\" (UID: \"808f215e-d651-47b4-8503-8b9837a272f7\") " Mar 20 09:48:04 crc kubenswrapper[4971]: I0320 09:48:04.813331 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808f215e-d651-47b4-8503-8b9837a272f7-kube-api-access-spp4q" (OuterVolumeSpecName: "kube-api-access-spp4q") pod "808f215e-d651-47b4-8503-8b9837a272f7" (UID: "808f215e-d651-47b4-8503-8b9837a272f7"). InnerVolumeSpecName "kube-api-access-spp4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:48:04 crc kubenswrapper[4971]: I0320 09:48:04.907764 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spp4q\" (UniqueName: \"kubernetes.io/projected/808f215e-d651-47b4-8503-8b9837a272f7-kube-api-access-spp4q\") on node \"crc\" DevicePath \"\"" Mar 20 09:48:05 crc kubenswrapper[4971]: I0320 09:48:05.391141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" event={"ID":"808f215e-d651-47b4-8503-8b9837a272f7","Type":"ContainerDied","Data":"b9b3c0b9d9e12eb8fddf904c5a91314a1b5aa9bf48a581e027723d5a6f2cf72d"} Mar 20 09:48:05 crc kubenswrapper[4971]: I0320 09:48:05.391178 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9b3c0b9d9e12eb8fddf904c5a91314a1b5aa9bf48a581e027723d5a6f2cf72d" Mar 20 09:48:05 crc kubenswrapper[4971]: I0320 09:48:05.391179 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566668-jrfwq" Mar 20 09:48:05 crc kubenswrapper[4971]: I0320 09:48:05.453956 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566662-v9rbc"] Mar 20 09:48:05 crc kubenswrapper[4971]: I0320 09:48:05.463718 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566662-v9rbc"] Mar 20 09:48:06 crc kubenswrapper[4971]: I0320 09:48:06.742186 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0be1fa-ddc6-47a4-b358-45671538ecbd" path="/var/lib/kubelet/pods/1c0be1fa-ddc6-47a4-b358-45671538ecbd/volumes" Mar 20 09:48:14 crc kubenswrapper[4971]: I0320 09:48:14.732388 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:48:14 crc kubenswrapper[4971]: E0320 09:48:14.734305 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:48:15 crc kubenswrapper[4971]: I0320 09:48:15.965851 4971 scope.go:117] "RemoveContainer" containerID="a1daf1d03cb40f26949804760911a619b828d397af9a24fd7dd2e9019fe377da" Mar 20 09:48:27 crc kubenswrapper[4971]: I0320 09:48:27.732487 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:48:27 crc kubenswrapper[4971]: E0320 09:48:27.733402 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:48:42 crc kubenswrapper[4971]: I0320 09:48:42.732682 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:48:42 crc kubenswrapper[4971]: E0320 09:48:42.733527 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:48:53 crc kubenswrapper[4971]: I0320 09:48:53.731924 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:48:54 crc kubenswrapper[4971]: I0320 09:48:54.909393 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"1af03ec5567291cde2ab43c1a4307b417d2c80ddda6c2e3cba497e31b86b2f92"} Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.561797 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mlkzm"] Mar 20 09:49:31 crc kubenswrapper[4971]: E0320 09:49:31.563842 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808f215e-d651-47b4-8503-8b9837a272f7" containerName="oc" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.563914 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="808f215e-d651-47b4-8503-8b9837a272f7" containerName="oc" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 
09:49:31.564130 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="808f215e-d651-47b4-8503-8b9837a272f7" containerName="oc" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.565639 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.588448 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mlkzm"] Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.609603 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnr48\" (UniqueName: \"kubernetes.io/projected/9c31abb8-b303-4e30-90db-d7e55c76795c-kube-api-access-fnr48\") pod \"redhat-operators-mlkzm\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.609670 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-catalog-content\") pod \"redhat-operators-mlkzm\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.609866 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-utilities\") pod \"redhat-operators-mlkzm\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.711751 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-utilities\") pod 
\"redhat-operators-mlkzm\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.711867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnr48\" (UniqueName: \"kubernetes.io/projected/9c31abb8-b303-4e30-90db-d7e55c76795c-kube-api-access-fnr48\") pod \"redhat-operators-mlkzm\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.711905 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-catalog-content\") pod \"redhat-operators-mlkzm\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.712212 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-utilities\") pod \"redhat-operators-mlkzm\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.712405 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-catalog-content\") pod \"redhat-operators-mlkzm\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.737595 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnr48\" (UniqueName: \"kubernetes.io/projected/9c31abb8-b303-4e30-90db-d7e55c76795c-kube-api-access-fnr48\") pod \"redhat-operators-mlkzm\" (UID: 
\"9c31abb8-b303-4e30-90db-d7e55c76795c\") " pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:31 crc kubenswrapper[4971]: I0320 09:49:31.889207 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlkzm" Mar 20 09:49:32 crc kubenswrapper[4971]: I0320 09:49:32.401836 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mlkzm"] Mar 20 09:49:33 crc kubenswrapper[4971]: I0320 09:49:33.351644 4971 generic.go:334] "Generic (PLEG): container finished" podID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerID="07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048" exitCode=0 Mar 20 09:49:33 crc kubenswrapper[4971]: I0320 09:49:33.351718 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlkzm" event={"ID":"9c31abb8-b303-4e30-90db-d7e55c76795c","Type":"ContainerDied","Data":"07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048"} Mar 20 09:49:33 crc kubenswrapper[4971]: I0320 09:49:33.351853 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlkzm" event={"ID":"9c31abb8-b303-4e30-90db-d7e55c76795c","Type":"ContainerStarted","Data":"f7efcbd1bc21f5585ad38018bb15f91dc12eddc515515e7a204998dadcc72fe2"} Mar 20 09:49:35 crc kubenswrapper[4971]: I0320 09:49:35.378552 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlkzm" event={"ID":"9c31abb8-b303-4e30-90db-d7e55c76795c","Type":"ContainerStarted","Data":"fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93"} Mar 20 09:49:37 crc kubenswrapper[4971]: I0320 09:49:37.407072 4971 generic.go:334] "Generic (PLEG): container finished" podID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerID="fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93" exitCode=0 Mar 20 09:49:37 crc kubenswrapper[4971]: I0320 09:49:37.407139 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlkzm" event={"ID":"9c31abb8-b303-4e30-90db-d7e55c76795c","Type":"ContainerDied","Data":"fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93"}
Mar 20 09:49:38 crc kubenswrapper[4971]: I0320 09:49:38.427592 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlkzm" event={"ID":"9c31abb8-b303-4e30-90db-d7e55c76795c","Type":"ContainerStarted","Data":"8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328"}
Mar 20 09:49:38 crc kubenswrapper[4971]: I0320 09:49:38.452461 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mlkzm" podStartSLOduration=2.949195767 podStartE2EDuration="7.452432673s" podCreationTimestamp="2026-03-20 09:49:31 +0000 UTC" firstStartedPulling="2026-03-20 09:49:33.353854529 +0000 UTC m=+10795.333728687" lastFinishedPulling="2026-03-20 09:49:37.857091455 +0000 UTC m=+10799.836965593" observedRunningTime="2026-03-20 09:49:38.449122106 +0000 UTC m=+10800.428996254" watchObservedRunningTime="2026-03-20 09:49:38.452432673 +0000 UTC m=+10800.432306811"
Mar 20 09:49:41 crc kubenswrapper[4971]: I0320 09:49:41.890415 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mlkzm"
Mar 20 09:49:41 crc kubenswrapper[4971]: I0320 09:49:41.890745 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mlkzm"
Mar 20 09:49:42 crc kubenswrapper[4971]: I0320 09:49:42.955034 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlkzm" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="registry-server" probeResult="failure" output=<
Mar 20 09:49:42 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 20 09:49:42 crc kubenswrapper[4971]: >
Mar 20 09:49:52 crc kubenswrapper[4971]: I0320 09:49:52.932396 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlkzm" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="registry-server" probeResult="failure" output=<
Mar 20 09:49:52 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 20 09:49:52 crc kubenswrapper[4971]: >
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.156182 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566670-5gsxj"]
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.158129 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566670-5gsxj"
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.160426 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.160879 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.162676 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.168061 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566670-5gsxj"]
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.228462 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4h5t\" (UniqueName: \"kubernetes.io/projected/c78e17d2-bb8b-41c2-819d-031cf7c46183-kube-api-access-m4h5t\") pod \"auto-csr-approver-29566670-5gsxj\" (UID: \"c78e17d2-bb8b-41c2-819d-031cf7c46183\") " pod="openshift-infra/auto-csr-approver-29566670-5gsxj"
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.331239 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4h5t\" (UniqueName: \"kubernetes.io/projected/c78e17d2-bb8b-41c2-819d-031cf7c46183-kube-api-access-m4h5t\") pod \"auto-csr-approver-29566670-5gsxj\" (UID: \"c78e17d2-bb8b-41c2-819d-031cf7c46183\") " pod="openshift-infra/auto-csr-approver-29566670-5gsxj"
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.351386 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4h5t\" (UniqueName: \"kubernetes.io/projected/c78e17d2-bb8b-41c2-819d-031cf7c46183-kube-api-access-m4h5t\") pod \"auto-csr-approver-29566670-5gsxj\" (UID: \"c78e17d2-bb8b-41c2-819d-031cf7c46183\") " pod="openshift-infra/auto-csr-approver-29566670-5gsxj"
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.487088 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566670-5gsxj"
Mar 20 09:50:00 crc kubenswrapper[4971]: I0320 09:50:00.927944 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566670-5gsxj"]
Mar 20 09:50:01 crc kubenswrapper[4971]: I0320 09:50:01.670894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566670-5gsxj" event={"ID":"c78e17d2-bb8b-41c2-819d-031cf7c46183","Type":"ContainerStarted","Data":"4b7543b88840d342842252b15f35cc0ce7c3a0412e472f377acf39a52624d48b"}
Mar 20 09:50:02 crc kubenswrapper[4971]: I0320 09:50:02.683977 4971 generic.go:334] "Generic (PLEG): container finished" podID="c78e17d2-bb8b-41c2-819d-031cf7c46183" containerID="ef7792d3dfab16671ef7a65785c948329a56bdb77c9dbe2bdb9e29845f22737a" exitCode=0
Mar 20 09:50:02 crc kubenswrapper[4971]: I0320 09:50:02.684051 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566670-5gsxj" event={"ID":"c78e17d2-bb8b-41c2-819d-031cf7c46183","Type":"ContainerDied","Data":"ef7792d3dfab16671ef7a65785c948329a56bdb77c9dbe2bdb9e29845f22737a"}
Mar 20 09:50:02 crc kubenswrapper[4971]: I0320 09:50:02.943284 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlkzm" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="registry-server" probeResult="failure" output=<
Mar 20 09:50:02 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 20 09:50:02 crc kubenswrapper[4971]: >
Mar 20 09:50:04 crc kubenswrapper[4971]: I0320 09:50:04.075508 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566670-5gsxj"
Mar 20 09:50:04 crc kubenswrapper[4971]: I0320 09:50:04.108474 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4h5t\" (UniqueName: \"kubernetes.io/projected/c78e17d2-bb8b-41c2-819d-031cf7c46183-kube-api-access-m4h5t\") pod \"c78e17d2-bb8b-41c2-819d-031cf7c46183\" (UID: \"c78e17d2-bb8b-41c2-819d-031cf7c46183\") "
Mar 20 09:50:04 crc kubenswrapper[4971]: I0320 09:50:04.113587 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c78e17d2-bb8b-41c2-819d-031cf7c46183-kube-api-access-m4h5t" (OuterVolumeSpecName: "kube-api-access-m4h5t") pod "c78e17d2-bb8b-41c2-819d-031cf7c46183" (UID: "c78e17d2-bb8b-41c2-819d-031cf7c46183"). InnerVolumeSpecName "kube-api-access-m4h5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:50:04 crc kubenswrapper[4971]: I0320 09:50:04.210734 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4h5t\" (UniqueName: \"kubernetes.io/projected/c78e17d2-bb8b-41c2-819d-031cf7c46183-kube-api-access-m4h5t\") on node \"crc\" DevicePath \"\""
Mar 20 09:50:04 crc kubenswrapper[4971]: I0320 09:50:04.708986 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566670-5gsxj" event={"ID":"c78e17d2-bb8b-41c2-819d-031cf7c46183","Type":"ContainerDied","Data":"4b7543b88840d342842252b15f35cc0ce7c3a0412e472f377acf39a52624d48b"}
Mar 20 09:50:04 crc kubenswrapper[4971]: I0320 09:50:04.709278 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7543b88840d342842252b15f35cc0ce7c3a0412e472f377acf39a52624d48b"
Mar 20 09:50:04 crc kubenswrapper[4971]: I0320 09:50:04.709051 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566670-5gsxj"
Mar 20 09:50:05 crc kubenswrapper[4971]: I0320 09:50:05.164676 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566664-hwmxc"]
Mar 20 09:50:05 crc kubenswrapper[4971]: I0320 09:50:05.177623 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566664-hwmxc"]
Mar 20 09:50:06 crc kubenswrapper[4971]: I0320 09:50:06.745593 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb38638e-107a-4d81-9804-451c2402ea1d" path="/var/lib/kubelet/pods/fb38638e-107a-4d81-9804-451c2402ea1d/volumes"
Mar 20 09:50:11 crc kubenswrapper[4971]: I0320 09:50:11.955690 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mlkzm"
Mar 20 09:50:12 crc kubenswrapper[4971]: I0320 09:50:12.010650 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mlkzm"
Mar 20 09:50:12 crc kubenswrapper[4971]: I0320 09:50:12.192977 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mlkzm"]
Mar 20 09:50:13 crc kubenswrapper[4971]: I0320 09:50:13.809861 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mlkzm" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="registry-server" containerID="cri-o://8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328" gracePeriod=2
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.324420 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlkzm"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.333823 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnr48\" (UniqueName: \"kubernetes.io/projected/9c31abb8-b303-4e30-90db-d7e55c76795c-kube-api-access-fnr48\") pod \"9c31abb8-b303-4e30-90db-d7e55c76795c\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") "
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.333970 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-catalog-content\") pod \"9c31abb8-b303-4e30-90db-d7e55c76795c\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") "
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.334148 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-utilities\") pod \"9c31abb8-b303-4e30-90db-d7e55c76795c\" (UID: \"9c31abb8-b303-4e30-90db-d7e55c76795c\") "
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.335313 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-utilities" (OuterVolumeSpecName: "utilities") pod "9c31abb8-b303-4e30-90db-d7e55c76795c" (UID: "9c31abb8-b303-4e30-90db-d7e55c76795c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.341043 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c31abb8-b303-4e30-90db-d7e55c76795c-kube-api-access-fnr48" (OuterVolumeSpecName: "kube-api-access-fnr48") pod "9c31abb8-b303-4e30-90db-d7e55c76795c" (UID: "9c31abb8-b303-4e30-90db-d7e55c76795c"). InnerVolumeSpecName "kube-api-access-fnr48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.436812 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnr48\" (UniqueName: \"kubernetes.io/projected/9c31abb8-b303-4e30-90db-d7e55c76795c-kube-api-access-fnr48\") on node \"crc\" DevicePath \"\""
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.437390 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.495909 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c31abb8-b303-4e30-90db-d7e55c76795c" (UID: "9c31abb8-b303-4e30-90db-d7e55c76795c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.538120 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31abb8-b303-4e30-90db-d7e55c76795c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.820073 4971 generic.go:334] "Generic (PLEG): container finished" podID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerID="8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328" exitCode=0
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.820112 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlkzm" event={"ID":"9c31abb8-b303-4e30-90db-d7e55c76795c","Type":"ContainerDied","Data":"8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328"}
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.820138 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlkzm" event={"ID":"9c31abb8-b303-4e30-90db-d7e55c76795c","Type":"ContainerDied","Data":"f7efcbd1bc21f5585ad38018bb15f91dc12eddc515515e7a204998dadcc72fe2"}
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.820155 4971 scope.go:117] "RemoveContainer" containerID="8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.820272 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlkzm"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.857768 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mlkzm"]
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.860747 4971 scope.go:117] "RemoveContainer" containerID="fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.870858 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mlkzm"]
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.897568 4971 scope.go:117] "RemoveContainer" containerID="07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.952892 4971 scope.go:117] "RemoveContainer" containerID="8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328"
Mar 20 09:50:14 crc kubenswrapper[4971]: E0320 09:50:14.953490 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328\": container with ID starting with 8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328 not found: ID does not exist" containerID="8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.953542 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328"} err="failed to get container status \"8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328\": rpc error: code = NotFound desc = could not find container \"8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328\": container with ID starting with 8b966a54d226e30734fe2b875cb148576d023f5818a257e110af7f9344e84328 not found: ID does not exist"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.953569 4971 scope.go:117] "RemoveContainer" containerID="fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93"
Mar 20 09:50:14 crc kubenswrapper[4971]: E0320 09:50:14.954038 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93\": container with ID starting with fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93 not found: ID does not exist" containerID="fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.954065 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93"} err="failed to get container status \"fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93\": rpc error: code = NotFound desc = could not find container \"fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93\": container with ID starting with fddefe24e5bab74f9361ce3d28b2b324a58931c00739655411d9e3e68746ef93 not found: ID does not exist"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.954080 4971 scope.go:117] "RemoveContainer" containerID="07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048"
Mar 20 09:50:14 crc kubenswrapper[4971]: E0320 09:50:14.954415 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048\": container with ID starting with 07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048 not found: ID does not exist" containerID="07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048"
Mar 20 09:50:14 crc kubenswrapper[4971]: I0320 09:50:14.954446 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048"} err="failed to get container status \"07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048\": rpc error: code = NotFound desc = could not find container \"07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048\": container with ID starting with 07a38520870e31501cd9db1bee0939d96d9f2a8b35f3180f79a0f3ac87485048 not found: ID does not exist"
Mar 20 09:50:16 crc kubenswrapper[4971]: I0320 09:50:16.091239 4971 scope.go:117] "RemoveContainer" containerID="3466e07d2dd74abd1975bbfe928575bff23ae1bc04bff2d739812a6c18ac6a43"
Mar 20 09:50:16 crc kubenswrapper[4971]: I0320 09:50:16.748426 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" path="/var/lib/kubelet/pods/9c31abb8-b303-4e30-90db-d7e55c76795c/volumes"
Mar 20 09:51:20 crc kubenswrapper[4971]: I0320 09:51:20.162393 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:51:20 crc kubenswrapper[4971]: I0320 09:51:20.162950 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.293586 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hswqh"]
Mar 20 09:51:40 crc kubenswrapper[4971]: E0320 09:51:40.294747 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="registry-server"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.294766 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="registry-server"
Mar 20 09:51:40 crc kubenswrapper[4971]: E0320 09:51:40.294804 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="extract-content"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.294812 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="extract-content"
Mar 20 09:51:40 crc kubenswrapper[4971]: E0320 09:51:40.295030 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78e17d2-bb8b-41c2-819d-031cf7c46183" containerName="oc"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.295038 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78e17d2-bb8b-41c2-819d-031cf7c46183" containerName="oc"
Mar 20 09:51:40 crc kubenswrapper[4971]: E0320 09:51:40.295062 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="extract-utilities"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.295068 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="extract-utilities"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.295277 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c31abb8-b303-4e30-90db-d7e55c76795c" containerName="registry-server"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.295295 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c78e17d2-bb8b-41c2-819d-031cf7c46183" containerName="oc"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.296870 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.313565 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hswqh"]
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.393143 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-catalog-content\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.393534 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-utilities\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.393666 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2dnx\" (UniqueName: \"kubernetes.io/projected/b7218a1c-adf8-4655-b46c-9f4082da540a-kube-api-access-r2dnx\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.496429 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-utilities\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.496506 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2dnx\" (UniqueName: \"kubernetes.io/projected/b7218a1c-adf8-4655-b46c-9f4082da540a-kube-api-access-r2dnx\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.496574 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-catalog-content\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.497170 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-utilities\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.497233 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-catalog-content\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.516953 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2dnx\" (UniqueName: \"kubernetes.io/projected/b7218a1c-adf8-4655-b46c-9f4082da540a-kube-api-access-r2dnx\") pod \"redhat-marketplace-hswqh\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") " pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:40 crc kubenswrapper[4971]: I0320 09:51:40.618686 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:41 crc kubenswrapper[4971]: I0320 09:51:41.123022 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hswqh"]
Mar 20 09:51:41 crc kubenswrapper[4971]: I0320 09:51:41.739937 4971 generic.go:334] "Generic (PLEG): container finished" podID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerID="b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e" exitCode=0
Mar 20 09:51:41 crc kubenswrapper[4971]: I0320 09:51:41.740003 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hswqh" event={"ID":"b7218a1c-adf8-4655-b46c-9f4082da540a","Type":"ContainerDied","Data":"b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e"}
Mar 20 09:51:41 crc kubenswrapper[4971]: I0320 09:51:41.740208 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hswqh" event={"ID":"b7218a1c-adf8-4655-b46c-9f4082da540a","Type":"ContainerStarted","Data":"a287d4cfcfe7898852f1d6df71a8dd94ac3fdc7856691f9557822d37c73b6a22"}
Mar 20 09:51:42 crc kubenswrapper[4971]: I0320 09:51:42.751438 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hswqh" event={"ID":"b7218a1c-adf8-4655-b46c-9f4082da540a","Type":"ContainerStarted","Data":"3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04"}
Mar 20 09:51:44 crc kubenswrapper[4971]: I0320 09:51:44.982991 4971 generic.go:334] "Generic (PLEG): container finished" podID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerID="3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04" exitCode=0
Mar 20 09:51:44 crc kubenswrapper[4971]: I0320 09:51:44.983122 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hswqh" event={"ID":"b7218a1c-adf8-4655-b46c-9f4082da540a","Type":"ContainerDied","Data":"3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04"}
Mar 20 09:51:45 crc kubenswrapper[4971]: I0320 09:51:45.995252 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hswqh" event={"ID":"b7218a1c-adf8-4655-b46c-9f4082da540a","Type":"ContainerStarted","Data":"7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409"}
Mar 20 09:51:46 crc kubenswrapper[4971]: I0320 09:51:46.019690 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hswqh" podStartSLOduration=2.397731486 podStartE2EDuration="6.019670474s" podCreationTimestamp="2026-03-20 09:51:40 +0000 UTC" firstStartedPulling="2026-03-20 09:51:41.741585881 +0000 UTC m=+10923.721460019" lastFinishedPulling="2026-03-20 09:51:45.363524869 +0000 UTC m=+10927.343399007" observedRunningTime="2026-03-20 09:51:46.017131907 +0000 UTC m=+10927.997006045" watchObservedRunningTime="2026-03-20 09:51:46.019670474 +0000 UTC m=+10927.999544612"
Mar 20 09:51:50 crc kubenswrapper[4971]: I0320 09:51:50.162182 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:51:50 crc kubenswrapper[4971]: I0320 09:51:50.162699 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:51:50 crc kubenswrapper[4971]: I0320 09:51:50.619213 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:50 crc kubenswrapper[4971]: I0320 09:51:50.619266 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:50 crc kubenswrapper[4971]: I0320 09:51:50.667552 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:51 crc kubenswrapper[4971]: I0320 09:51:51.082862 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:51 crc kubenswrapper[4971]: I0320 09:51:51.136016 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hswqh"]
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.058188 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hswqh" podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerName="registry-server" containerID="cri-o://7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409" gracePeriod=2
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.587140 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.728306 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-utilities\") pod \"b7218a1c-adf8-4655-b46c-9f4082da540a\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") "
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.728546 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2dnx\" (UniqueName: \"kubernetes.io/projected/b7218a1c-adf8-4655-b46c-9f4082da540a-kube-api-access-r2dnx\") pod \"b7218a1c-adf8-4655-b46c-9f4082da540a\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") "
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.728579 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-catalog-content\") pod \"b7218a1c-adf8-4655-b46c-9f4082da540a\" (UID: \"b7218a1c-adf8-4655-b46c-9f4082da540a\") "
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.729248 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-utilities" (OuterVolumeSpecName: "utilities") pod "b7218a1c-adf8-4655-b46c-9f4082da540a" (UID: "b7218a1c-adf8-4655-b46c-9f4082da540a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.735694 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7218a1c-adf8-4655-b46c-9f4082da540a-kube-api-access-r2dnx" (OuterVolumeSpecName: "kube-api-access-r2dnx") pod "b7218a1c-adf8-4655-b46c-9f4082da540a" (UID: "b7218a1c-adf8-4655-b46c-9f4082da540a"). InnerVolumeSpecName "kube-api-access-r2dnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.757660 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7218a1c-adf8-4655-b46c-9f4082da540a" (UID: "b7218a1c-adf8-4655-b46c-9f4082da540a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.831030 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2dnx\" (UniqueName: \"kubernetes.io/projected/b7218a1c-adf8-4655-b46c-9f4082da540a-kube-api-access-r2dnx\") on node \"crc\" DevicePath \"\""
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.831599 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:51:53 crc kubenswrapper[4971]: I0320 09:51:53.831645 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7218a1c-adf8-4655-b46c-9f4082da540a-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.070799 4971 generic.go:334] "Generic (PLEG): container finished" podID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerID="7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409" exitCode=0
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.070854 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hswqh" event={"ID":"b7218a1c-adf8-4655-b46c-9f4082da540a","Type":"ContainerDied","Data":"7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409"}
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.070915 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hswqh" event={"ID":"b7218a1c-adf8-4655-b46c-9f4082da540a","Type":"ContainerDied","Data":"a287d4cfcfe7898852f1d6df71a8dd94ac3fdc7856691f9557822d37c73b6a22"}
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.070917 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hswqh"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.070939 4971 scope.go:117] "RemoveContainer" containerID="7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.096199 4971 scope.go:117] "RemoveContainer" containerID="3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.121745 4971 scope.go:117] "RemoveContainer" containerID="b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.127940 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hswqh"]
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.142598 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hswqh"]
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.180103 4971 scope.go:117] "RemoveContainer" containerID="7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409"
Mar 20 09:51:54 crc kubenswrapper[4971]: E0320 09:51:54.180756 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409\": container with ID starting with 7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409 not found: ID does not exist" containerID="7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.180801 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409"} err="failed to get container status \"7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409\": rpc error: code = NotFound desc = could not find container \"7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409\": container with ID starting with 7168d524a655640ca087bb12e5249e8d3c7476a32876935975b0e5616015b409 not found: ID does not exist"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.180827 4971 scope.go:117] "RemoveContainer" containerID="3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04"
Mar 20 09:51:54 crc kubenswrapper[4971]: E0320 09:51:54.181195 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04\": container with ID starting with 3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04 not found: ID does not exist" containerID="3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.181224 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04"} err="failed to get container status \"3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04\": rpc error: code = NotFound desc = could not find container \"3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04\": container with ID starting with 3b18cdd18fabd6fde3873f261c3c796b5d8d70625435c2dd3aec769f3729cf04 not found: ID does not exist"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.181237 4971 scope.go:117] "RemoveContainer" containerID="b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e"
Mar 20 09:51:54 crc kubenswrapper[4971]: E0320 09:51:54.181600 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e\": container with ID starting with b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e not found: ID does not exist" containerID="b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.181697 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e"} err="failed to get container status \"b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e\": rpc error: code = NotFound desc = could not find container \"b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e\": container with ID starting with b5ff9052179509cb2f677ec51d709328d555267711263211673ee791aa3f758e not found: ID does not exist"
Mar 20 09:51:54 crc kubenswrapper[4971]: I0320 09:51:54.745000 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" path="/var/lib/kubelet/pods/b7218a1c-adf8-4655-b46c-9f4082da540a/volumes"
Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.158738 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566672-b9rd6"]
Mar 20 09:52:00 crc kubenswrapper[4971]: E0320 09:52:00.159950 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerName="registry-server"
Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.159968 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerName="registry-server"
Mar 20 09:52:00 crc kubenswrapper[4971]: E0320 09:52:00.159985 4971 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerName="extract-content" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.159991 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerName="extract-content" Mar 20 09:52:00 crc kubenswrapper[4971]: E0320 09:52:00.160025 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerName="extract-utilities" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.160033 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerName="extract-utilities" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.160279 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7218a1c-adf8-4655-b46c-9f4082da540a" containerName="registry-server" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.161123 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566672-b9rd6" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.163343 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.163857 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.164134 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.179215 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566672-b9rd6"] Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.260197 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msvpp\" (UniqueName: 
\"kubernetes.io/projected/5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd-kube-api-access-msvpp\") pod \"auto-csr-approver-29566672-b9rd6\" (UID: \"5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd\") " pod="openshift-infra/auto-csr-approver-29566672-b9rd6" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.362046 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msvpp\" (UniqueName: \"kubernetes.io/projected/5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd-kube-api-access-msvpp\") pod \"auto-csr-approver-29566672-b9rd6\" (UID: \"5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd\") " pod="openshift-infra/auto-csr-approver-29566672-b9rd6" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.387566 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msvpp\" (UniqueName: \"kubernetes.io/projected/5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd-kube-api-access-msvpp\") pod \"auto-csr-approver-29566672-b9rd6\" (UID: \"5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd\") " pod="openshift-infra/auto-csr-approver-29566672-b9rd6" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.482205 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566672-b9rd6" Mar 20 09:52:00 crc kubenswrapper[4971]: I0320 09:52:00.933326 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566672-b9rd6"] Mar 20 09:52:01 crc kubenswrapper[4971]: I0320 09:52:01.150498 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566672-b9rd6" event={"ID":"5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd","Type":"ContainerStarted","Data":"16c91f20e3214380c2b588740c8b02072ba900c9716e87d1da352fc631c105e5"} Mar 20 09:52:02 crc kubenswrapper[4971]: I0320 09:52:02.167071 4971 generic.go:334] "Generic (PLEG): container finished" podID="5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd" containerID="92394fd9a4ccb9d5935757e080403bceb1478d4c00910cc24700504e4d51e5c6" exitCode=0 Mar 20 09:52:02 crc kubenswrapper[4971]: I0320 09:52:02.167270 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566672-b9rd6" event={"ID":"5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd","Type":"ContainerDied","Data":"92394fd9a4ccb9d5935757e080403bceb1478d4c00910cc24700504e4d51e5c6"} Mar 20 09:52:03 crc kubenswrapper[4971]: I0320 09:52:03.564275 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566672-b9rd6" Mar 20 09:52:03 crc kubenswrapper[4971]: I0320 09:52:03.656920 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msvpp\" (UniqueName: \"kubernetes.io/projected/5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd-kube-api-access-msvpp\") pod \"5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd\" (UID: \"5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd\") " Mar 20 09:52:03 crc kubenswrapper[4971]: I0320 09:52:03.663880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd-kube-api-access-msvpp" (OuterVolumeSpecName: "kube-api-access-msvpp") pod "5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd" (UID: "5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd"). InnerVolumeSpecName "kube-api-access-msvpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:52:03 crc kubenswrapper[4971]: I0320 09:52:03.759235 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msvpp\" (UniqueName: \"kubernetes.io/projected/5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd-kube-api-access-msvpp\") on node \"crc\" DevicePath \"\"" Mar 20 09:52:04 crc kubenswrapper[4971]: I0320 09:52:04.186696 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566672-b9rd6" event={"ID":"5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd","Type":"ContainerDied","Data":"16c91f20e3214380c2b588740c8b02072ba900c9716e87d1da352fc631c105e5"} Mar 20 09:52:04 crc kubenswrapper[4971]: I0320 09:52:04.186953 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566672-b9rd6" Mar 20 09:52:04 crc kubenswrapper[4971]: I0320 09:52:04.186963 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16c91f20e3214380c2b588740c8b02072ba900c9716e87d1da352fc631c105e5" Mar 20 09:52:04 crc kubenswrapper[4971]: I0320 09:52:04.673215 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566666-k7qbz"] Mar 20 09:52:04 crc kubenswrapper[4971]: I0320 09:52:04.682706 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566666-k7qbz"] Mar 20 09:52:04 crc kubenswrapper[4971]: I0320 09:52:04.744459 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5b1e46-fa49-41f5-a19f-531729f9ac25" path="/var/lib/kubelet/pods/aa5b1e46-fa49-41f5-a19f-531729f9ac25/volumes" Mar 20 09:52:16 crc kubenswrapper[4971]: I0320 09:52:16.215780 4971 scope.go:117] "RemoveContainer" containerID="624bed76579217498a7500124693922fb7d79933066283dff9464952feca6d33" Mar 20 09:52:20 crc kubenswrapper[4971]: I0320 09:52:20.162442 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:52:20 crc kubenswrapper[4971]: I0320 09:52:20.163202 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:52:20 crc kubenswrapper[4971]: I0320 09:52:20.163268 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:52:20 crc kubenswrapper[4971]: I0320 09:52:20.164682 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1af03ec5567291cde2ab43c1a4307b417d2c80ddda6c2e3cba497e31b86b2f92"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:52:20 crc kubenswrapper[4971]: I0320 09:52:20.164853 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://1af03ec5567291cde2ab43c1a4307b417d2c80ddda6c2e3cba497e31b86b2f92" gracePeriod=600 Mar 20 09:52:20 crc kubenswrapper[4971]: I0320 09:52:20.344585 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="1af03ec5567291cde2ab43c1a4307b417d2c80ddda6c2e3cba497e31b86b2f92" exitCode=0 Mar 20 09:52:20 crc kubenswrapper[4971]: I0320 09:52:20.344642 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"1af03ec5567291cde2ab43c1a4307b417d2c80ddda6c2e3cba497e31b86b2f92"} Mar 20 09:52:20 crc kubenswrapper[4971]: I0320 09:52:20.344710 4971 scope.go:117] "RemoveContainer" containerID="f0175754c8799a9c4ddbe22551ae98fd93ca0fac507f7806f7f0341a573b4b38" Mar 20 09:52:21 crc kubenswrapper[4971]: I0320 09:52:21.360153 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1"} Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.158101 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566674-lkbcg"] Mar 20 09:54:00 crc kubenswrapper[4971]: E0320 09:54:00.159116 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd" containerName="oc" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.159132 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd" containerName="oc" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.159415 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd" containerName="oc" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.160260 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566674-lkbcg" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.163367 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.164151 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.164164 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.176853 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566674-lkbcg"] Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.277563 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txfwp\" (UniqueName: 
\"kubernetes.io/projected/00b99f96-7701-4b4b-a4a3-9a8b6ce83944-kube-api-access-txfwp\") pod \"auto-csr-approver-29566674-lkbcg\" (UID: \"00b99f96-7701-4b4b-a4a3-9a8b6ce83944\") " pod="openshift-infra/auto-csr-approver-29566674-lkbcg" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.379923 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txfwp\" (UniqueName: \"kubernetes.io/projected/00b99f96-7701-4b4b-a4a3-9a8b6ce83944-kube-api-access-txfwp\") pod \"auto-csr-approver-29566674-lkbcg\" (UID: \"00b99f96-7701-4b4b-a4a3-9a8b6ce83944\") " pod="openshift-infra/auto-csr-approver-29566674-lkbcg" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.409227 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txfwp\" (UniqueName: \"kubernetes.io/projected/00b99f96-7701-4b4b-a4a3-9a8b6ce83944-kube-api-access-txfwp\") pod \"auto-csr-approver-29566674-lkbcg\" (UID: \"00b99f96-7701-4b4b-a4a3-9a8b6ce83944\") " pod="openshift-infra/auto-csr-approver-29566674-lkbcg" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.493883 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566674-lkbcg" Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.961435 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566674-lkbcg"] Mar 20 09:54:00 crc kubenswrapper[4971]: I0320 09:54:00.971509 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:54:01 crc kubenswrapper[4971]: I0320 09:54:01.404714 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566674-lkbcg" event={"ID":"00b99f96-7701-4b4b-a4a3-9a8b6ce83944","Type":"ContainerStarted","Data":"769744db768a8b29ae121716cf6320af49d67b3b0e0db79610240290bcb60778"} Mar 20 09:54:02 crc kubenswrapper[4971]: I0320 09:54:02.418037 4971 generic.go:334] "Generic (PLEG): container finished" podID="00b99f96-7701-4b4b-a4a3-9a8b6ce83944" containerID="b86122b0b0b49047cfe150427b2ff9a68794d19859343cdfd156ed9405e6fce2" exitCode=0 Mar 20 09:54:02 crc kubenswrapper[4971]: I0320 09:54:02.418084 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566674-lkbcg" event={"ID":"00b99f96-7701-4b4b-a4a3-9a8b6ce83944","Type":"ContainerDied","Data":"b86122b0b0b49047cfe150427b2ff9a68794d19859343cdfd156ed9405e6fce2"} Mar 20 09:54:03 crc kubenswrapper[4971]: I0320 09:54:03.781356 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566674-lkbcg" Mar 20 09:54:03 crc kubenswrapper[4971]: I0320 09:54:03.849802 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txfwp\" (UniqueName: \"kubernetes.io/projected/00b99f96-7701-4b4b-a4a3-9a8b6ce83944-kube-api-access-txfwp\") pod \"00b99f96-7701-4b4b-a4a3-9a8b6ce83944\" (UID: \"00b99f96-7701-4b4b-a4a3-9a8b6ce83944\") " Mar 20 09:54:03 crc kubenswrapper[4971]: I0320 09:54:03.862719 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b99f96-7701-4b4b-a4a3-9a8b6ce83944-kube-api-access-txfwp" (OuterVolumeSpecName: "kube-api-access-txfwp") pod "00b99f96-7701-4b4b-a4a3-9a8b6ce83944" (UID: "00b99f96-7701-4b4b-a4a3-9a8b6ce83944"). InnerVolumeSpecName "kube-api-access-txfwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:54:03 crc kubenswrapper[4971]: I0320 09:54:03.952227 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txfwp\" (UniqueName: \"kubernetes.io/projected/00b99f96-7701-4b4b-a4a3-9a8b6ce83944-kube-api-access-txfwp\") on node \"crc\" DevicePath \"\"" Mar 20 09:54:04 crc kubenswrapper[4971]: I0320 09:54:04.439914 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566674-lkbcg" event={"ID":"00b99f96-7701-4b4b-a4a3-9a8b6ce83944","Type":"ContainerDied","Data":"769744db768a8b29ae121716cf6320af49d67b3b0e0db79610240290bcb60778"} Mar 20 09:54:04 crc kubenswrapper[4971]: I0320 09:54:04.439954 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="769744db768a8b29ae121716cf6320af49d67b3b0e0db79610240290bcb60778" Mar 20 09:54:04 crc kubenswrapper[4971]: I0320 09:54:04.440001 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566674-lkbcg" Mar 20 09:54:04 crc kubenswrapper[4971]: I0320 09:54:04.853958 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566668-jrfwq"] Mar 20 09:54:04 crc kubenswrapper[4971]: I0320 09:54:04.864250 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566668-jrfwq"] Mar 20 09:54:06 crc kubenswrapper[4971]: I0320 09:54:06.755082 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808f215e-d651-47b4-8503-8b9837a272f7" path="/var/lib/kubelet/pods/808f215e-d651-47b4-8503-8b9837a272f7/volumes" Mar 20 09:54:16 crc kubenswrapper[4971]: I0320 09:54:16.340171 4971 scope.go:117] "RemoveContainer" containerID="d1c4c2396b15a93a5e38d5cc1d3894aae38eb23beae11db304d1bf5e145ea361" Mar 20 09:54:20 crc kubenswrapper[4971]: I0320 09:54:20.162952 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:54:20 crc kubenswrapper[4971]: I0320 09:54:20.163595 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:54:50 crc kubenswrapper[4971]: I0320 09:54:50.162390 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:54:50 crc kubenswrapper[4971]: 
I0320 09:54:50.162963 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.656868 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 09:55:01 crc kubenswrapper[4971]: E0320 09:55:01.658915 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b99f96-7701-4b4b-a4a3-9a8b6ce83944" containerName="oc" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.659023 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b99f96-7701-4b4b-a4a3-9a8b6ce83944" containerName="oc" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.659370 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b99f96-7701-4b4b-a4a3-9a8b6ce83944" containerName="oc" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.660328 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.662578 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cjphw" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.662989 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.663051 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.665479 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.666973 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.846337 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.846853 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.846905 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.846936 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqnp\" (UniqueName: \"kubernetes.io/projected/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-kube-api-access-bkqnp\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.847011 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.847154 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.847186 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.847224 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.847249 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-config-data\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.948965 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949108 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949137 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949167 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949188 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-config-data\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949270 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949297 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949325 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949341 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqnp\" (UniqueName: \"kubernetes.io/projected/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-kube-api-access-bkqnp\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " 
pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949547 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.949585 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.950116 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.950356 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.951194 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-config-data\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc 
kubenswrapper[4971]: I0320 09:55:01.955973 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.956923 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.956973 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.967570 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqnp\" (UniqueName: \"kubernetes.io/projected/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-kube-api-access-bkqnp\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:01 crc kubenswrapper[4971]: I0320 09:55:01.984769 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " pod="openstack/tempest-tests-tempest" Mar 20 09:55:02 crc kubenswrapper[4971]: I0320 09:55:02.286037 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 09:55:02 crc kubenswrapper[4971]: I0320 09:55:02.747147 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 09:55:03 crc kubenswrapper[4971]: I0320 09:55:03.125614 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c","Type":"ContainerStarted","Data":"2983a733871bb4943308d258e91a4b529f9a98eac45b1d8e1dbdbca699f9a60a"} Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.162951 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.163650 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.163906 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.165100 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.165213 4971 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" gracePeriod=600 Mar 20 09:55:20 crc kubenswrapper[4971]: E0320 09:55:20.293302 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.332325 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" exitCode=0 Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.332386 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1"} Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.332478 4971 scope.go:117] "RemoveContainer" containerID="1af03ec5567291cde2ab43c1a4307b417d2c80ddda6c2e3cba497e31b86b2f92" Mar 20 09:55:20 crc kubenswrapper[4971]: I0320 09:55:20.333388 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:55:20 crc kubenswrapper[4971]: E0320 09:55:20.333857 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:55:35 crc kubenswrapper[4971]: I0320 09:55:35.732927 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:55:35 crc kubenswrapper[4971]: E0320 09:55:35.733848 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:55:49 crc kubenswrapper[4971]: I0320 09:55:49.732789 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:55:49 crc kubenswrapper[4971]: E0320 09:55:49.733496 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:55:55 crc kubenswrapper[4971]: E0320 09:55:55.137417 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 09:55:55 crc kubenswrapper[4971]: E0320 09:55:55.138157 4971 kuberuntime_image.go:55] 
"Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:39fc4cb70f516d8e9b48225bc0a253ef" Mar 20 09:55:55 crc kubenswrapper[4971]: E0320 09:55:55.138431 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:39fc4cb70f516d8e9b48225bc0a253ef,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,}
,VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bkqnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 09:55:55 crc kubenswrapper[4971]: E0320 09:55:55.139685 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" Mar 20 09:55:55 crc kubenswrapper[4971]: E0320 09:55:55.701406 4971 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:39fc4cb70f516d8e9b48225bc0a253ef\\\"\"" pod="openstack/tempest-tests-tempest" podUID="14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.158935 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566676-ntqhr"] Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.162303 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566676-ntqhr" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.163571 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmc2r\" (UniqueName: \"kubernetes.io/projected/5b5a5bb6-be43-4ca4-acea-3ba241727009-kube-api-access-vmc2r\") pod \"auto-csr-approver-29566676-ntqhr\" (UID: \"5b5a5bb6-be43-4ca4-acea-3ba241727009\") " pod="openshift-infra/auto-csr-approver-29566676-ntqhr" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.167077 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.167333 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.167494 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.187654 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566676-ntqhr"] Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.265407 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmc2r\" (UniqueName: \"kubernetes.io/projected/5b5a5bb6-be43-4ca4-acea-3ba241727009-kube-api-access-vmc2r\") pod \"auto-csr-approver-29566676-ntqhr\" (UID: \"5b5a5bb6-be43-4ca4-acea-3ba241727009\") " pod="openshift-infra/auto-csr-approver-29566676-ntqhr" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.284222 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmc2r\" (UniqueName: \"kubernetes.io/projected/5b5a5bb6-be43-4ca4-acea-3ba241727009-kube-api-access-vmc2r\") pod \"auto-csr-approver-29566676-ntqhr\" (UID: \"5b5a5bb6-be43-4ca4-acea-3ba241727009\") " pod="openshift-infra/auto-csr-approver-29566676-ntqhr" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.485782 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566676-ntqhr" Mar 20 09:56:00 crc kubenswrapper[4971]: I0320 09:56:00.968041 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566676-ntqhr"] Mar 20 09:56:00 crc kubenswrapper[4971]: W0320 09:56:00.971216 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b5a5bb6_be43_4ca4_acea_3ba241727009.slice/crio-30613e4b9d21ac176549dfe80589d45b71e0ee13ee7ceaa30a2fc2339c9ce211 WatchSource:0}: Error finding container 30613e4b9d21ac176549dfe80589d45b71e0ee13ee7ceaa30a2fc2339c9ce211: Status 404 returned error can't find the container with id 30613e4b9d21ac176549dfe80589d45b71e0ee13ee7ceaa30a2fc2339c9ce211 Mar 20 09:56:01 crc kubenswrapper[4971]: I0320 09:56:01.732236 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:56:01 crc kubenswrapper[4971]: E0320 09:56:01.732741 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:56:01 crc kubenswrapper[4971]: I0320 09:56:01.788703 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566676-ntqhr" event={"ID":"5b5a5bb6-be43-4ca4-acea-3ba241727009","Type":"ContainerStarted","Data":"30613e4b9d21ac176549dfe80589d45b71e0ee13ee7ceaa30a2fc2339c9ce211"} Mar 20 09:56:02 crc kubenswrapper[4971]: I0320 09:56:02.801490 4971 generic.go:334] "Generic (PLEG): container finished" podID="5b5a5bb6-be43-4ca4-acea-3ba241727009" containerID="08a09ff9ebb4fed45fd5e8c1a2335d8357ec3d06014859bdddbc95becd2d2211" exitCode=0 Mar 20 09:56:02 crc kubenswrapper[4971]: I0320 09:56:02.801536 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566676-ntqhr" event={"ID":"5b5a5bb6-be43-4ca4-acea-3ba241727009","Type":"ContainerDied","Data":"08a09ff9ebb4fed45fd5e8c1a2335d8357ec3d06014859bdddbc95becd2d2211"} Mar 20 09:56:04 crc kubenswrapper[4971]: I0320 09:56:04.250581 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566676-ntqhr" Mar 20 09:56:04 crc kubenswrapper[4971]: I0320 09:56:04.350850 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmc2r\" (UniqueName: \"kubernetes.io/projected/5b5a5bb6-be43-4ca4-acea-3ba241727009-kube-api-access-vmc2r\") pod \"5b5a5bb6-be43-4ca4-acea-3ba241727009\" (UID: \"5b5a5bb6-be43-4ca4-acea-3ba241727009\") " Mar 20 09:56:04 crc kubenswrapper[4971]: I0320 09:56:04.356956 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5a5bb6-be43-4ca4-acea-3ba241727009-kube-api-access-vmc2r" (OuterVolumeSpecName: "kube-api-access-vmc2r") pod "5b5a5bb6-be43-4ca4-acea-3ba241727009" (UID: "5b5a5bb6-be43-4ca4-acea-3ba241727009"). InnerVolumeSpecName "kube-api-access-vmc2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:56:04 crc kubenswrapper[4971]: I0320 09:56:04.453976 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmc2r\" (UniqueName: \"kubernetes.io/projected/5b5a5bb6-be43-4ca4-acea-3ba241727009-kube-api-access-vmc2r\") on node \"crc\" DevicePath \"\"" Mar 20 09:56:04 crc kubenswrapper[4971]: I0320 09:56:04.836570 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566676-ntqhr" event={"ID":"5b5a5bb6-be43-4ca4-acea-3ba241727009","Type":"ContainerDied","Data":"30613e4b9d21ac176549dfe80589d45b71e0ee13ee7ceaa30a2fc2339c9ce211"} Mar 20 09:56:04 crc kubenswrapper[4971]: I0320 09:56:04.836657 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30613e4b9d21ac176549dfe80589d45b71e0ee13ee7ceaa30a2fc2339c9ce211" Mar 20 09:56:04 crc kubenswrapper[4971]: I0320 09:56:04.836716 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566676-ntqhr" Mar 20 09:56:05 crc kubenswrapper[4971]: I0320 09:56:05.329254 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566670-5gsxj"] Mar 20 09:56:05 crc kubenswrapper[4971]: I0320 09:56:05.339012 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566670-5gsxj"] Mar 20 09:56:06 crc kubenswrapper[4971]: I0320 09:56:06.751083 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c78e17d2-bb8b-41c2-819d-031cf7c46183" path="/var/lib/kubelet/pods/c78e17d2-bb8b-41c2-819d-031cf7c46183/volumes" Mar 20 09:56:06 crc kubenswrapper[4971]: I0320 09:56:06.979738 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 09:56:08 crc kubenswrapper[4971]: I0320 09:56:08.877000 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c","Type":"ContainerStarted","Data":"d9aafa23a5084cab3c20da0ea2ed997b50274d56a7e701ac054c46df968a260d"} Mar 20 09:56:08 crc kubenswrapper[4971]: I0320 09:56:08.900384 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.675016988 podStartE2EDuration="1m8.900365099s" podCreationTimestamp="2026-03-20 09:55:00 +0000 UTC" firstStartedPulling="2026-03-20 09:55:02.750808005 +0000 UTC m=+11124.730682143" lastFinishedPulling="2026-03-20 09:56:06.976156076 +0000 UTC m=+11188.956030254" observedRunningTime="2026-03-20 09:56:08.89275891 +0000 UTC m=+11190.872633048" watchObservedRunningTime="2026-03-20 09:56:08.900365099 +0000 UTC m=+11190.880239237" Mar 20 09:56:12 crc kubenswrapper[4971]: I0320 09:56:12.732720 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:56:12 crc 
kubenswrapper[4971]: E0320 09:56:12.733574 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:56:16 crc kubenswrapper[4971]: I0320 09:56:16.474810 4971 scope.go:117] "RemoveContainer" containerID="ef7792d3dfab16671ef7a65785c948329a56bdb77c9dbe2bdb9e29845f22737a" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.473478 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9jr2k"] Mar 20 09:56:18 crc kubenswrapper[4971]: E0320 09:56:18.474241 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5a5bb6-be43-4ca4-acea-3ba241727009" containerName="oc" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.474255 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5a5bb6-be43-4ca4-acea-3ba241727009" containerName="oc" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.474635 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5a5bb6-be43-4ca4-acea-3ba241727009" containerName="oc" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.476519 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.493324 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9jr2k"] Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.643851 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd45s\" (UniqueName: \"kubernetes.io/projected/6220fde4-2977-4020-9cf5-0122cf43c634-kube-api-access-gd45s\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.643934 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6220fde4-2977-4020-9cf5-0122cf43c634-utilities\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.643954 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6220fde4-2977-4020-9cf5-0122cf43c634-catalog-content\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.745695 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6220fde4-2977-4020-9cf5-0122cf43c634-utilities\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.745749 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6220fde4-2977-4020-9cf5-0122cf43c634-catalog-content\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.745899 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd45s\" (UniqueName: \"kubernetes.io/projected/6220fde4-2977-4020-9cf5-0122cf43c634-kube-api-access-gd45s\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.746294 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6220fde4-2977-4020-9cf5-0122cf43c634-catalog-content\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.746565 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6220fde4-2977-4020-9cf5-0122cf43c634-utilities\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.770538 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd45s\" (UniqueName: \"kubernetes.io/projected/6220fde4-2977-4020-9cf5-0122cf43c634-kube-api-access-gd45s\") pod \"community-operators-9jr2k\" (UID: \"6220fde4-2977-4020-9cf5-0122cf43c634\") " pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:18 crc kubenswrapper[4971]: I0320 09:56:18.806672 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:19 crc kubenswrapper[4971]: I0320 09:56:19.314296 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9jr2k"] Mar 20 09:56:19 crc kubenswrapper[4971]: W0320 09:56:19.316279 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6220fde4_2977_4020_9cf5_0122cf43c634.slice/crio-1e0fac3dffec86e64f1899a9ad18679289ed31fa1f9f856baf908fbc958d9dc6 WatchSource:0}: Error finding container 1e0fac3dffec86e64f1899a9ad18679289ed31fa1f9f856baf908fbc958d9dc6: Status 404 returned error can't find the container with id 1e0fac3dffec86e64f1899a9ad18679289ed31fa1f9f856baf908fbc958d9dc6 Mar 20 09:56:20 crc kubenswrapper[4971]: I0320 09:56:20.018630 4971 generic.go:334] "Generic (PLEG): container finished" podID="6220fde4-2977-4020-9cf5-0122cf43c634" containerID="806dbd6e745c814257f5e96e255ce31fdfea5eca945dfa966d0f51be9f385550" exitCode=0 Mar 20 09:56:20 crc kubenswrapper[4971]: I0320 09:56:20.018722 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jr2k" event={"ID":"6220fde4-2977-4020-9cf5-0122cf43c634","Type":"ContainerDied","Data":"806dbd6e745c814257f5e96e255ce31fdfea5eca945dfa966d0f51be9f385550"} Mar 20 09:56:20 crc kubenswrapper[4971]: I0320 09:56:20.019084 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jr2k" event={"ID":"6220fde4-2977-4020-9cf5-0122cf43c634","Type":"ContainerStarted","Data":"1e0fac3dffec86e64f1899a9ad18679289ed31fa1f9f856baf908fbc958d9dc6"} Mar 20 09:56:24 crc kubenswrapper[4971]: I0320 09:56:24.732696 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:56:24 crc kubenswrapper[4971]: E0320 09:56:24.733477 4971 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:56:25 crc kubenswrapper[4971]: I0320 09:56:25.075106 4971 generic.go:334] "Generic (PLEG): container finished" podID="6220fde4-2977-4020-9cf5-0122cf43c634" containerID="86ec61ee58a0d178e0bed6b29818cbb655ac8e27fb2cfb801adf80d6259a1b08" exitCode=0 Mar 20 09:56:25 crc kubenswrapper[4971]: I0320 09:56:25.075153 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jr2k" event={"ID":"6220fde4-2977-4020-9cf5-0122cf43c634","Type":"ContainerDied","Data":"86ec61ee58a0d178e0bed6b29818cbb655ac8e27fb2cfb801adf80d6259a1b08"} Mar 20 09:56:26 crc kubenswrapper[4971]: I0320 09:56:26.087560 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jr2k" event={"ID":"6220fde4-2977-4020-9cf5-0122cf43c634","Type":"ContainerStarted","Data":"0b69e9087edf932e5e80ddc63b6e80bff3620e3bdb203473153dd757b5e1c612"} Mar 20 09:56:28 crc kubenswrapper[4971]: I0320 09:56:28.807560 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:28 crc kubenswrapper[4971]: I0320 09:56:28.810431 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:28 crc kubenswrapper[4971]: I0320 09:56:28.867195 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:28 crc kubenswrapper[4971]: I0320 09:56:28.887861 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-9jr2k" podStartSLOduration=5.33633374 podStartE2EDuration="10.887842212s" podCreationTimestamp="2026-03-20 09:56:18 +0000 UTC" firstStartedPulling="2026-03-20 09:56:20.022304691 +0000 UTC m=+11202.002178829" lastFinishedPulling="2026-03-20 09:56:25.573813153 +0000 UTC m=+11207.553687301" observedRunningTime="2026-03-20 09:56:26.116544629 +0000 UTC m=+11208.096418767" watchObservedRunningTime="2026-03-20 09:56:28.887842212 +0000 UTC m=+11210.867716350" Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.177671 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9jr2k" Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.246155 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9jr2k"] Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.355221 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xncqk"] Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.355463 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xncqk" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="registry-server" containerID="cri-o://70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861" gracePeriod=2 Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.893637 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xncqk" Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.991067 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-utilities\") pod \"d5e692dd-45d1-4384-b888-3958c4e970c9\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.991367 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np4xp\" (UniqueName: \"kubernetes.io/projected/d5e692dd-45d1-4384-b888-3958c4e970c9-kube-api-access-np4xp\") pod \"d5e692dd-45d1-4384-b888-3958c4e970c9\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.991425 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-catalog-content\") pod \"d5e692dd-45d1-4384-b888-3958c4e970c9\" (UID: \"d5e692dd-45d1-4384-b888-3958c4e970c9\") " Mar 20 09:56:30 crc kubenswrapper[4971]: I0320 09:56:30.992542 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-utilities" (OuterVolumeSpecName: "utilities") pod "d5e692dd-45d1-4384-b888-3958c4e970c9" (UID: "d5e692dd-45d1-4384-b888-3958c4e970c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.020691 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e692dd-45d1-4384-b888-3958c4e970c9-kube-api-access-np4xp" (OuterVolumeSpecName: "kube-api-access-np4xp") pod "d5e692dd-45d1-4384-b888-3958c4e970c9" (UID: "d5e692dd-45d1-4384-b888-3958c4e970c9"). InnerVolumeSpecName "kube-api-access-np4xp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.052297 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5e692dd-45d1-4384-b888-3958c4e970c9" (UID: "d5e692dd-45d1-4384-b888-3958c4e970c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.094392 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np4xp\" (UniqueName: \"kubernetes.io/projected/d5e692dd-45d1-4384-b888-3958c4e970c9-kube-api-access-np4xp\") on node \"crc\" DevicePath \"\"" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.094439 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.094452 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5e692dd-45d1-4384-b888-3958c4e970c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.143684 4971 generic.go:334] "Generic (PLEG): container finished" podID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerID="70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861" exitCode=0 Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.143792 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xncqk" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.143846 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xncqk" event={"ID":"d5e692dd-45d1-4384-b888-3958c4e970c9","Type":"ContainerDied","Data":"70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861"} Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.143885 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xncqk" event={"ID":"d5e692dd-45d1-4384-b888-3958c4e970c9","Type":"ContainerDied","Data":"13d600120a2b38064bbf85b14e2ec1b7a83214d2956e1a65cc02d07aa3119ec9"} Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.143906 4971 scope.go:117] "RemoveContainer" containerID="70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.178194 4971 scope.go:117] "RemoveContainer" containerID="6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.183935 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xncqk"] Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.205541 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xncqk"] Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.235121 4971 scope.go:117] "RemoveContainer" containerID="3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.273870 4971 scope.go:117] "RemoveContainer" containerID="70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861" Mar 20 09:56:31 crc kubenswrapper[4971]: E0320 09:56:31.275058 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861\": container with ID starting with 70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861 not found: ID does not exist" containerID="70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.275103 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861"} err="failed to get container status \"70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861\": rpc error: code = NotFound desc = could not find container \"70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861\": container with ID starting with 70e8f78b39fcadff69658f8398a368eff3a3f179fcb917cf582850b055ce7861 not found: ID does not exist" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.275129 4971 scope.go:117] "RemoveContainer" containerID="6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1" Mar 20 09:56:31 crc kubenswrapper[4971]: E0320 09:56:31.275484 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1\": container with ID starting with 6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1 not found: ID does not exist" containerID="6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.275507 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1"} err="failed to get container status \"6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1\": rpc error: code = NotFound desc = could not find container \"6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1\": container with ID 
starting with 6d3cbaa014998737015ce96fceb3e202475bb0f362a03e518590f90bf3b758a1 not found: ID does not exist" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.275524 4971 scope.go:117] "RemoveContainer" containerID="3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee" Mar 20 09:56:31 crc kubenswrapper[4971]: E0320 09:56:31.276306 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee\": container with ID starting with 3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee not found: ID does not exist" containerID="3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee" Mar 20 09:56:31 crc kubenswrapper[4971]: I0320 09:56:31.276333 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee"} err="failed to get container status \"3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee\": rpc error: code = NotFound desc = could not find container \"3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee\": container with ID starting with 3f968db899f46a401f9fcd8363226060bcb9a1088d65b6fa13116ba2858d68ee not found: ID does not exist" Mar 20 09:56:32 crc kubenswrapper[4971]: I0320 09:56:32.743574 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" path="/var/lib/kubelet/pods/d5e692dd-45d1-4384-b888-3958c4e970c9/volumes" Mar 20 09:56:35 crc kubenswrapper[4971]: I0320 09:56:35.734403 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:56:35 crc kubenswrapper[4971]: E0320 09:56:35.735328 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:56:47 crc kubenswrapper[4971]: I0320 09:56:47.732410 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:56:47 crc kubenswrapper[4971]: E0320 09:56:47.733211 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:56:58 crc kubenswrapper[4971]: I0320 09:56:58.743876 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:56:58 crc kubenswrapper[4971]: E0320 09:56:58.744945 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.568506 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fp6nd"] Mar 20 09:57:11 crc kubenswrapper[4971]: E0320 09:57:11.569755 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="extract-utilities" Mar 20 
09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.569770 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="extract-utilities" Mar 20 09:57:11 crc kubenswrapper[4971]: E0320 09:57:11.569782 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="extract-content" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.569788 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="extract-content" Mar 20 09:57:11 crc kubenswrapper[4971]: E0320 09:57:11.569806 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="registry-server" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.569812 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="registry-server" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.570004 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e692dd-45d1-4384-b888-3958c4e970c9" containerName="registry-server" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.572055 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.592664 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fp6nd"] Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.598441 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-utilities\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.598562 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv8gh\" (UniqueName: \"kubernetes.io/projected/10be0ed1-2543-46f9-a35e-863aa5311b11-kube-api-access-zv8gh\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.598704 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-catalog-content\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.700346 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-utilities\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.700447 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zv8gh\" (UniqueName: \"kubernetes.io/projected/10be0ed1-2543-46f9-a35e-863aa5311b11-kube-api-access-zv8gh\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.700510 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-catalog-content\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.700979 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-catalog-content\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.701572 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-utilities\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.722077 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv8gh\" (UniqueName: \"kubernetes.io/projected/10be0ed1-2543-46f9-a35e-863aa5311b11-kube-api-access-zv8gh\") pod \"certified-operators-fp6nd\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:11 crc kubenswrapper[4971]: I0320 09:57:11.891736 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:12 crc kubenswrapper[4971]: I0320 09:57:12.716953 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fp6nd"] Mar 20 09:57:12 crc kubenswrapper[4971]: W0320 09:57:12.722495 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10be0ed1_2543_46f9_a35e_863aa5311b11.slice/crio-dcca9f0b5fe982b525fcb82d4b78da8baf52e75528402bb797b72dbab834a68c WatchSource:0}: Error finding container dcca9f0b5fe982b525fcb82d4b78da8baf52e75528402bb797b72dbab834a68c: Status 404 returned error can't find the container with id dcca9f0b5fe982b525fcb82d4b78da8baf52e75528402bb797b72dbab834a68c Mar 20 09:57:12 crc kubenswrapper[4971]: I0320 09:57:12.886865 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp6nd" event={"ID":"10be0ed1-2543-46f9-a35e-863aa5311b11","Type":"ContainerStarted","Data":"dcca9f0b5fe982b525fcb82d4b78da8baf52e75528402bb797b72dbab834a68c"} Mar 20 09:57:13 crc kubenswrapper[4971]: I0320 09:57:13.732401 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:57:13 crc kubenswrapper[4971]: E0320 09:57:13.733028 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:57:13 crc kubenswrapper[4971]: I0320 09:57:13.896821 4971 generic.go:334] "Generic (PLEG): container finished" podID="10be0ed1-2543-46f9-a35e-863aa5311b11" 
containerID="ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c" exitCode=0 Mar 20 09:57:13 crc kubenswrapper[4971]: I0320 09:57:13.896876 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp6nd" event={"ID":"10be0ed1-2543-46f9-a35e-863aa5311b11","Type":"ContainerDied","Data":"ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c"} Mar 20 09:57:14 crc kubenswrapper[4971]: I0320 09:57:14.911326 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp6nd" event={"ID":"10be0ed1-2543-46f9-a35e-863aa5311b11","Type":"ContainerStarted","Data":"7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22"} Mar 20 09:57:16 crc kubenswrapper[4971]: I0320 09:57:16.941626 4971 generic.go:334] "Generic (PLEG): container finished" podID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerID="7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22" exitCode=0 Mar 20 09:57:16 crc kubenswrapper[4971]: I0320 09:57:16.941720 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp6nd" event={"ID":"10be0ed1-2543-46f9-a35e-863aa5311b11","Type":"ContainerDied","Data":"7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22"} Mar 20 09:57:17 crc kubenswrapper[4971]: I0320 09:57:17.953774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp6nd" event={"ID":"10be0ed1-2543-46f9-a35e-863aa5311b11","Type":"ContainerStarted","Data":"de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c"} Mar 20 09:57:17 crc kubenswrapper[4971]: I0320 09:57:17.973054 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fp6nd" podStartSLOduration=3.5024532329999998 podStartE2EDuration="6.973037035s" podCreationTimestamp="2026-03-20 09:57:11 +0000 UTC" firstStartedPulling="2026-03-20 
09:57:13.898814998 +0000 UTC m=+11255.878689136" lastFinishedPulling="2026-03-20 09:57:17.3693988 +0000 UTC m=+11259.349272938" observedRunningTime="2026-03-20 09:57:17.968783884 +0000 UTC m=+11259.948658032" watchObservedRunningTime="2026-03-20 09:57:17.973037035 +0000 UTC m=+11259.952911173" Mar 20 09:57:21 crc kubenswrapper[4971]: I0320 09:57:21.892932 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:21 crc kubenswrapper[4971]: I0320 09:57:21.893445 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:21 crc kubenswrapper[4971]: I0320 09:57:21.939712 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:28 crc kubenswrapper[4971]: I0320 09:57:28.739490 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:57:28 crc kubenswrapper[4971]: E0320 09:57:28.741765 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:57:31 crc kubenswrapper[4971]: I0320 09:57:31.955148 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.020828 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fp6nd"] Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.089713 4971 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-fp6nd" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerName="registry-server" containerID="cri-o://de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c" gracePeriod=2 Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.757754 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.823071 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv8gh\" (UniqueName: \"kubernetes.io/projected/10be0ed1-2543-46f9-a35e-863aa5311b11-kube-api-access-zv8gh\") pod \"10be0ed1-2543-46f9-a35e-863aa5311b11\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.823126 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-utilities\") pod \"10be0ed1-2543-46f9-a35e-863aa5311b11\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.823301 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-catalog-content\") pod \"10be0ed1-2543-46f9-a35e-863aa5311b11\" (UID: \"10be0ed1-2543-46f9-a35e-863aa5311b11\") " Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.824538 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-utilities" (OuterVolumeSpecName: "utilities") pod "10be0ed1-2543-46f9-a35e-863aa5311b11" (UID: "10be0ed1-2543-46f9-a35e-863aa5311b11"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.829802 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10be0ed1-2543-46f9-a35e-863aa5311b11-kube-api-access-zv8gh" (OuterVolumeSpecName: "kube-api-access-zv8gh") pod "10be0ed1-2543-46f9-a35e-863aa5311b11" (UID: "10be0ed1-2543-46f9-a35e-863aa5311b11"). InnerVolumeSpecName "kube-api-access-zv8gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.899399 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10be0ed1-2543-46f9-a35e-863aa5311b11" (UID: "10be0ed1-2543-46f9-a35e-863aa5311b11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.925311 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.925349 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv8gh\" (UniqueName: \"kubernetes.io/projected/10be0ed1-2543-46f9-a35e-863aa5311b11-kube-api-access-zv8gh\") on node \"crc\" DevicePath \"\"" Mar 20 09:57:32 crc kubenswrapper[4971]: I0320 09:57:32.925362 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10be0ed1-2543-46f9-a35e-863aa5311b11-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.099248 4971 generic.go:334] "Generic (PLEG): container finished" podID="10be0ed1-2543-46f9-a35e-863aa5311b11" 
containerID="de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c" exitCode=0 Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.099298 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp6nd" event={"ID":"10be0ed1-2543-46f9-a35e-863aa5311b11","Type":"ContainerDied","Data":"de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c"} Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.099318 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fp6nd" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.099336 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp6nd" event={"ID":"10be0ed1-2543-46f9-a35e-863aa5311b11","Type":"ContainerDied","Data":"dcca9f0b5fe982b525fcb82d4b78da8baf52e75528402bb797b72dbab834a68c"} Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.099358 4971 scope.go:117] "RemoveContainer" containerID="de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.121484 4971 scope.go:117] "RemoveContainer" containerID="7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.133458 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fp6nd"] Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.144999 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fp6nd"] Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.153132 4971 scope.go:117] "RemoveContainer" containerID="ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.194089 4971 scope.go:117] "RemoveContainer" containerID="de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c" Mar 20 
09:57:33 crc kubenswrapper[4971]: E0320 09:57:33.194626 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c\": container with ID starting with de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c not found: ID does not exist" containerID="de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.194674 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c"} err="failed to get container status \"de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c\": rpc error: code = NotFound desc = could not find container \"de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c\": container with ID starting with de6998c882544e49997ca6e7303ada394eb2a5591bed008171ebd89c9a8ee06c not found: ID does not exist" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.194703 4971 scope.go:117] "RemoveContainer" containerID="7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22" Mar 20 09:57:33 crc kubenswrapper[4971]: E0320 09:57:33.195112 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22\": container with ID starting with 7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22 not found: ID does not exist" containerID="7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.195188 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22"} err="failed to get container status 
\"7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22\": rpc error: code = NotFound desc = could not find container \"7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22\": container with ID starting with 7b7bf5f1c5f2e5f62e603b2c9189a9ee9cc34ad78b139a2b262bd5ee58c7ae22 not found: ID does not exist" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.195224 4971 scope.go:117] "RemoveContainer" containerID="ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c" Mar 20 09:57:33 crc kubenswrapper[4971]: E0320 09:57:33.195649 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c\": container with ID starting with ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c not found: ID does not exist" containerID="ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c" Mar 20 09:57:33 crc kubenswrapper[4971]: I0320 09:57:33.195695 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c"} err="failed to get container status \"ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c\": rpc error: code = NotFound desc = could not find container \"ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c\": container with ID starting with ec13de3f602f3a0b987a01176f26057fa4832bc597fcdcafc7447bfdfc6cf64c not found: ID does not exist" Mar 20 09:57:34 crc kubenswrapper[4971]: I0320 09:57:34.746299 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" path="/var/lib/kubelet/pods/10be0ed1-2543-46f9-a35e-863aa5311b11/volumes" Mar 20 09:57:40 crc kubenswrapper[4971]: I0320 09:57:40.732671 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 
09:57:40 crc kubenswrapper[4971]: E0320 09:57:40.733557 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:57:51 crc kubenswrapper[4971]: I0320 09:57:51.733468 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:57:51 crc kubenswrapper[4971]: E0320 09:57:51.734164 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.142137 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566678-f55t4"] Mar 20 09:58:00 crc kubenswrapper[4971]: E0320 09:58:00.143163 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerName="registry-server" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.143176 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerName="registry-server" Mar 20 09:58:00 crc kubenswrapper[4971]: E0320 09:58:00.143206 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerName="extract-utilities" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.143213 
4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerName="extract-utilities" Mar 20 09:58:00 crc kubenswrapper[4971]: E0320 09:58:00.143228 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerName="extract-content" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.143234 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerName="extract-content" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.143422 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="10be0ed1-2543-46f9-a35e-863aa5311b11" containerName="registry-server" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.144288 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566678-f55t4" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.148419 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.149210 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.151774 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.159143 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566678-f55t4"] Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.301705 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwx4g\" (UniqueName: \"kubernetes.io/projected/b64f2bb6-88df-46aa-a738-61710d224963-kube-api-access-rwx4g\") pod \"auto-csr-approver-29566678-f55t4\" (UID: 
\"b64f2bb6-88df-46aa-a738-61710d224963\") " pod="openshift-infra/auto-csr-approver-29566678-f55t4" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.403834 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwx4g\" (UniqueName: \"kubernetes.io/projected/b64f2bb6-88df-46aa-a738-61710d224963-kube-api-access-rwx4g\") pod \"auto-csr-approver-29566678-f55t4\" (UID: \"b64f2bb6-88df-46aa-a738-61710d224963\") " pod="openshift-infra/auto-csr-approver-29566678-f55t4" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.442367 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwx4g\" (UniqueName: \"kubernetes.io/projected/b64f2bb6-88df-46aa-a738-61710d224963-kube-api-access-rwx4g\") pod \"auto-csr-approver-29566678-f55t4\" (UID: \"b64f2bb6-88df-46aa-a738-61710d224963\") " pod="openshift-infra/auto-csr-approver-29566678-f55t4" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.475692 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566678-f55t4" Mar 20 09:58:00 crc kubenswrapper[4971]: I0320 09:58:00.982848 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566678-f55t4"] Mar 20 09:58:00 crc kubenswrapper[4971]: W0320 09:58:00.993750 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb64f2bb6_88df_46aa_a738_61710d224963.slice/crio-a0206ce7f386c046dbb6c2e0ad23e90e8265a487655d20418bde7558c7aac8d0 WatchSource:0}: Error finding container a0206ce7f386c046dbb6c2e0ad23e90e8265a487655d20418bde7558c7aac8d0: Status 404 returned error can't find the container with id a0206ce7f386c046dbb6c2e0ad23e90e8265a487655d20418bde7558c7aac8d0 Mar 20 09:58:01 crc kubenswrapper[4971]: I0320 09:58:01.374500 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566678-f55t4" event={"ID":"b64f2bb6-88df-46aa-a738-61710d224963","Type":"ContainerStarted","Data":"a0206ce7f386c046dbb6c2e0ad23e90e8265a487655d20418bde7558c7aac8d0"} Mar 20 09:58:02 crc kubenswrapper[4971]: I0320 09:58:02.384704 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566678-f55t4" event={"ID":"b64f2bb6-88df-46aa-a738-61710d224963","Type":"ContainerStarted","Data":"e663b89d2fd76ba1371d9f60aedbd78fdeb0d213b92f0167319c8c6d439a7270"} Mar 20 09:58:02 crc kubenswrapper[4971]: I0320 09:58:02.404104 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566678-f55t4" podStartSLOduration=1.448536587 podStartE2EDuration="2.404080117s" podCreationTimestamp="2026-03-20 09:58:00 +0000 UTC" firstStartedPulling="2026-03-20 09:58:01.000258022 +0000 UTC m=+11302.980132180" lastFinishedPulling="2026-03-20 09:58:01.955801582 +0000 UTC m=+11303.935675710" observedRunningTime="2026-03-20 09:58:02.397982027 +0000 UTC m=+11304.377856185" 
watchObservedRunningTime="2026-03-20 09:58:02.404080117 +0000 UTC m=+11304.383954285" Mar 20 09:58:04 crc kubenswrapper[4971]: I0320 09:58:04.411667 4971 generic.go:334] "Generic (PLEG): container finished" podID="b64f2bb6-88df-46aa-a738-61710d224963" containerID="e663b89d2fd76ba1371d9f60aedbd78fdeb0d213b92f0167319c8c6d439a7270" exitCode=0 Mar 20 09:58:04 crc kubenswrapper[4971]: I0320 09:58:04.411775 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566678-f55t4" event={"ID":"b64f2bb6-88df-46aa-a738-61710d224963","Type":"ContainerDied","Data":"e663b89d2fd76ba1371d9f60aedbd78fdeb0d213b92f0167319c8c6d439a7270"} Mar 20 09:58:05 crc kubenswrapper[4971]: I0320 09:58:05.962297 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566678-f55t4" Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.009259 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwx4g\" (UniqueName: \"kubernetes.io/projected/b64f2bb6-88df-46aa-a738-61710d224963-kube-api-access-rwx4g\") pod \"b64f2bb6-88df-46aa-a738-61710d224963\" (UID: \"b64f2bb6-88df-46aa-a738-61710d224963\") " Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.014568 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b64f2bb6-88df-46aa-a738-61710d224963-kube-api-access-rwx4g" (OuterVolumeSpecName: "kube-api-access-rwx4g") pod "b64f2bb6-88df-46aa-a738-61710d224963" (UID: "b64f2bb6-88df-46aa-a738-61710d224963"). InnerVolumeSpecName "kube-api-access-rwx4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.111933 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwx4g\" (UniqueName: \"kubernetes.io/projected/b64f2bb6-88df-46aa-a738-61710d224963-kube-api-access-rwx4g\") on node \"crc\" DevicePath \"\"" Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.463941 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566678-f55t4" event={"ID":"b64f2bb6-88df-46aa-a738-61710d224963","Type":"ContainerDied","Data":"a0206ce7f386c046dbb6c2e0ad23e90e8265a487655d20418bde7558c7aac8d0"} Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.463981 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0206ce7f386c046dbb6c2e0ad23e90e8265a487655d20418bde7558c7aac8d0" Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.464058 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566678-f55t4" Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.517594 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566672-b9rd6"] Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.531281 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566672-b9rd6"] Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.732361 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:58:06 crc kubenswrapper[4971]: E0320 09:58:06.732698 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:58:06 crc kubenswrapper[4971]: I0320 09:58:06.745686 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd" path="/var/lib/kubelet/pods/5efb2ef3-9c7e-4574-ab12-bdd3bd1b2bbd/volumes" Mar 20 09:58:16 crc kubenswrapper[4971]: I0320 09:58:16.611795 4971 scope.go:117] "RemoveContainer" containerID="92394fd9a4ccb9d5935757e080403bceb1478d4c00910cc24700504e4d51e5c6" Mar 20 09:58:18 crc kubenswrapper[4971]: I0320 09:58:18.746271 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:58:18 crc kubenswrapper[4971]: E0320 09:58:18.746942 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:58:29 crc kubenswrapper[4971]: I0320 09:58:29.732704 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:58:29 crc kubenswrapper[4971]: E0320 09:58:29.733562 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:58:43 crc kubenswrapper[4971]: I0320 09:58:43.732887 4971 scope.go:117] "RemoveContainer" 
containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:58:43 crc kubenswrapper[4971]: E0320 09:58:43.733684 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:58:55 crc kubenswrapper[4971]: I0320 09:58:55.733624 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:58:55 crc kubenswrapper[4971]: E0320 09:58:55.734310 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:59:08 crc kubenswrapper[4971]: I0320 09:59:08.739344 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:59:08 crc kubenswrapper[4971]: E0320 09:59:08.740165 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:59:23 crc kubenswrapper[4971]: I0320 09:59:23.733262 4971 scope.go:117] 
"RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:59:23 crc kubenswrapper[4971]: E0320 09:59:23.733989 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:59:37 crc kubenswrapper[4971]: I0320 09:59:37.732472 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:59:37 crc kubenswrapper[4971]: E0320 09:59:37.733085 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:59:48 crc kubenswrapper[4971]: I0320 09:59:48.739311 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:59:48 crc kubenswrapper[4971]: E0320 09:59:48.740101 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 09:59:59 crc kubenswrapper[4971]: I0320 09:59:59.732818 
4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 09:59:59 crc kubenswrapper[4971]: E0320 09:59:59.733672 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.153765 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566680-5czk4"] Mar 20 10:00:00 crc kubenswrapper[4971]: E0320 10:00:00.154256 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b64f2bb6-88df-46aa-a738-61710d224963" containerName="oc" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.154277 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b64f2bb6-88df-46aa-a738-61710d224963" containerName="oc" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.154512 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b64f2bb6-88df-46aa-a738-61710d224963" containerName="oc" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.155674 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566680-5czk4" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.157849 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.157932 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.158101 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.172810 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg"] Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.174367 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.176528 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.176960 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.186873 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566680-5czk4"] Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.201633 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg"] Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.257696 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/215d3127-6b76-4663-bcaa-05d4b8bd3a72-secret-volume\") pod \"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.257761 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/7d7d37b4-c7c5-479d-8cf6-cf504109913b-kube-api-access-2hcd4\") pod \"auto-csr-approver-29566680-5czk4\" (UID: \"7d7d37b4-c7c5-479d-8cf6-cf504109913b\") " pod="openshift-infra/auto-csr-approver-29566680-5czk4" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.257838 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/215d3127-6b76-4663-bcaa-05d4b8bd3a72-config-volume\") pod \"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.257935 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jgqc\" (UniqueName: \"kubernetes.io/projected/215d3127-6b76-4663-bcaa-05d4b8bd3a72-kube-api-access-7jgqc\") pod \"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.360156 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jgqc\" (UniqueName: \"kubernetes.io/projected/215d3127-6b76-4663-bcaa-05d4b8bd3a72-kube-api-access-7jgqc\") pod \"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.360304 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/215d3127-6b76-4663-bcaa-05d4b8bd3a72-secret-volume\") pod \"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.360339 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/7d7d37b4-c7c5-479d-8cf6-cf504109913b-kube-api-access-2hcd4\") pod \"auto-csr-approver-29566680-5czk4\" (UID: \"7d7d37b4-c7c5-479d-8cf6-cf504109913b\") " pod="openshift-infra/auto-csr-approver-29566680-5czk4" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.360395 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/215d3127-6b76-4663-bcaa-05d4b8bd3a72-config-volume\") pod \"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.361265 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/215d3127-6b76-4663-bcaa-05d4b8bd3a72-config-volume\") pod \"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.367084 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/215d3127-6b76-4663-bcaa-05d4b8bd3a72-secret-volume\") pod 
\"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.380882 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jgqc\" (UniqueName: \"kubernetes.io/projected/215d3127-6b76-4663-bcaa-05d4b8bd3a72-kube-api-access-7jgqc\") pod \"collect-profiles-29566680-rjldg\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.381417 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/7d7d37b4-c7c5-479d-8cf6-cf504109913b-kube-api-access-2hcd4\") pod \"auto-csr-approver-29566680-5czk4\" (UID: \"7d7d37b4-c7c5-479d-8cf6-cf504109913b\") " pod="openshift-infra/auto-csr-approver-29566680-5czk4" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.478997 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566680-5czk4" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.501423 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.980882 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566680-5czk4"] Mar 20 10:00:00 crc kubenswrapper[4971]: I0320 10:00:00.981678 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:00:01 crc kubenswrapper[4971]: I0320 10:00:01.133823 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg"] Mar 20 10:00:01 crc kubenswrapper[4971]: I0320 10:00:01.623578 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" event={"ID":"215d3127-6b76-4663-bcaa-05d4b8bd3a72","Type":"ContainerStarted","Data":"7504311d86373fa49505ba5db680f980d719c7dfe987a1aedfc24da79774f13d"} Mar 20 10:00:01 crc kubenswrapper[4971]: I0320 10:00:01.623948 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" event={"ID":"215d3127-6b76-4663-bcaa-05d4b8bd3a72","Type":"ContainerStarted","Data":"d36315c32213647c04b09c5b1f17764a44390f7e0a0eacc7fc4194669b7c7f87"} Mar 20 10:00:01 crc kubenswrapper[4971]: I0320 10:00:01.625078 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566680-5czk4" event={"ID":"7d7d37b4-c7c5-479d-8cf6-cf504109913b","Type":"ContainerStarted","Data":"ee4f03537ff46c180e544818b69d1fc9e59f75cfed6e95e7b3437d75d62b5c53"} Mar 20 10:00:01 crc kubenswrapper[4971]: I0320 10:00:01.646735 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" podStartSLOduration=1.6467159059999998 podStartE2EDuration="1.646715906s" podCreationTimestamp="2026-03-20 10:00:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:00:01.640950825 +0000 UTC m=+11423.620824963" watchObservedRunningTime="2026-03-20 10:00:01.646715906 +0000 UTC m=+11423.626590044" Mar 20 10:00:02 crc kubenswrapper[4971]: I0320 10:00:02.638891 4971 generic.go:334] "Generic (PLEG): container finished" podID="215d3127-6b76-4663-bcaa-05d4b8bd3a72" containerID="7504311d86373fa49505ba5db680f980d719c7dfe987a1aedfc24da79774f13d" exitCode=0 Mar 20 10:00:02 crc kubenswrapper[4971]: I0320 10:00:02.639009 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" event={"ID":"215d3127-6b76-4663-bcaa-05d4b8bd3a72","Type":"ContainerDied","Data":"7504311d86373fa49505ba5db680f980d719c7dfe987a1aedfc24da79774f13d"} Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.283095 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.354888 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/215d3127-6b76-4663-bcaa-05d4b8bd3a72-secret-volume\") pod \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.355187 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/215d3127-6b76-4663-bcaa-05d4b8bd3a72-config-volume\") pod \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.355343 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jgqc\" (UniqueName: 
\"kubernetes.io/projected/215d3127-6b76-4663-bcaa-05d4b8bd3a72-kube-api-access-7jgqc\") pod \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\" (UID: \"215d3127-6b76-4663-bcaa-05d4b8bd3a72\") " Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.355937 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215d3127-6b76-4663-bcaa-05d4b8bd3a72-config-volume" (OuterVolumeSpecName: "config-volume") pod "215d3127-6b76-4663-bcaa-05d4b8bd3a72" (UID: "215d3127-6b76-4663-bcaa-05d4b8bd3a72"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.356047 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/215d3127-6b76-4663-bcaa-05d4b8bd3a72-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.373954 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215d3127-6b76-4663-bcaa-05d4b8bd3a72-kube-api-access-7jgqc" (OuterVolumeSpecName: "kube-api-access-7jgqc") pod "215d3127-6b76-4663-bcaa-05d4b8bd3a72" (UID: "215d3127-6b76-4663-bcaa-05d4b8bd3a72"). InnerVolumeSpecName "kube-api-access-7jgqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.378232 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215d3127-6b76-4663-bcaa-05d4b8bd3a72-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "215d3127-6b76-4663-bcaa-05d4b8bd3a72" (UID: "215d3127-6b76-4663-bcaa-05d4b8bd3a72"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.458051 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/215d3127-6b76-4663-bcaa-05d4b8bd3a72-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.458089 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jgqc\" (UniqueName: \"kubernetes.io/projected/215d3127-6b76-4663-bcaa-05d4b8bd3a72-kube-api-access-7jgqc\") on node \"crc\" DevicePath \"\"" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.658364 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.658356 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566680-rjldg" event={"ID":"215d3127-6b76-4663-bcaa-05d4b8bd3a72","Type":"ContainerDied","Data":"d36315c32213647c04b09c5b1f17764a44390f7e0a0eacc7fc4194669b7c7f87"} Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.658551 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d36315c32213647c04b09c5b1f17764a44390f7e0a0eacc7fc4194669b7c7f87" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.660290 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566680-5czk4" event={"ID":"7d7d37b4-c7c5-479d-8cf6-cf504109913b","Type":"ContainerStarted","Data":"e018edb71b2eafa4b724a73659d45fe8ea7e7989f61883570c51adc5358b4cdf"} Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.684455 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566680-5czk4" podStartSLOduration=1.564130827 podStartE2EDuration="4.684433598s" podCreationTimestamp="2026-03-20 
10:00:00 +0000 UTC" firstStartedPulling="2026-03-20 10:00:00.981454462 +0000 UTC m=+11422.961328600" lastFinishedPulling="2026-03-20 10:00:04.101757233 +0000 UTC m=+11426.081631371" observedRunningTime="2026-03-20 10:00:04.67650322 +0000 UTC m=+11426.656377358" watchObservedRunningTime="2026-03-20 10:00:04.684433598 +0000 UTC m=+11426.664307736" Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.730045 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m"] Mar 20 10:00:04 crc kubenswrapper[4971]: I0320 10:00:04.743034 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-tsd8m"] Mar 20 10:00:05 crc kubenswrapper[4971]: I0320 10:00:05.674093 4971 generic.go:334] "Generic (PLEG): container finished" podID="7d7d37b4-c7c5-479d-8cf6-cf504109913b" containerID="e018edb71b2eafa4b724a73659d45fe8ea7e7989f61883570c51adc5358b4cdf" exitCode=0 Mar 20 10:00:05 crc kubenswrapper[4971]: I0320 10:00:05.674129 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566680-5czk4" event={"ID":"7d7d37b4-c7c5-479d-8cf6-cf504109913b","Type":"ContainerDied","Data":"e018edb71b2eafa4b724a73659d45fe8ea7e7989f61883570c51adc5358b4cdf"} Mar 20 10:00:06 crc kubenswrapper[4971]: I0320 10:00:06.747412 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1718809-0da5-4da9-b763-b0b76232586f" path="/var/lib/kubelet/pods/e1718809-0da5-4da9-b763-b0b76232586f/volumes" Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.209744 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566680-5czk4" Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.316222 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/7d7d37b4-c7c5-479d-8cf6-cf504109913b-kube-api-access-2hcd4\") pod \"7d7d37b4-c7c5-479d-8cf6-cf504109913b\" (UID: \"7d7d37b4-c7c5-479d-8cf6-cf504109913b\") " Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.323931 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7d37b4-c7c5-479d-8cf6-cf504109913b-kube-api-access-2hcd4" (OuterVolumeSpecName: "kube-api-access-2hcd4") pod "7d7d37b4-c7c5-479d-8cf6-cf504109913b" (UID: "7d7d37b4-c7c5-479d-8cf6-cf504109913b"). InnerVolumeSpecName "kube-api-access-2hcd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.419230 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hcd4\" (UniqueName: \"kubernetes.io/projected/7d7d37b4-c7c5-479d-8cf6-cf504109913b-kube-api-access-2hcd4\") on node \"crc\" DevicePath \"\"" Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.696092 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566680-5czk4" event={"ID":"7d7d37b4-c7c5-479d-8cf6-cf504109913b","Type":"ContainerDied","Data":"ee4f03537ff46c180e544818b69d1fc9e59f75cfed6e95e7b3437d75d62b5c53"} Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.696135 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee4f03537ff46c180e544818b69d1fc9e59f75cfed6e95e7b3437d75d62b5c53" Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.696184 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566680-5czk4" Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.744166 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566674-lkbcg"] Mar 20 10:00:07 crc kubenswrapper[4971]: I0320 10:00:07.753972 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566674-lkbcg"] Mar 20 10:00:08 crc kubenswrapper[4971]: I0320 10:00:08.744145 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b99f96-7701-4b4b-a4a3-9a8b6ce83944" path="/var/lib/kubelet/pods/00b99f96-7701-4b4b-a4a3-9a8b6ce83944/volumes" Mar 20 10:00:13 crc kubenswrapper[4971]: I0320 10:00:13.733405 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 10:00:13 crc kubenswrapper[4971]: E0320 10:00:13.734051 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:00:16 crc kubenswrapper[4971]: I0320 10:00:16.864081 4971 scope.go:117] "RemoveContainer" containerID="8f45571c28b598a72dcb4feb7b476d79ed5a69076ec6df4a8f4edba338b5ae63" Mar 20 10:00:16 crc kubenswrapper[4971]: I0320 10:00:16.905535 4971 scope.go:117] "RemoveContainer" containerID="b86122b0b0b49047cfe150427b2ff9a68794d19859343cdfd156ed9405e6fce2" Mar 20 10:00:26 crc kubenswrapper[4971]: I0320 10:00:26.732342 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 10:00:27 crc kubenswrapper[4971]: I0320 10:00:27.904477 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"b00504c5425aa2d12d14003deead3cf102e74dcadfa956fb8c983b5701595dc1"} Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.143681 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566681-p2q5h"] Mar 20 10:01:00 crc kubenswrapper[4971]: E0320 10:01:00.173894 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215d3127-6b76-4663-bcaa-05d4b8bd3a72" containerName="collect-profiles" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.173914 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="215d3127-6b76-4663-bcaa-05d4b8bd3a72" containerName="collect-profiles" Mar 20 10:01:00 crc kubenswrapper[4971]: E0320 10:01:00.173940 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7d37b4-c7c5-479d-8cf6-cf504109913b" containerName="oc" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.173946 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7d37b4-c7c5-479d-8cf6-cf504109913b" containerName="oc" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.174159 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="215d3127-6b76-4663-bcaa-05d4b8bd3a72" containerName="collect-profiles" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.174169 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7d37b4-c7c5-479d-8cf6-cf504109913b" containerName="oc" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.174949 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.175831 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566681-p2q5h"] Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.186591 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-combined-ca-bundle\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.186664 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-config-data\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.186804 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dcrl\" (UniqueName: \"kubernetes.io/projected/acd620ca-33e4-482a-8105-31e3355d012e-kube-api-access-4dcrl\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.186839 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-fernet-keys\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.288525 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4dcrl\" (UniqueName: \"kubernetes.io/projected/acd620ca-33e4-482a-8105-31e3355d012e-kube-api-access-4dcrl\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.288585 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-fernet-keys\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.288620 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-combined-ca-bundle\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.288665 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-config-data\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.295742 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-combined-ca-bundle\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.296533 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-config-data\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.297119 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-fernet-keys\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.308176 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dcrl\" (UniqueName: \"kubernetes.io/projected/acd620ca-33e4-482a-8105-31e3355d012e-kube-api-access-4dcrl\") pod \"keystone-cron-29566681-p2q5h\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.493208 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:00 crc kubenswrapper[4971]: I0320 10:01:00.973767 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566681-p2q5h"] Mar 20 10:01:01 crc kubenswrapper[4971]: I0320 10:01:01.238157 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566681-p2q5h" event={"ID":"acd620ca-33e4-482a-8105-31e3355d012e","Type":"ContainerStarted","Data":"57e905aa962ff2d44c74b3e8ea1ae57a24334cbd729b75804ac5434fb9124b98"} Mar 20 10:01:01 crc kubenswrapper[4971]: I0320 10:01:01.239253 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566681-p2q5h" event={"ID":"acd620ca-33e4-482a-8105-31e3355d012e","Type":"ContainerStarted","Data":"c630baf06733f028939b25a5c08d1ab8eb69149c38e148a959b3023176c5d796"} Mar 20 10:01:01 crc kubenswrapper[4971]: I0320 10:01:01.259314 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566681-p2q5h" podStartSLOduration=1.259291962 podStartE2EDuration="1.259291962s" podCreationTimestamp="2026-03-20 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:01:01.252666158 +0000 UTC m=+11483.232540316" watchObservedRunningTime="2026-03-20 10:01:01.259291962 +0000 UTC m=+11483.239166100" Mar 20 10:01:04 crc kubenswrapper[4971]: I0320 10:01:04.295436 4971 generic.go:334] "Generic (PLEG): container finished" podID="acd620ca-33e4-482a-8105-31e3355d012e" containerID="57e905aa962ff2d44c74b3e8ea1ae57a24334cbd729b75804ac5434fb9124b98" exitCode=0 Mar 20 10:01:04 crc kubenswrapper[4971]: I0320 10:01:04.295534 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566681-p2q5h" 
event={"ID":"acd620ca-33e4-482a-8105-31e3355d012e","Type":"ContainerDied","Data":"57e905aa962ff2d44c74b3e8ea1ae57a24334cbd729b75804ac5434fb9124b98"} Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.180904 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-926gt"] Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.183287 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.200266 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-926gt"] Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.292270 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-catalog-content\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.292397 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlzxm\" (UniqueName: \"kubernetes.io/projected/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-kube-api-access-dlzxm\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.292452 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-utilities\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.394966 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-catalog-content\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.395243 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlzxm\" (UniqueName: \"kubernetes.io/projected/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-kube-api-access-dlzxm\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.395283 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-utilities\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.395646 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-utilities\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.395666 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-catalog-content\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.416774 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlzxm\" 
(UniqueName: \"kubernetes.io/projected/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-kube-api-access-dlzxm\") pod \"redhat-operators-926gt\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.519503 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:05 crc kubenswrapper[4971]: I0320 10:01:05.875388 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.015256 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dcrl\" (UniqueName: \"kubernetes.io/projected/acd620ca-33e4-482a-8105-31e3355d012e-kube-api-access-4dcrl\") pod \"acd620ca-33e4-482a-8105-31e3355d012e\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.015453 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-config-data\") pod \"acd620ca-33e4-482a-8105-31e3355d012e\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.015520 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-combined-ca-bundle\") pod \"acd620ca-33e4-482a-8105-31e3355d012e\" (UID: \"acd620ca-33e4-482a-8105-31e3355d012e\") " Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.015617 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-fernet-keys\") pod \"acd620ca-33e4-482a-8105-31e3355d012e\" (UID: 
\"acd620ca-33e4-482a-8105-31e3355d012e\") " Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.059664 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd620ca-33e4-482a-8105-31e3355d012e-kube-api-access-4dcrl" (OuterVolumeSpecName: "kube-api-access-4dcrl") pod "acd620ca-33e4-482a-8105-31e3355d012e" (UID: "acd620ca-33e4-482a-8105-31e3355d012e"). InnerVolumeSpecName "kube-api-access-4dcrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.074070 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "acd620ca-33e4-482a-8105-31e3355d012e" (UID: "acd620ca-33e4-482a-8105-31e3355d012e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.105632 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-926gt"] Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.107760 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acd620ca-33e4-482a-8105-31e3355d012e" (UID: "acd620ca-33e4-482a-8105-31e3355d012e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.118041 4971 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.118079 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.118091 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dcrl\" (UniqueName: \"kubernetes.io/projected/acd620ca-33e4-482a-8105-31e3355d012e-kube-api-access-4dcrl\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.137097 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-config-data" (OuterVolumeSpecName: "config-data") pod "acd620ca-33e4-482a-8105-31e3355d012e" (UID: "acd620ca-33e4-482a-8105-31e3355d012e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.219635 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd620ca-33e4-482a-8105-31e3355d012e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.314911 4971 generic.go:334] "Generic (PLEG): container finished" podID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerID="06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f" exitCode=0 Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.314986 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-926gt" event={"ID":"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d","Type":"ContainerDied","Data":"06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f"} Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.315016 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-926gt" event={"ID":"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d","Type":"ContainerStarted","Data":"d028203d8b4e3bf1de50c625f9e2a5a4998b87855b8c6f1492e38a8e37e3428f"} Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.317724 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566681-p2q5h" Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.319116 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566681-p2q5h" event={"ID":"acd620ca-33e4-482a-8105-31e3355d012e","Type":"ContainerDied","Data":"c630baf06733f028939b25a5c08d1ab8eb69149c38e148a959b3023176c5d796"} Mar 20 10:01:06 crc kubenswrapper[4971]: I0320 10:01:06.319178 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c630baf06733f028939b25a5c08d1ab8eb69149c38e148a959b3023176c5d796" Mar 20 10:01:08 crc kubenswrapper[4971]: I0320 10:01:08.355913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-926gt" event={"ID":"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d","Type":"ContainerStarted","Data":"6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c"} Mar 20 10:01:13 crc kubenswrapper[4971]: I0320 10:01:13.408929 4971 generic.go:334] "Generic (PLEG): container finished" podID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerID="6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c" exitCode=0 Mar 20 10:01:13 crc kubenswrapper[4971]: I0320 10:01:13.409015 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-926gt" event={"ID":"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d","Type":"ContainerDied","Data":"6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c"} Mar 20 10:01:14 crc kubenswrapper[4971]: I0320 10:01:14.425327 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-926gt" event={"ID":"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d","Type":"ContainerStarted","Data":"117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529"} Mar 20 10:01:14 crc kubenswrapper[4971]: I0320 10:01:14.457868 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-926gt" podStartSLOduration=1.867235086 podStartE2EDuration="9.457846449s" podCreationTimestamp="2026-03-20 10:01:05 +0000 UTC" firstStartedPulling="2026-03-20 10:01:06.31647746 +0000 UTC m=+11488.296351598" lastFinishedPulling="2026-03-20 10:01:13.907088823 +0000 UTC m=+11495.886962961" observedRunningTime="2026-03-20 10:01:14.445621098 +0000 UTC m=+11496.425495256" watchObservedRunningTime="2026-03-20 10:01:14.457846449 +0000 UTC m=+11496.437720587" Mar 20 10:01:15 crc kubenswrapper[4971]: I0320 10:01:15.519782 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:15 crc kubenswrapper[4971]: I0320 10:01:15.520135 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:16 crc kubenswrapper[4971]: I0320 10:01:16.567013 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-926gt" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="registry-server" probeResult="failure" output=< Mar 20 10:01:16 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 10:01:16 crc kubenswrapper[4971]: > Mar 20 10:01:26 crc kubenswrapper[4971]: I0320 10:01:26.592998 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-926gt" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="registry-server" probeResult="failure" output=< Mar 20 10:01:26 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 10:01:26 crc kubenswrapper[4971]: > Mar 20 10:01:36 crc kubenswrapper[4971]: I0320 10:01:36.589812 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-926gt" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="registry-server" probeResult="failure" output=< Mar 20 10:01:36 
crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 10:01:36 crc kubenswrapper[4971]: > Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.255464 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x56s5"] Mar 20 10:01:44 crc kubenswrapper[4971]: E0320 10:01:44.257316 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd620ca-33e4-482a-8105-31e3355d012e" containerName="keystone-cron" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.257351 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd620ca-33e4-482a-8105-31e3355d012e" containerName="keystone-cron" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.257953 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd620ca-33e4-482a-8105-31e3355d012e" containerName="keystone-cron" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.260837 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.271255 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x56s5"] Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.414475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-utilities\") pod \"redhat-marketplace-x56s5\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.414518 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qc7h\" (UniqueName: \"kubernetes.io/projected/a95fccd1-b44d-40ab-8e02-b5734b414afa-kube-api-access-8qc7h\") pod \"redhat-marketplace-x56s5\" (UID: 
\"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.414563 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-catalog-content\") pod \"redhat-marketplace-x56s5\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.516831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-utilities\") pod \"redhat-marketplace-x56s5\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.517224 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qc7h\" (UniqueName: \"kubernetes.io/projected/a95fccd1-b44d-40ab-8e02-b5734b414afa-kube-api-access-8qc7h\") pod \"redhat-marketplace-x56s5\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.517274 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-catalog-content\") pod \"redhat-marketplace-x56s5\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.517392 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-utilities\") pod \"redhat-marketplace-x56s5\" (UID: 
\"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.517736 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-catalog-content\") pod \"redhat-marketplace-x56s5\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.539503 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qc7h\" (UniqueName: \"kubernetes.io/projected/a95fccd1-b44d-40ab-8e02-b5734b414afa-kube-api-access-8qc7h\") pod \"redhat-marketplace-x56s5\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:44 crc kubenswrapper[4971]: I0320 10:01:44.600039 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:45 crc kubenswrapper[4971]: I0320 10:01:45.164810 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x56s5"] Mar 20 10:01:45 crc kubenswrapper[4971]: I0320 10:01:45.783943 4971 generic.go:334] "Generic (PLEG): container finished" podID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerID="8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa" exitCode=0 Mar 20 10:01:45 crc kubenswrapper[4971]: I0320 10:01:45.784233 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x56s5" event={"ID":"a95fccd1-b44d-40ab-8e02-b5734b414afa","Type":"ContainerDied","Data":"8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa"} Mar 20 10:01:45 crc kubenswrapper[4971]: I0320 10:01:45.784258 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x56s5" 
event={"ID":"a95fccd1-b44d-40ab-8e02-b5734b414afa","Type":"ContainerStarted","Data":"9a6d78bb02066b9fbf07bd7c7e57b9f8518bedb895a154cf698613578a4eca6c"} Mar 20 10:01:46 crc kubenswrapper[4971]: I0320 10:01:46.569322 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-926gt" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="registry-server" probeResult="failure" output=< Mar 20 10:01:46 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 10:01:46 crc kubenswrapper[4971]: > Mar 20 10:01:47 crc kubenswrapper[4971]: I0320 10:01:47.802161 4971 generic.go:334] "Generic (PLEG): container finished" podID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerID="d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453" exitCode=0 Mar 20 10:01:47 crc kubenswrapper[4971]: I0320 10:01:47.802209 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x56s5" event={"ID":"a95fccd1-b44d-40ab-8e02-b5734b414afa","Type":"ContainerDied","Data":"d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453"} Mar 20 10:01:48 crc kubenswrapper[4971]: I0320 10:01:48.827064 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x56s5" event={"ID":"a95fccd1-b44d-40ab-8e02-b5734b414afa","Type":"ContainerStarted","Data":"69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10"} Mar 20 10:01:54 crc kubenswrapper[4971]: I0320 10:01:54.600915 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:54 crc kubenswrapper[4971]: I0320 10:01:54.601514 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:54 crc kubenswrapper[4971]: I0320 10:01:54.651259 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:54 crc kubenswrapper[4971]: I0320 10:01:54.676332 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x56s5" podStartSLOduration=8.227248238 podStartE2EDuration="10.67630927s" podCreationTimestamp="2026-03-20 10:01:44 +0000 UTC" firstStartedPulling="2026-03-20 10:01:45.786001721 +0000 UTC m=+11527.765875859" lastFinishedPulling="2026-03-20 10:01:48.235062753 +0000 UTC m=+11530.214936891" observedRunningTime="2026-03-20 10:01:48.859749298 +0000 UTC m=+11530.839623436" watchObservedRunningTime="2026-03-20 10:01:54.67630927 +0000 UTC m=+11536.656183408" Mar 20 10:01:54 crc kubenswrapper[4971]: I0320 10:01:54.930945 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:54 crc kubenswrapper[4971]: I0320 10:01:54.984181 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x56s5"] Mar 20 10:01:55 crc kubenswrapper[4971]: I0320 10:01:55.572529 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:55 crc kubenswrapper[4971]: I0320 10:01:55.626536 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:56 crc kubenswrapper[4971]: I0320 10:01:56.901155 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x56s5" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerName="registry-server" containerID="cri-o://69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10" gracePeriod=2 Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.296945 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-926gt"] Mar 20 10:01:57 crc 
kubenswrapper[4971]: I0320 10:01:57.297418 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-926gt" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="registry-server" containerID="cri-o://117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529" gracePeriod=2 Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.514293 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.593059 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-utilities\") pod \"a95fccd1-b44d-40ab-8e02-b5734b414afa\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.593320 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qc7h\" (UniqueName: \"kubernetes.io/projected/a95fccd1-b44d-40ab-8e02-b5734b414afa-kube-api-access-8qc7h\") pod \"a95fccd1-b44d-40ab-8e02-b5734b414afa\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.593406 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-catalog-content\") pod \"a95fccd1-b44d-40ab-8e02-b5734b414afa\" (UID: \"a95fccd1-b44d-40ab-8e02-b5734b414afa\") " Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.593946 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-utilities" (OuterVolumeSpecName: "utilities") pod "a95fccd1-b44d-40ab-8e02-b5734b414afa" (UID: "a95fccd1-b44d-40ab-8e02-b5734b414afa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.594134 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.606973 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a95fccd1-b44d-40ab-8e02-b5734b414afa-kube-api-access-8qc7h" (OuterVolumeSpecName: "kube-api-access-8qc7h") pod "a95fccd1-b44d-40ab-8e02-b5734b414afa" (UID: "a95fccd1-b44d-40ab-8e02-b5734b414afa"). InnerVolumeSpecName "kube-api-access-8qc7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.629494 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a95fccd1-b44d-40ab-8e02-b5734b414afa" (UID: "a95fccd1-b44d-40ab-8e02-b5734b414afa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.695860 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qc7h\" (UniqueName: \"kubernetes.io/projected/a95fccd1-b44d-40ab-8e02-b5734b414afa-kube-api-access-8qc7h\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.695901 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a95fccd1-b44d-40ab-8e02-b5734b414afa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.866317 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.900900 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-catalog-content\") pod \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.901017 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlzxm\" (UniqueName: \"kubernetes.io/projected/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-kube-api-access-dlzxm\") pod \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.901107 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-utilities\") pod \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\" (UID: \"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d\") " Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.903174 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-utilities" (OuterVolumeSpecName: "utilities") pod "2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" (UID: "2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.921625 4971 generic.go:334] "Generic (PLEG): container finished" podID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerID="69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10" exitCode=0 Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.921658 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-kube-api-access-dlzxm" (OuterVolumeSpecName: "kube-api-access-dlzxm") pod "2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" (UID: "2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d"). InnerVolumeSpecName "kube-api-access-dlzxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.921692 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x56s5" event={"ID":"a95fccd1-b44d-40ab-8e02-b5734b414afa","Type":"ContainerDied","Data":"69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10"} Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.921720 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x56s5" event={"ID":"a95fccd1-b44d-40ab-8e02-b5734b414afa","Type":"ContainerDied","Data":"9a6d78bb02066b9fbf07bd7c7e57b9f8518bedb895a154cf698613578a4eca6c"} Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.921764 4971 scope.go:117] "RemoveContainer" containerID="69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.921788 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x56s5" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.926120 4971 generic.go:334] "Generic (PLEG): container finished" podID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerID="117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529" exitCode=0 Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.926258 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-926gt" event={"ID":"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d","Type":"ContainerDied","Data":"117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529"} Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.926283 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-926gt" event={"ID":"2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d","Type":"ContainerDied","Data":"d028203d8b4e3bf1de50c625f9e2a5a4998b87855b8c6f1492e38a8e37e3428f"} Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.926695 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-926gt" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.968365 4971 scope.go:117] "RemoveContainer" containerID="d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453" Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.980784 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x56s5"] Mar 20 10:01:57 crc kubenswrapper[4971]: I0320 10:01:57.994884 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x56s5"] Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.001503 4971 scope.go:117] "RemoveContainer" containerID="8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.004282 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlzxm\" (UniqueName: \"kubernetes.io/projected/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-kube-api-access-dlzxm\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.004479 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.086306 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" (UID: "2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.092673 4971 scope.go:117] "RemoveContainer" containerID="69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10" Mar 20 10:01:58 crc kubenswrapper[4971]: E0320 10:01:58.094411 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10\": container with ID starting with 69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10 not found: ID does not exist" containerID="69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.094459 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10"} err="failed to get container status \"69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10\": rpc error: code = NotFound desc = could not find container \"69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10\": container with ID starting with 69128ea6083195772601af4e5c3ddea7e20d8674ac19990114ffd81bc8a01f10 not found: ID does not exist" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.094488 4971 scope.go:117] "RemoveContainer" containerID="d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453" Mar 20 10:01:58 crc kubenswrapper[4971]: E0320 10:01:58.094899 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453\": container with ID starting with d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453 not found: ID does not exist" containerID="d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.094928 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453"} err="failed to get container status \"d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453\": rpc error: code = NotFound desc = could not find container \"d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453\": container with ID starting with d51e20ccdf351655fec41cf2b9185871a59d94fcce3f765cc4a648bf31b71453 not found: ID does not exist" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.094948 4971 scope.go:117] "RemoveContainer" containerID="8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa" Mar 20 10:01:58 crc kubenswrapper[4971]: E0320 10:01:58.095229 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa\": container with ID starting with 8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa not found: ID does not exist" containerID="8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.095255 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa"} err="failed to get container status \"8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa\": rpc error: code = NotFound desc = could not find container \"8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa\": container with ID starting with 8af6ea5b48067c2104a56ad704d675b1fda99fd3850f2594e37b6d7dc031e9fa not found: ID does not exist" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.095271 4971 scope.go:117] "RemoveContainer" containerID="117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 
10:01:58.106568 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.121986 4971 scope.go:117] "RemoveContainer" containerID="6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.147706 4971 scope.go:117] "RemoveContainer" containerID="06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.198730 4971 scope.go:117] "RemoveContainer" containerID="117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529" Mar 20 10:01:58 crc kubenswrapper[4971]: E0320 10:01:58.199367 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529\": container with ID starting with 117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529 not found: ID does not exist" containerID="117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.199438 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529"} err="failed to get container status \"117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529\": rpc error: code = NotFound desc = could not find container \"117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529\": container with ID starting with 117196e2bbc05bf25f8b91e6499c030fc7ba99abc5f790cd8219cc75f71c5529 not found: ID does not exist" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.199473 4971 scope.go:117] "RemoveContainer" containerID="6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c" 
Mar 20 10:01:58 crc kubenswrapper[4971]: E0320 10:01:58.199845 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c\": container with ID starting with 6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c not found: ID does not exist" containerID="6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.199884 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c"} err="failed to get container status \"6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c\": rpc error: code = NotFound desc = could not find container \"6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c\": container with ID starting with 6b2b0f062c734dd4bfd1d1f78183b2710f55b42f0cc93903b3f75f90f23afe1c not found: ID does not exist" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.199903 4971 scope.go:117] "RemoveContainer" containerID="06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f" Mar 20 10:01:58 crc kubenswrapper[4971]: E0320 10:01:58.200266 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f\": container with ID starting with 06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f not found: ID does not exist" containerID="06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.200302 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f"} err="failed to get container status 
\"06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f\": rpc error: code = NotFound desc = could not find container \"06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f\": container with ID starting with 06633b6645d343a1eadcd751db7cb2ae1a6cc859aca7c6133bf2a0339409260f not found: ID does not exist" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.266471 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-926gt"] Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.277123 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-926gt"] Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.743568 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" path="/var/lib/kubelet/pods/2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d/volumes" Mar 20 10:01:58 crc kubenswrapper[4971]: I0320 10:01:58.744915 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" path="/var/lib/kubelet/pods/a95fccd1-b44d-40ab-8e02-b5734b414afa/volumes" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.151983 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566682-2lbc4"] Mar 20 10:02:00 crc kubenswrapper[4971]: E0320 10:02:00.153005 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerName="extract-content" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.153025 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerName="extract-content" Mar 20 10:02:00 crc kubenswrapper[4971]: E0320 10:02:00.153056 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="registry-server" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 
10:02:00.153064 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="registry-server" Mar 20 10:02:00 crc kubenswrapper[4971]: E0320 10:02:00.153082 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerName="extract-utilities" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.153091 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerName="extract-utilities" Mar 20 10:02:00 crc kubenswrapper[4971]: E0320 10:02:00.153103 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="extract-content" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.153110 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="extract-content" Mar 20 10:02:00 crc kubenswrapper[4971]: E0320 10:02:00.153129 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="extract-utilities" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.153137 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="extract-utilities" Mar 20 10:02:00 crc kubenswrapper[4971]: E0320 10:02:00.153163 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerName="registry-server" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.153172 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerName="registry-server" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.153679 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a95fccd1-b44d-40ab-8e02-b5734b414afa" containerName="registry-server" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 
10:02:00.153918 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9f7d1c-4632-417a-a1b9-4cf8ce803d2d" containerName="registry-server" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.154825 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566682-2lbc4" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.157723 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdbz\" (UniqueName: \"kubernetes.io/projected/e26ef584-e0d0-4448-82f2-143e5f331c3b-kube-api-access-9wdbz\") pod \"auto-csr-approver-29566682-2lbc4\" (UID: \"e26ef584-e0d0-4448-82f2-143e5f331c3b\") " pod="openshift-infra/auto-csr-approver-29566682-2lbc4" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.158958 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.159366 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.159551 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.168549 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566682-2lbc4"] Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.259151 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdbz\" (UniqueName: \"kubernetes.io/projected/e26ef584-e0d0-4448-82f2-143e5f331c3b-kube-api-access-9wdbz\") pod \"auto-csr-approver-29566682-2lbc4\" (UID: \"e26ef584-e0d0-4448-82f2-143e5f331c3b\") " pod="openshift-infra/auto-csr-approver-29566682-2lbc4" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.278632 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdbz\" (UniqueName: \"kubernetes.io/projected/e26ef584-e0d0-4448-82f2-143e5f331c3b-kube-api-access-9wdbz\") pod \"auto-csr-approver-29566682-2lbc4\" (UID: \"e26ef584-e0d0-4448-82f2-143e5f331c3b\") " pod="openshift-infra/auto-csr-approver-29566682-2lbc4" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.472954 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566682-2lbc4" Mar 20 10:02:00 crc kubenswrapper[4971]: I0320 10:02:00.942798 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566682-2lbc4"] Mar 20 10:02:00 crc kubenswrapper[4971]: W0320 10:02:00.949544 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode26ef584_e0d0_4448_82f2_143e5f331c3b.slice/crio-f76854aae650394101f34dc9b60be0840749224ec553a501617577a081b4bd29 WatchSource:0}: Error finding container f76854aae650394101f34dc9b60be0840749224ec553a501617577a081b4bd29: Status 404 returned error can't find the container with id f76854aae650394101f34dc9b60be0840749224ec553a501617577a081b4bd29 Mar 20 10:02:01 crc kubenswrapper[4971]: I0320 10:02:01.972201 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566682-2lbc4" event={"ID":"e26ef584-e0d0-4448-82f2-143e5f331c3b","Type":"ContainerStarted","Data":"f76854aae650394101f34dc9b60be0840749224ec553a501617577a081b4bd29"} Mar 20 10:02:02 crc kubenswrapper[4971]: I0320 10:02:02.988434 4971 generic.go:334] "Generic (PLEG): container finished" podID="e26ef584-e0d0-4448-82f2-143e5f331c3b" containerID="53eea9252fa4b17192ddedd3b917af427ab1758527612a2028b5b605c0d46000" exitCode=0 Mar 20 10:02:02 crc kubenswrapper[4971]: I0320 10:02:02.988724 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566682-2lbc4" event={"ID":"e26ef584-e0d0-4448-82f2-143e5f331c3b","Type":"ContainerDied","Data":"53eea9252fa4b17192ddedd3b917af427ab1758527612a2028b5b605c0d46000"} Mar 20 10:02:04 crc kubenswrapper[4971]: I0320 10:02:04.622642 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566682-2lbc4" Mar 20 10:02:04 crc kubenswrapper[4971]: I0320 10:02:04.774281 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wdbz\" (UniqueName: \"kubernetes.io/projected/e26ef584-e0d0-4448-82f2-143e5f331c3b-kube-api-access-9wdbz\") pod \"e26ef584-e0d0-4448-82f2-143e5f331c3b\" (UID: \"e26ef584-e0d0-4448-82f2-143e5f331c3b\") " Mar 20 10:02:04 crc kubenswrapper[4971]: I0320 10:02:04.784211 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26ef584-e0d0-4448-82f2-143e5f331c3b-kube-api-access-9wdbz" (OuterVolumeSpecName: "kube-api-access-9wdbz") pod "e26ef584-e0d0-4448-82f2-143e5f331c3b" (UID: "e26ef584-e0d0-4448-82f2-143e5f331c3b"). InnerVolumeSpecName "kube-api-access-9wdbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:02:04 crc kubenswrapper[4971]: I0320 10:02:04.878197 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wdbz\" (UniqueName: \"kubernetes.io/projected/e26ef584-e0d0-4448-82f2-143e5f331c3b-kube-api-access-9wdbz\") on node \"crc\" DevicePath \"\"" Mar 20 10:02:05 crc kubenswrapper[4971]: I0320 10:02:05.008140 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566682-2lbc4" event={"ID":"e26ef584-e0d0-4448-82f2-143e5f331c3b","Type":"ContainerDied","Data":"f76854aae650394101f34dc9b60be0840749224ec553a501617577a081b4bd29"} Mar 20 10:02:05 crc kubenswrapper[4971]: I0320 10:02:05.008185 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f76854aae650394101f34dc9b60be0840749224ec553a501617577a081b4bd29" Mar 20 10:02:05 crc kubenswrapper[4971]: I0320 10:02:05.008249 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566682-2lbc4" Mar 20 10:02:05 crc kubenswrapper[4971]: I0320 10:02:05.714433 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566676-ntqhr"] Mar 20 10:02:05 crc kubenswrapper[4971]: I0320 10:02:05.727513 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566676-ntqhr"] Mar 20 10:02:06 crc kubenswrapper[4971]: I0320 10:02:06.755113 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5a5bb6-be43-4ca4-acea-3ba241727009" path="/var/lib/kubelet/pods/5b5a5bb6-be43-4ca4-acea-3ba241727009/volumes" Mar 20 10:02:17 crc kubenswrapper[4971]: I0320 10:02:17.024690 4971 scope.go:117] "RemoveContainer" containerID="08a09ff9ebb4fed45fd5e8c1a2335d8357ec3d06014859bdddbc95becd2d2211" Mar 20 10:02:50 crc kubenswrapper[4971]: I0320 10:02:50.162429 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:02:50 crc kubenswrapper[4971]: I0320 10:02:50.162903 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:03:20 crc kubenswrapper[4971]: I0320 10:03:20.162159 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:03:20 crc kubenswrapper[4971]: I0320 10:03:20.162815 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:03:50 crc kubenswrapper[4971]: I0320 10:03:50.162442 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:03:50 crc kubenswrapper[4971]: I0320 10:03:50.163076 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:03:50 crc kubenswrapper[4971]: I0320 10:03:50.163133 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 10:03:50 crc kubenswrapper[4971]: I0320 10:03:50.164176 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b00504c5425aa2d12d14003deead3cf102e74dcadfa956fb8c983b5701595dc1"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:03:50 crc kubenswrapper[4971]: I0320 10:03:50.164259 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://b00504c5425aa2d12d14003deead3cf102e74dcadfa956fb8c983b5701595dc1" gracePeriod=600 Mar 20 10:03:51 crc kubenswrapper[4971]: I0320 10:03:51.097554 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="b00504c5425aa2d12d14003deead3cf102e74dcadfa956fb8c983b5701595dc1" exitCode=0 Mar 20 10:03:51 crc kubenswrapper[4971]: I0320 10:03:51.097616 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"b00504c5425aa2d12d14003deead3cf102e74dcadfa956fb8c983b5701595dc1"} Mar 20 10:03:51 crc kubenswrapper[4971]: I0320 10:03:51.098107 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788"} Mar 20 10:03:51 crc kubenswrapper[4971]: I0320 10:03:51.098129 4971 scope.go:117] "RemoveContainer" containerID="9741f8c4d9da3b1995bee26db60ff6748d6555ab1f8bf6b207eb6b4cd887f8f1" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.159715 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566684-54npb"] Mar 20 10:04:00 crc kubenswrapper[4971]: E0320 10:04:00.161439 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26ef584-e0d0-4448-82f2-143e5f331c3b" containerName="oc" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.161458 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26ef584-e0d0-4448-82f2-143e5f331c3b" containerName="oc" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.161817 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26ef584-e0d0-4448-82f2-143e5f331c3b" containerName="oc" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.163097 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566684-54npb" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.167235 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.167446 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.168040 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.183103 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566684-54npb"] Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.302655 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw4h9\" (UniqueName: \"kubernetes.io/projected/4d23c2eb-0052-4cc9-a837-954e06e21c94-kube-api-access-xw4h9\") pod \"auto-csr-approver-29566684-54npb\" (UID: \"4d23c2eb-0052-4cc9-a837-954e06e21c94\") " pod="openshift-infra/auto-csr-approver-29566684-54npb" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.404818 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw4h9\" (UniqueName: \"kubernetes.io/projected/4d23c2eb-0052-4cc9-a837-954e06e21c94-kube-api-access-xw4h9\") pod \"auto-csr-approver-29566684-54npb\" (UID: \"4d23c2eb-0052-4cc9-a837-954e06e21c94\") " pod="openshift-infra/auto-csr-approver-29566684-54npb" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.424403 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw4h9\" (UniqueName: \"kubernetes.io/projected/4d23c2eb-0052-4cc9-a837-954e06e21c94-kube-api-access-xw4h9\") pod \"auto-csr-approver-29566684-54npb\" (UID: \"4d23c2eb-0052-4cc9-a837-954e06e21c94\") " 
pod="openshift-infra/auto-csr-approver-29566684-54npb" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.496507 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566684-54npb" Mar 20 10:04:00 crc kubenswrapper[4971]: I0320 10:04:00.992167 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566684-54npb"] Mar 20 10:04:01 crc kubenswrapper[4971]: I0320 10:04:01.244007 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566684-54npb" event={"ID":"4d23c2eb-0052-4cc9-a837-954e06e21c94","Type":"ContainerStarted","Data":"cb3943c8502ed2a9700bcd05f83f9a6a4f98c9512787781a6bb7ee3329739575"} Mar 20 10:04:02 crc kubenswrapper[4971]: I0320 10:04:02.268295 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566684-54npb" event={"ID":"4d23c2eb-0052-4cc9-a837-954e06e21c94","Type":"ContainerStarted","Data":"6e79619babfc0d3498f19527ee26e6e5338a787011df5f3155c2a781cc604791"} Mar 20 10:04:02 crc kubenswrapper[4971]: I0320 10:04:02.297139 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566684-54npb" podStartSLOduration=1.487422848 podStartE2EDuration="2.297120955s" podCreationTimestamp="2026-03-20 10:04:00 +0000 UTC" firstStartedPulling="2026-03-20 10:04:00.998194379 +0000 UTC m=+11662.978068517" lastFinishedPulling="2026-03-20 10:04:01.807892476 +0000 UTC m=+11663.787766624" observedRunningTime="2026-03-20 10:04:02.283446313 +0000 UTC m=+11664.263320451" watchObservedRunningTime="2026-03-20 10:04:02.297120955 +0000 UTC m=+11664.276995093" Mar 20 10:04:03 crc kubenswrapper[4971]: I0320 10:04:03.278796 4971 generic.go:334] "Generic (PLEG): container finished" podID="4d23c2eb-0052-4cc9-a837-954e06e21c94" containerID="6e79619babfc0d3498f19527ee26e6e5338a787011df5f3155c2a781cc604791" exitCode=0 Mar 20 10:04:03 crc 
kubenswrapper[4971]: I0320 10:04:03.278884 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566684-54npb" event={"ID":"4d23c2eb-0052-4cc9-a837-954e06e21c94","Type":"ContainerDied","Data":"6e79619babfc0d3498f19527ee26e6e5338a787011df5f3155c2a781cc604791"} Mar 20 10:04:04 crc kubenswrapper[4971]: I0320 10:04:04.882818 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566684-54npb" Mar 20 10:04:04 crc kubenswrapper[4971]: I0320 10:04:04.995053 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw4h9\" (UniqueName: \"kubernetes.io/projected/4d23c2eb-0052-4cc9-a837-954e06e21c94-kube-api-access-xw4h9\") pod \"4d23c2eb-0052-4cc9-a837-954e06e21c94\" (UID: \"4d23c2eb-0052-4cc9-a837-954e06e21c94\") " Mar 20 10:04:05 crc kubenswrapper[4971]: I0320 10:04:05.002068 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d23c2eb-0052-4cc9-a837-954e06e21c94-kube-api-access-xw4h9" (OuterVolumeSpecName: "kube-api-access-xw4h9") pod "4d23c2eb-0052-4cc9-a837-954e06e21c94" (UID: "4d23c2eb-0052-4cc9-a837-954e06e21c94"). InnerVolumeSpecName "kube-api-access-xw4h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:04:05 crc kubenswrapper[4971]: I0320 10:04:05.098705 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw4h9\" (UniqueName: \"kubernetes.io/projected/4d23c2eb-0052-4cc9-a837-954e06e21c94-kube-api-access-xw4h9\") on node \"crc\" DevicePath \"\"" Mar 20 10:04:05 crc kubenswrapper[4971]: I0320 10:04:05.307586 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566684-54npb" event={"ID":"4d23c2eb-0052-4cc9-a837-954e06e21c94","Type":"ContainerDied","Data":"cb3943c8502ed2a9700bcd05f83f9a6a4f98c9512787781a6bb7ee3329739575"} Mar 20 10:04:05 crc kubenswrapper[4971]: I0320 10:04:05.307676 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb3943c8502ed2a9700bcd05f83f9a6a4f98c9512787781a6bb7ee3329739575" Mar 20 10:04:05 crc kubenswrapper[4971]: I0320 10:04:05.307758 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566684-54npb" Mar 20 10:04:05 crc kubenswrapper[4971]: I0320 10:04:05.360281 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566678-f55t4"] Mar 20 10:04:05 crc kubenswrapper[4971]: I0320 10:04:05.372952 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566678-f55t4"] Mar 20 10:04:06 crc kubenswrapper[4971]: I0320 10:04:06.745736 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b64f2bb6-88df-46aa-a738-61710d224963" path="/var/lib/kubelet/pods/b64f2bb6-88df-46aa-a738-61710d224963/volumes" Mar 20 10:04:17 crc kubenswrapper[4971]: I0320 10:04:17.190877 4971 scope.go:117] "RemoveContainer" containerID="e663b89d2fd76ba1371d9f60aedbd78fdeb0d213b92f0167319c8c6d439a7270" Mar 20 10:05:50 crc kubenswrapper[4971]: I0320 10:05:50.161851 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:05:50 crc kubenswrapper[4971]: I0320 10:05:50.162392 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.138445 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566686-94sgt"] Mar 20 10:06:00 crc kubenswrapper[4971]: E0320 10:06:00.139499 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23c2eb-0052-4cc9-a837-954e06e21c94" containerName="oc" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.139516 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23c2eb-0052-4cc9-a837-954e06e21c94" containerName="oc" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.139806 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d23c2eb-0052-4cc9-a837-954e06e21c94" containerName="oc" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.140494 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566686-94sgt" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.142313 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.142757 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.142818 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.159220 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566686-94sgt"] Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.282451 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnk2x\" (UniqueName: \"kubernetes.io/projected/ec35264e-6c4f-457f-afa0-3e785a7143a9-kube-api-access-cnk2x\") pod \"auto-csr-approver-29566686-94sgt\" (UID: \"ec35264e-6c4f-457f-afa0-3e785a7143a9\") " pod="openshift-infra/auto-csr-approver-29566686-94sgt" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.385472 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnk2x\" (UniqueName: \"kubernetes.io/projected/ec35264e-6c4f-457f-afa0-3e785a7143a9-kube-api-access-cnk2x\") pod \"auto-csr-approver-29566686-94sgt\" (UID: \"ec35264e-6c4f-457f-afa0-3e785a7143a9\") " pod="openshift-infra/auto-csr-approver-29566686-94sgt" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.415301 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnk2x\" (UniqueName: \"kubernetes.io/projected/ec35264e-6c4f-457f-afa0-3e785a7143a9-kube-api-access-cnk2x\") pod \"auto-csr-approver-29566686-94sgt\" (UID: \"ec35264e-6c4f-457f-afa0-3e785a7143a9\") " 
pod="openshift-infra/auto-csr-approver-29566686-94sgt" Mar 20 10:06:00 crc kubenswrapper[4971]: I0320 10:06:00.463133 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566686-94sgt" Mar 20 10:06:01 crc kubenswrapper[4971]: I0320 10:06:01.040046 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566686-94sgt"] Mar 20 10:06:01 crc kubenswrapper[4971]: I0320 10:06:01.060757 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:06:01 crc kubenswrapper[4971]: I0320 10:06:01.439449 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566686-94sgt" event={"ID":"ec35264e-6c4f-457f-afa0-3e785a7143a9","Type":"ContainerStarted","Data":"6fa6804b83b69d4e17eedd57febfeed5405071e5c10a65d9347a4ea275c1e1cc"} Mar 20 10:06:02 crc kubenswrapper[4971]: I0320 10:06:02.453696 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566686-94sgt" event={"ID":"ec35264e-6c4f-457f-afa0-3e785a7143a9","Type":"ContainerStarted","Data":"cea13be598b12ee4768259f2b424cc5b3d06edfe842b7fe4044cedb746b300ea"} Mar 20 10:06:02 crc kubenswrapper[4971]: I0320 10:06:02.476832 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566686-94sgt" podStartSLOduration=1.596639492 podStartE2EDuration="2.476804812s" podCreationTimestamp="2026-03-20 10:06:00 +0000 UTC" firstStartedPulling="2026-03-20 10:06:01.060402758 +0000 UTC m=+11783.040276896" lastFinishedPulling="2026-03-20 10:06:01.940568078 +0000 UTC m=+11783.920442216" observedRunningTime="2026-03-20 10:06:02.469895635 +0000 UTC m=+11784.449769773" watchObservedRunningTime="2026-03-20 10:06:02.476804812 +0000 UTC m=+11784.456678950" Mar 20 10:06:03 crc kubenswrapper[4971]: I0320 10:06:03.465349 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="ec35264e-6c4f-457f-afa0-3e785a7143a9" containerID="cea13be598b12ee4768259f2b424cc5b3d06edfe842b7fe4044cedb746b300ea" exitCode=0 Mar 20 10:06:03 crc kubenswrapper[4971]: I0320 10:06:03.465410 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566686-94sgt" event={"ID":"ec35264e-6c4f-457f-afa0-3e785a7143a9","Type":"ContainerDied","Data":"cea13be598b12ee4768259f2b424cc5b3d06edfe842b7fe4044cedb746b300ea"} Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.121593 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566686-94sgt" Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.178909 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnk2x\" (UniqueName: \"kubernetes.io/projected/ec35264e-6c4f-457f-afa0-3e785a7143a9-kube-api-access-cnk2x\") pod \"ec35264e-6c4f-457f-afa0-3e785a7143a9\" (UID: \"ec35264e-6c4f-457f-afa0-3e785a7143a9\") " Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.192645 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec35264e-6c4f-457f-afa0-3e785a7143a9-kube-api-access-cnk2x" (OuterVolumeSpecName: "kube-api-access-cnk2x") pod "ec35264e-6c4f-457f-afa0-3e785a7143a9" (UID: "ec35264e-6c4f-457f-afa0-3e785a7143a9"). InnerVolumeSpecName "kube-api-access-cnk2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.281402 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnk2x\" (UniqueName: \"kubernetes.io/projected/ec35264e-6c4f-457f-afa0-3e785a7143a9-kube-api-access-cnk2x\") on node \"crc\" DevicePath \"\"" Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.498738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566686-94sgt" event={"ID":"ec35264e-6c4f-457f-afa0-3e785a7143a9","Type":"ContainerDied","Data":"6fa6804b83b69d4e17eedd57febfeed5405071e5c10a65d9347a4ea275c1e1cc"} Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.498802 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa6804b83b69d4e17eedd57febfeed5405071e5c10a65d9347a4ea275c1e1cc" Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.498890 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566686-94sgt" Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.543204 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566680-5czk4"] Mar 20 10:06:05 crc kubenswrapper[4971]: I0320 10:06:05.554654 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566680-5czk4"] Mar 20 10:06:06 crc kubenswrapper[4971]: I0320 10:06:06.747356 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7d37b4-c7c5-479d-8cf6-cf504109913b" path="/var/lib/kubelet/pods/7d7d37b4-c7c5-479d-8cf6-cf504109913b/volumes" Mar 20 10:06:17 crc kubenswrapper[4971]: I0320 10:06:17.298420 4971 scope.go:117] "RemoveContainer" containerID="e018edb71b2eafa4b724a73659d45fe8ea7e7989f61883570c51adc5358b4cdf" Mar 20 10:06:20 crc kubenswrapper[4971]: I0320 10:06:20.163113 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:06:20 crc kubenswrapper[4971]: I0320 10:06:20.163749 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.162409 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.162901 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.162951 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.163772 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.163829 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" gracePeriod=600 Mar 20 10:06:50 crc kubenswrapper[4971]: E0320 10:06:50.306326 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.862650 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dhmv9"] Mar 20 10:06:50 crc kubenswrapper[4971]: E0320 10:06:50.863121 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec35264e-6c4f-457f-afa0-3e785a7143a9" containerName="oc" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.863137 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec35264e-6c4f-457f-afa0-3e785a7143a9" containerName="oc" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.863329 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec35264e-6c4f-457f-afa0-3e785a7143a9" containerName="oc" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.864777 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.881211 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dhmv9"] Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.929799 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m757r\" (UniqueName: \"kubernetes.io/projected/27cb32a1-4a5d-4cad-bd43-805765170d5a-kube-api-access-m757r\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.930076 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-utilities\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.930331 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-catalog-content\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.947389 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" exitCode=0 Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.947436 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788"} Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.947474 4971 scope.go:117] "RemoveContainer" containerID="b00504c5425aa2d12d14003deead3cf102e74dcadfa956fb8c983b5701595dc1" Mar 20 10:06:50 crc kubenswrapper[4971]: I0320 10:06:50.948191 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:06:50 crc kubenswrapper[4971]: E0320 10:06:50.948486 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.031925 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-catalog-content\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.032013 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m757r\" (UniqueName: \"kubernetes.io/projected/27cb32a1-4a5d-4cad-bd43-805765170d5a-kube-api-access-m757r\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.032075 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-utilities\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.032798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-utilities\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.033237 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-catalog-content\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.052997 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m757r\" (UniqueName: \"kubernetes.io/projected/27cb32a1-4a5d-4cad-bd43-805765170d5a-kube-api-access-m757r\") pod \"community-operators-dhmv9\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.187195 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.817103 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dhmv9"] Mar 20 10:06:51 crc kubenswrapper[4971]: I0320 10:06:51.966893 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhmv9" event={"ID":"27cb32a1-4a5d-4cad-bd43-805765170d5a","Type":"ContainerStarted","Data":"de3d9c4ade3efca0c8ed74a9128d92278905df6db0392c75f9a320bf38b1fdaa"} Mar 20 10:06:52 crc kubenswrapper[4971]: I0320 10:06:52.981321 4971 generic.go:334] "Generic (PLEG): container finished" podID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerID="6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7" exitCode=0 Mar 20 10:06:52 crc kubenswrapper[4971]: I0320 10:06:52.981433 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhmv9" event={"ID":"27cb32a1-4a5d-4cad-bd43-805765170d5a","Type":"ContainerDied","Data":"6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7"} Mar 20 10:06:55 crc kubenswrapper[4971]: I0320 10:06:55.006373 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhmv9" event={"ID":"27cb32a1-4a5d-4cad-bd43-805765170d5a","Type":"ContainerStarted","Data":"4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d"} Mar 20 10:06:56 crc kubenswrapper[4971]: I0320 10:06:56.017998 4971 generic.go:334] "Generic (PLEG): container finished" podID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerID="4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d" exitCode=0 Mar 20 10:06:56 crc kubenswrapper[4971]: I0320 10:06:56.018099 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhmv9" 
event={"ID":"27cb32a1-4a5d-4cad-bd43-805765170d5a","Type":"ContainerDied","Data":"4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d"} Mar 20 10:06:57 crc kubenswrapper[4971]: I0320 10:06:57.029665 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhmv9" event={"ID":"27cb32a1-4a5d-4cad-bd43-805765170d5a","Type":"ContainerStarted","Data":"8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2"} Mar 20 10:06:57 crc kubenswrapper[4971]: I0320 10:06:57.052485 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dhmv9" podStartSLOduration=3.531441843 podStartE2EDuration="7.052465609s" podCreationTimestamp="2026-03-20 10:06:50 +0000 UTC" firstStartedPulling="2026-03-20 10:06:52.983901451 +0000 UTC m=+11834.963775589" lastFinishedPulling="2026-03-20 10:06:56.504925217 +0000 UTC m=+11838.484799355" observedRunningTime="2026-03-20 10:06:57.045637204 +0000 UTC m=+11839.025511392" watchObservedRunningTime="2026-03-20 10:06:57.052465609 +0000 UTC m=+11839.032339757" Mar 20 10:07:01 crc kubenswrapper[4971]: I0320 10:07:01.189289 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:07:01 crc kubenswrapper[4971]: I0320 10:07:01.190276 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:07:01 crc kubenswrapper[4971]: I0320 10:07:01.239886 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:07:02 crc kubenswrapper[4971]: I0320 10:07:02.145254 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:07:02 crc kubenswrapper[4971]: I0320 10:07:02.198471 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-dhmv9"] Mar 20 10:07:02 crc kubenswrapper[4971]: I0320 10:07:02.732563 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:07:02 crc kubenswrapper[4971]: E0320 10:07:02.733043 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:07:04 crc kubenswrapper[4971]: I0320 10:07:04.103878 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dhmv9" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerName="registry-server" containerID="cri-o://8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2" gracePeriod=2 Mar 20 10:07:04 crc kubenswrapper[4971]: I0320 10:07:04.980240 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.100837 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-catalog-content\") pod \"27cb32a1-4a5d-4cad-bd43-805765170d5a\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.101059 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m757r\" (UniqueName: \"kubernetes.io/projected/27cb32a1-4a5d-4cad-bd43-805765170d5a-kube-api-access-m757r\") pod \"27cb32a1-4a5d-4cad-bd43-805765170d5a\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.101166 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-utilities\") pod \"27cb32a1-4a5d-4cad-bd43-805765170d5a\" (UID: \"27cb32a1-4a5d-4cad-bd43-805765170d5a\") " Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.102193 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-utilities" (OuterVolumeSpecName: "utilities") pod "27cb32a1-4a5d-4cad-bd43-805765170d5a" (UID: "27cb32a1-4a5d-4cad-bd43-805765170d5a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.102515 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.106741 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27cb32a1-4a5d-4cad-bd43-805765170d5a-kube-api-access-m757r" (OuterVolumeSpecName: "kube-api-access-m757r") pod "27cb32a1-4a5d-4cad-bd43-805765170d5a" (UID: "27cb32a1-4a5d-4cad-bd43-805765170d5a"). InnerVolumeSpecName "kube-api-access-m757r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.118473 4971 generic.go:334] "Generic (PLEG): container finished" podID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerID="8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2" exitCode=0 Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.118523 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhmv9" event={"ID":"27cb32a1-4a5d-4cad-bd43-805765170d5a","Type":"ContainerDied","Data":"8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2"} Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.118536 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dhmv9" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.118553 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhmv9" event={"ID":"27cb32a1-4a5d-4cad-bd43-805765170d5a","Type":"ContainerDied","Data":"de3d9c4ade3efca0c8ed74a9128d92278905df6db0392c75f9a320bf38b1fdaa"} Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.118573 4971 scope.go:117] "RemoveContainer" containerID="8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.158966 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27cb32a1-4a5d-4cad-bd43-805765170d5a" (UID: "27cb32a1-4a5d-4cad-bd43-805765170d5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.173645 4971 scope.go:117] "RemoveContainer" containerID="4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.203853 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m757r\" (UniqueName: \"kubernetes.io/projected/27cb32a1-4a5d-4cad-bd43-805765170d5a-kube-api-access-m757r\") on node \"crc\" DevicePath \"\"" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.203891 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cb32a1-4a5d-4cad-bd43-805765170d5a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.214137 4971 scope.go:117] "RemoveContainer" containerID="6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.243404 
4971 scope.go:117] "RemoveContainer" containerID="8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2" Mar 20 10:07:05 crc kubenswrapper[4971]: E0320 10:07:05.243905 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2\": container with ID starting with 8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2 not found: ID does not exist" containerID="8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.243950 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2"} err="failed to get container status \"8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2\": rpc error: code = NotFound desc = could not find container \"8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2\": container with ID starting with 8f6657b44738edc795e610e9cfb6bd4415766254ebbe474a1c3543aae2cca5e2 not found: ID does not exist" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.243977 4971 scope.go:117] "RemoveContainer" containerID="4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d" Mar 20 10:07:05 crc kubenswrapper[4971]: E0320 10:07:05.244370 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d\": container with ID starting with 4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d not found: ID does not exist" containerID="4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.244407 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d"} err="failed to get container status \"4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d\": rpc error: code = NotFound desc = could not find container \"4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d\": container with ID starting with 4705e057b5545bfecced3f94ccc17f1977e8c90c215409e0a2c1beb2062db16d not found: ID does not exist" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.244433 4971 scope.go:117] "RemoveContainer" containerID="6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7" Mar 20 10:07:05 crc kubenswrapper[4971]: E0320 10:07:05.244770 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7\": container with ID starting with 6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7 not found: ID does not exist" containerID="6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.244798 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7"} err="failed to get container status \"6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7\": rpc error: code = NotFound desc = could not find container \"6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7\": container with ID starting with 6138b7298cca28bd5a8c8fba90bf9d79039a4a506f763dc64b7f042d59d4e9c7 not found: ID does not exist" Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.478457 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dhmv9"] Mar 20 10:07:05 crc kubenswrapper[4971]: I0320 10:07:05.494288 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-dhmv9"] Mar 20 10:07:06 crc kubenswrapper[4971]: I0320 10:07:06.744690 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" path="/var/lib/kubelet/pods/27cb32a1-4a5d-4cad-bd43-805765170d5a/volumes" Mar 20 10:07:17 crc kubenswrapper[4971]: I0320 10:07:17.733677 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:07:17 crc kubenswrapper[4971]: E0320 10:07:17.734706 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:07:31 crc kubenswrapper[4971]: I0320 10:07:31.732469 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:07:31 crc kubenswrapper[4971]: E0320 10:07:31.733329 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:07:44 crc kubenswrapper[4971]: I0320 10:07:44.733659 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:07:44 crc kubenswrapper[4971]: E0320 10:07:44.734526 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:07:56 crc kubenswrapper[4971]: I0320 10:07:56.735115 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:07:56 crc kubenswrapper[4971]: E0320 10:07:56.737317 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.143033 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566688-d9w58"] Mar 20 10:08:00 crc kubenswrapper[4971]: E0320 10:08:00.143832 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerName="registry-server" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.143854 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerName="registry-server" Mar 20 10:08:00 crc kubenswrapper[4971]: E0320 10:08:00.143878 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerName="extract-utilities" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.143889 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerName="extract-utilities" Mar 20 10:08:00 crc 
kubenswrapper[4971]: E0320 10:08:00.143924 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerName="extract-content" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.143930 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerName="extract-content" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.144178 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cb32a1-4a5d-4cad-bd43-805765170d5a" containerName="registry-server" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.144966 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566688-d9w58" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.147210 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.147464 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.147631 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.169150 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566688-d9w58"] Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.189659 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbj8n\" (UniqueName: \"kubernetes.io/projected/a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2-kube-api-access-zbj8n\") pod \"auto-csr-approver-29566688-d9w58\" (UID: \"a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2\") " pod="openshift-infra/auto-csr-approver-29566688-d9w58" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.291890 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbj8n\" (UniqueName: \"kubernetes.io/projected/a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2-kube-api-access-zbj8n\") pod \"auto-csr-approver-29566688-d9w58\" (UID: \"a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2\") " pod="openshift-infra/auto-csr-approver-29566688-d9w58" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.318217 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbj8n\" (UniqueName: \"kubernetes.io/projected/a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2-kube-api-access-zbj8n\") pod \"auto-csr-approver-29566688-d9w58\" (UID: \"a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2\") " pod="openshift-infra/auto-csr-approver-29566688-d9w58" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.477890 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566688-d9w58" Mar 20 10:08:00 crc kubenswrapper[4971]: I0320 10:08:00.956051 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566688-d9w58"] Mar 20 10:08:01 crc kubenswrapper[4971]: I0320 10:08:01.805471 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566688-d9w58" event={"ID":"a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2","Type":"ContainerStarted","Data":"bdae14cce4f7fa715b19a519a97bf8e8442b95d03ba27ddbbe7b5463f7ceffc9"} Mar 20 10:08:02 crc kubenswrapper[4971]: I0320 10:08:02.816136 4971 generic.go:334] "Generic (PLEG): container finished" podID="a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2" containerID="7d04f59c860fbe4e3f92becf5f99467ddd22c3e0f99f70bcb981dc6f71150c55" exitCode=0 Mar 20 10:08:02 crc kubenswrapper[4971]: I0320 10:08:02.816189 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566688-d9w58" 
event={"ID":"a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2","Type":"ContainerDied","Data":"7d04f59c860fbe4e3f92becf5f99467ddd22c3e0f99f70bcb981dc6f71150c55"} Mar 20 10:08:04 crc kubenswrapper[4971]: I0320 10:08:04.315883 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566688-d9w58" Mar 20 10:08:04 crc kubenswrapper[4971]: I0320 10:08:04.379792 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbj8n\" (UniqueName: \"kubernetes.io/projected/a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2-kube-api-access-zbj8n\") pod \"a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2\" (UID: \"a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2\") " Mar 20 10:08:04 crc kubenswrapper[4971]: I0320 10:08:04.403982 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2-kube-api-access-zbj8n" (OuterVolumeSpecName: "kube-api-access-zbj8n") pod "a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2" (UID: "a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2"). InnerVolumeSpecName "kube-api-access-zbj8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:08:04 crc kubenswrapper[4971]: I0320 10:08:04.483688 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbj8n\" (UniqueName: \"kubernetes.io/projected/a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2-kube-api-access-zbj8n\") on node \"crc\" DevicePath \"\"" Mar 20 10:08:04 crc kubenswrapper[4971]: I0320 10:08:04.842859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566688-d9w58" event={"ID":"a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2","Type":"ContainerDied","Data":"bdae14cce4f7fa715b19a519a97bf8e8442b95d03ba27ddbbe7b5463f7ceffc9"} Mar 20 10:08:04 crc kubenswrapper[4971]: I0320 10:08:04.843245 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdae14cce4f7fa715b19a519a97bf8e8442b95d03ba27ddbbe7b5463f7ceffc9" Mar 20 10:08:04 crc kubenswrapper[4971]: I0320 10:08:04.843320 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566688-d9w58" Mar 20 10:08:05 crc kubenswrapper[4971]: I0320 10:08:05.399077 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566682-2lbc4"] Mar 20 10:08:05 crc kubenswrapper[4971]: I0320 10:08:05.419097 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566682-2lbc4"] Mar 20 10:08:06 crc kubenswrapper[4971]: I0320 10:08:06.744058 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26ef584-e0d0-4448-82f2-143e5f331c3b" path="/var/lib/kubelet/pods/e26ef584-e0d0-4448-82f2-143e5f331c3b/volumes" Mar 20 10:08:11 crc kubenswrapper[4971]: I0320 10:08:11.732448 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:08:11 crc kubenswrapper[4971]: E0320 10:08:11.733284 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:08:17 crc kubenswrapper[4971]: I0320 10:08:17.443583 4971 scope.go:117] "RemoveContainer" containerID="53eea9252fa4b17192ddedd3b917af427ab1758527612a2028b5b605c0d46000" Mar 20 10:08:26 crc kubenswrapper[4971]: I0320 10:08:26.732842 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:08:26 crc kubenswrapper[4971]: E0320 10:08:26.733892 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:08:41 crc kubenswrapper[4971]: I0320 10:08:41.732447 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:08:41 crc kubenswrapper[4971]: E0320 10:08:41.733051 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:08:52 crc kubenswrapper[4971]: I0320 10:08:52.732708 4971 scope.go:117] "RemoveContainer" 
containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:08:52 crc kubenswrapper[4971]: E0320 10:08:52.733378 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:09:06 crc kubenswrapper[4971]: I0320 10:09:06.732101 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:09:06 crc kubenswrapper[4971]: E0320 10:09:06.733009 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:09:18 crc kubenswrapper[4971]: I0320 10:09:18.738866 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:09:18 crc kubenswrapper[4971]: E0320 10:09:18.739803 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:09:29 crc kubenswrapper[4971]: I0320 10:09:29.733128 4971 scope.go:117] 
"RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:09:29 crc kubenswrapper[4971]: E0320 10:09:29.734367 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:09:43 crc kubenswrapper[4971]: I0320 10:09:43.733164 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:09:43 crc kubenswrapper[4971]: E0320 10:09:43.733960 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:09:58 crc kubenswrapper[4971]: I0320 10:09:58.741796 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:09:58 crc kubenswrapper[4971]: E0320 10:09:58.742699 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.174406 
4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566690-p2jfv"] Mar 20 10:10:00 crc kubenswrapper[4971]: E0320 10:10:00.174833 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2" containerName="oc" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.174845 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2" containerName="oc" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.175054 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2" containerName="oc" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.175775 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566690-p2jfv" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.179003 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.179208 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.179210 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.191857 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566690-p2jfv"] Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.267809 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxgj\" (UniqueName: \"kubernetes.io/projected/97b8c056-36f7-4851-88bb-ce7d8496ef1f-kube-api-access-djxgj\") pod \"auto-csr-approver-29566690-p2jfv\" (UID: \"97b8c056-36f7-4851-88bb-ce7d8496ef1f\") " 
pod="openshift-infra/auto-csr-approver-29566690-p2jfv" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.369879 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxgj\" (UniqueName: \"kubernetes.io/projected/97b8c056-36f7-4851-88bb-ce7d8496ef1f-kube-api-access-djxgj\") pod \"auto-csr-approver-29566690-p2jfv\" (UID: \"97b8c056-36f7-4851-88bb-ce7d8496ef1f\") " pod="openshift-infra/auto-csr-approver-29566690-p2jfv" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.388418 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxgj\" (UniqueName: \"kubernetes.io/projected/97b8c056-36f7-4851-88bb-ce7d8496ef1f-kube-api-access-djxgj\") pod \"auto-csr-approver-29566690-p2jfv\" (UID: \"97b8c056-36f7-4851-88bb-ce7d8496ef1f\") " pod="openshift-infra/auto-csr-approver-29566690-p2jfv" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.494711 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566690-p2jfv" Mar 20 10:10:00 crc kubenswrapper[4971]: I0320 10:10:00.802314 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566690-p2jfv"] Mar 20 10:10:01 crc kubenswrapper[4971]: I0320 10:10:01.031690 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566690-p2jfv" event={"ID":"97b8c056-36f7-4851-88bb-ce7d8496ef1f","Type":"ContainerStarted","Data":"018bedad70042e87b4758cd5b463f8d6d94b7852b213c287a4d1fbbe6839af0d"} Mar 20 10:10:03 crc kubenswrapper[4971]: I0320 10:10:03.052206 4971 generic.go:334] "Generic (PLEG): container finished" podID="97b8c056-36f7-4851-88bb-ce7d8496ef1f" containerID="067c472db539144441caa852bc4aa81fadf9af091c2af95cd56f0848aa91ee84" exitCode=0 Mar 20 10:10:03 crc kubenswrapper[4971]: I0320 10:10:03.052252 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566690-p2jfv" event={"ID":"97b8c056-36f7-4851-88bb-ce7d8496ef1f","Type":"ContainerDied","Data":"067c472db539144441caa852bc4aa81fadf9af091c2af95cd56f0848aa91ee84"} Mar 20 10:10:04 crc kubenswrapper[4971]: I0320 10:10:04.558381 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566690-p2jfv" Mar 20 10:10:04 crc kubenswrapper[4971]: I0320 10:10:04.657400 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djxgj\" (UniqueName: \"kubernetes.io/projected/97b8c056-36f7-4851-88bb-ce7d8496ef1f-kube-api-access-djxgj\") pod \"97b8c056-36f7-4851-88bb-ce7d8496ef1f\" (UID: \"97b8c056-36f7-4851-88bb-ce7d8496ef1f\") " Mar 20 10:10:04 crc kubenswrapper[4971]: I0320 10:10:04.662632 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b8c056-36f7-4851-88bb-ce7d8496ef1f-kube-api-access-djxgj" (OuterVolumeSpecName: "kube-api-access-djxgj") pod "97b8c056-36f7-4851-88bb-ce7d8496ef1f" (UID: "97b8c056-36f7-4851-88bb-ce7d8496ef1f"). InnerVolumeSpecName "kube-api-access-djxgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:10:04 crc kubenswrapper[4971]: I0320 10:10:04.760974 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djxgj\" (UniqueName: \"kubernetes.io/projected/97b8c056-36f7-4851-88bb-ce7d8496ef1f-kube-api-access-djxgj\") on node \"crc\" DevicePath \"\"" Mar 20 10:10:05 crc kubenswrapper[4971]: I0320 10:10:05.079903 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566690-p2jfv" event={"ID":"97b8c056-36f7-4851-88bb-ce7d8496ef1f","Type":"ContainerDied","Data":"018bedad70042e87b4758cd5b463f8d6d94b7852b213c287a4d1fbbe6839af0d"} Mar 20 10:10:05 crc kubenswrapper[4971]: I0320 10:10:05.079952 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="018bedad70042e87b4758cd5b463f8d6d94b7852b213c287a4d1fbbe6839af0d" Mar 20 10:10:05 crc kubenswrapper[4971]: I0320 10:10:05.079983 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566690-p2jfv" Mar 20 10:10:05 crc kubenswrapper[4971]: I0320 10:10:05.645556 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566684-54npb"] Mar 20 10:10:05 crc kubenswrapper[4971]: I0320 10:10:05.655618 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566684-54npb"] Mar 20 10:10:06 crc kubenswrapper[4971]: I0320 10:10:06.743551 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d23c2eb-0052-4cc9-a837-954e06e21c94" path="/var/lib/kubelet/pods/4d23c2eb-0052-4cc9-a837-954e06e21c94/volumes" Mar 20 10:10:12 crc kubenswrapper[4971]: I0320 10:10:12.733402 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:10:12 crc kubenswrapper[4971]: E0320 10:10:12.734952 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.059190 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwn7w"] Mar 20 10:10:16 crc kubenswrapper[4971]: E0320 10:10:16.059960 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b8c056-36f7-4851-88bb-ce7d8496ef1f" containerName="oc" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.059972 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b8c056-36f7-4851-88bb-ce7d8496ef1f" containerName="oc" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.060216 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b8c056-36f7-4851-88bb-ce7d8496ef1f" containerName="oc" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.061957 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.086962 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwn7w"] Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.179789 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-utilities\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.179838 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngjtl\" (UniqueName: \"kubernetes.io/projected/1514a967-0c0a-41d9-a8b0-186cff8e4818-kube-api-access-ngjtl\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.180277 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-catalog-content\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.282971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-catalog-content\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.283101 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-utilities\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.283130 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngjtl\" (UniqueName: \"kubernetes.io/projected/1514a967-0c0a-41d9-a8b0-186cff8e4818-kube-api-access-ngjtl\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.283658 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-catalog-content\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.283843 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-utilities\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.319406 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngjtl\" (UniqueName: \"kubernetes.io/projected/1514a967-0c0a-41d9-a8b0-186cff8e4818-kube-api-access-ngjtl\") pod \"certified-operators-mwn7w\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.386905 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:16 crc kubenswrapper[4971]: I0320 10:10:16.934023 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwn7w"] Mar 20 10:10:17 crc kubenswrapper[4971]: I0320 10:10:17.205900 4971 generic.go:334] "Generic (PLEG): container finished" podID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerID="7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056" exitCode=0 Mar 20 10:10:17 crc kubenswrapper[4971]: I0320 10:10:17.206347 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwn7w" event={"ID":"1514a967-0c0a-41d9-a8b0-186cff8e4818","Type":"ContainerDied","Data":"7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056"} Mar 20 10:10:17 crc kubenswrapper[4971]: I0320 10:10:17.206984 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwn7w" event={"ID":"1514a967-0c0a-41d9-a8b0-186cff8e4818","Type":"ContainerStarted","Data":"028fe493d18dc73599d1c6311e689d3b9e9b9550575d11a3d79789c596dcffb2"} Mar 20 10:10:17 crc kubenswrapper[4971]: I0320 10:10:17.546911 4971 scope.go:117] "RemoveContainer" containerID="6e79619babfc0d3498f19527ee26e6e5338a787011df5f3155c2a781cc604791" Mar 20 10:10:18 crc kubenswrapper[4971]: I0320 10:10:18.222329 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwn7w" event={"ID":"1514a967-0c0a-41d9-a8b0-186cff8e4818","Type":"ContainerStarted","Data":"29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032"} Mar 20 10:10:19 crc kubenswrapper[4971]: I0320 10:10:19.239142 4971 generic.go:334] "Generic (PLEG): container finished" podID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerID="29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032" exitCode=0 Mar 20 10:10:19 crc kubenswrapper[4971]: I0320 10:10:19.239250 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwn7w" event={"ID":"1514a967-0c0a-41d9-a8b0-186cff8e4818","Type":"ContainerDied","Data":"29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032"} Mar 20 10:10:20 crc kubenswrapper[4971]: I0320 10:10:20.275033 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwn7w" event={"ID":"1514a967-0c0a-41d9-a8b0-186cff8e4818","Type":"ContainerStarted","Data":"6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108"} Mar 20 10:10:20 crc kubenswrapper[4971]: I0320 10:10:20.310093 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwn7w" podStartSLOduration=1.8614504379999999 podStartE2EDuration="4.310072007s" podCreationTimestamp="2026-03-20 10:10:16 +0000 UTC" firstStartedPulling="2026-03-20 10:10:17.207384745 +0000 UTC m=+12039.187258883" lastFinishedPulling="2026-03-20 10:10:19.656006314 +0000 UTC m=+12041.635880452" observedRunningTime="2026-03-20 10:10:20.309404799 +0000 UTC m=+12042.289278937" watchObservedRunningTime="2026-03-20 10:10:20.310072007 +0000 UTC m=+12042.289946145" Mar 20 10:10:26 crc kubenswrapper[4971]: I0320 10:10:26.387398 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:26 crc kubenswrapper[4971]: I0320 10:10:26.388153 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:26 crc kubenswrapper[4971]: I0320 10:10:26.450636 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:26 crc kubenswrapper[4971]: I0320 10:10:26.732445 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:10:26 crc kubenswrapper[4971]: 
E0320 10:10:26.732774 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:10:27 crc kubenswrapper[4971]: I0320 10:10:27.391182 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:27 crc kubenswrapper[4971]: I0320 10:10:27.464470 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwn7w"] Mar 20 10:10:29 crc kubenswrapper[4971]: I0320 10:10:29.378694 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwn7w" podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerName="registry-server" containerID="cri-o://6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108" gracePeriod=2 Mar 20 10:10:29 crc kubenswrapper[4971]: I0320 10:10:29.965629 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.057126 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngjtl\" (UniqueName: \"kubernetes.io/projected/1514a967-0c0a-41d9-a8b0-186cff8e4818-kube-api-access-ngjtl\") pod \"1514a967-0c0a-41d9-a8b0-186cff8e4818\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.057188 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-utilities\") pod \"1514a967-0c0a-41d9-a8b0-186cff8e4818\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.057227 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-catalog-content\") pod \"1514a967-0c0a-41d9-a8b0-186cff8e4818\" (UID: \"1514a967-0c0a-41d9-a8b0-186cff8e4818\") " Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.071330 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-utilities" (OuterVolumeSpecName: "utilities") pod "1514a967-0c0a-41d9-a8b0-186cff8e4818" (UID: "1514a967-0c0a-41d9-a8b0-186cff8e4818"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.087466 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1514a967-0c0a-41d9-a8b0-186cff8e4818-kube-api-access-ngjtl" (OuterVolumeSpecName: "kube-api-access-ngjtl") pod "1514a967-0c0a-41d9-a8b0-186cff8e4818" (UID: "1514a967-0c0a-41d9-a8b0-186cff8e4818"). InnerVolumeSpecName "kube-api-access-ngjtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.160883 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngjtl\" (UniqueName: \"kubernetes.io/projected/1514a967-0c0a-41d9-a8b0-186cff8e4818-kube-api-access-ngjtl\") on node \"crc\" DevicePath \"\"" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.160922 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.162011 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1514a967-0c0a-41d9-a8b0-186cff8e4818" (UID: "1514a967-0c0a-41d9-a8b0-186cff8e4818"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.262360 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1514a967-0c0a-41d9-a8b0-186cff8e4818-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.390002 4971 generic.go:334] "Generic (PLEG): container finished" podID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerID="6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108" exitCode=0 Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.390044 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwn7w" event={"ID":"1514a967-0c0a-41d9-a8b0-186cff8e4818","Type":"ContainerDied","Data":"6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108"} Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.390064 4971 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwn7w" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.390081 4971 scope.go:117] "RemoveContainer" containerID="6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.390071 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwn7w" event={"ID":"1514a967-0c0a-41d9-a8b0-186cff8e4818","Type":"ContainerDied","Data":"028fe493d18dc73599d1c6311e689d3b9e9b9550575d11a3d79789c596dcffb2"} Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.419621 4971 scope.go:117] "RemoveContainer" containerID="29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.442572 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwn7w"] Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.454326 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwn7w"] Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.460898 4971 scope.go:117] "RemoveContainer" containerID="7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.497845 4971 scope.go:117] "RemoveContainer" containerID="6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108" Mar 20 10:10:30 crc kubenswrapper[4971]: E0320 10:10:30.498388 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108\": container with ID starting with 6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108 not found: ID does not exist" containerID="6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.498430 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108"} err="failed to get container status \"6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108\": rpc error: code = NotFound desc = could not find container \"6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108\": container with ID starting with 6c41547e719b5233b77f0bfa955231536cc60aba74e2e03a456f2514a7c0f108 not found: ID does not exist" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.498460 4971 scope.go:117] "RemoveContainer" containerID="29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032" Mar 20 10:10:30 crc kubenswrapper[4971]: E0320 10:10:30.498817 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032\": container with ID starting with 29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032 not found: ID does not exist" containerID="29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.498846 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032"} err="failed to get container status \"29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032\": rpc error: code = NotFound desc = could not find container \"29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032\": container with ID starting with 29cf321152ec104efae5eb68582fefc6d2ec7c0e2dee5315902e683cced4e032 not found: ID does not exist" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.498864 4971 scope.go:117] "RemoveContainer" containerID="7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056" Mar 20 10:10:30 crc kubenswrapper[4971]: E0320 
10:10:30.499122 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056\": container with ID starting with 7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056 not found: ID does not exist" containerID="7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.499155 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056"} err="failed to get container status \"7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056\": rpc error: code = NotFound desc = could not find container \"7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056\": container with ID starting with 7acf8db07a9f9260f6c6b074de869d09102faa6382b60fec0542dc50ad523056 not found: ID does not exist" Mar 20 10:10:30 crc kubenswrapper[4971]: I0320 10:10:30.742098 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" path="/var/lib/kubelet/pods/1514a967-0c0a-41d9-a8b0-186cff8e4818/volumes" Mar 20 10:10:37 crc kubenswrapper[4971]: I0320 10:10:37.736588 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:10:37 crc kubenswrapper[4971]: E0320 10:10:37.737326 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:10:50 crc kubenswrapper[4971]: I0320 10:10:50.732653 
4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:10:50 crc kubenswrapper[4971]: E0320 10:10:50.733546 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:11:04 crc kubenswrapper[4971]: I0320 10:11:04.732328 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:11:04 crc kubenswrapper[4971]: E0320 10:11:04.733469 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:11:14 crc kubenswrapper[4971]: I0320 10:11:14.942542 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s86rw"] Mar 20 10:11:14 crc kubenswrapper[4971]: E0320 10:11:14.943536 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerName="registry-server" Mar 20 10:11:14 crc kubenswrapper[4971]: I0320 10:11:14.943550 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerName="registry-server" Mar 20 10:11:14 crc kubenswrapper[4971]: E0320 10:11:14.943572 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerName="extract-content" Mar 20 10:11:14 crc kubenswrapper[4971]: I0320 10:11:14.943579 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerName="extract-content" Mar 20 10:11:14 crc kubenswrapper[4971]: E0320 10:11:14.943589 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerName="extract-utilities" Mar 20 10:11:14 crc kubenswrapper[4971]: I0320 10:11:14.943597 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerName="extract-utilities" Mar 20 10:11:14 crc kubenswrapper[4971]: I0320 10:11:14.943854 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1514a967-0c0a-41d9-a8b0-186cff8e4818" containerName="registry-server" Mar 20 10:11:14 crc kubenswrapper[4971]: I0320 10:11:14.945384 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:14 crc kubenswrapper[4971]: I0320 10:11:14.957471 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s86rw"] Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.081849 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-584rd\" (UniqueName: \"kubernetes.io/projected/4615e143-1cf7-49e6-be71-3f508735cbcd-kube-api-access-584rd\") pod \"redhat-operators-s86rw\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.081933 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-catalog-content\") pod \"redhat-operators-s86rw\" (UID: 
\"4615e143-1cf7-49e6-be71-3f508735cbcd\") " pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.082466 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-utilities\") pod \"redhat-operators-s86rw\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.184150 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-catalog-content\") pod \"redhat-operators-s86rw\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.184288 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-utilities\") pod \"redhat-operators-s86rw\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.184352 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-584rd\" (UniqueName: \"kubernetes.io/projected/4615e143-1cf7-49e6-be71-3f508735cbcd-kube-api-access-584rd\") pod \"redhat-operators-s86rw\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.185178 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-utilities\") pod \"redhat-operators-s86rw\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " 
pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.185535 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-catalog-content\") pod \"redhat-operators-s86rw\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.221813 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-584rd\" (UniqueName: \"kubernetes.io/projected/4615e143-1cf7-49e6-be71-3f508735cbcd-kube-api-access-584rd\") pod \"redhat-operators-s86rw\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.274809 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.775238 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s86rw"] Mar 20 10:11:15 crc kubenswrapper[4971]: I0320 10:11:15.870310 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s86rw" event={"ID":"4615e143-1cf7-49e6-be71-3f508735cbcd","Type":"ContainerStarted","Data":"347aecd8b57182f9072534277b741f4109308c582ae2cda60b851b6c02df7c37"} Mar 20 10:11:16 crc kubenswrapper[4971]: I0320 10:11:16.732970 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:11:16 crc kubenswrapper[4971]: E0320 10:11:16.733667 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:11:16 crc kubenswrapper[4971]: I0320 10:11:16.882286 4971 generic.go:334] "Generic (PLEG): container finished" podID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerID="cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e" exitCode=0 Mar 20 10:11:16 crc kubenswrapper[4971]: I0320 10:11:16.882353 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s86rw" event={"ID":"4615e143-1cf7-49e6-be71-3f508735cbcd","Type":"ContainerDied","Data":"cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e"} Mar 20 10:11:16 crc kubenswrapper[4971]: I0320 10:11:16.884979 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:11:17 crc kubenswrapper[4971]: I0320 10:11:17.893206 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s86rw" event={"ID":"4615e143-1cf7-49e6-be71-3f508735cbcd","Type":"ContainerStarted","Data":"6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327"} Mar 20 10:11:21 crc kubenswrapper[4971]: I0320 10:11:21.932837 4971 generic.go:334] "Generic (PLEG): container finished" podID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerID="6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327" exitCode=0 Mar 20 10:11:21 crc kubenswrapper[4971]: I0320 10:11:21.933037 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s86rw" event={"ID":"4615e143-1cf7-49e6-be71-3f508735cbcd","Type":"ContainerDied","Data":"6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327"} Mar 20 10:11:22 crc kubenswrapper[4971]: I0320 10:11:22.944212 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-s86rw" event={"ID":"4615e143-1cf7-49e6-be71-3f508735cbcd","Type":"ContainerStarted","Data":"08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e"} Mar 20 10:11:22 crc kubenswrapper[4971]: I0320 10:11:22.969934 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s86rw" podStartSLOduration=3.472007717 podStartE2EDuration="8.969913062s" podCreationTimestamp="2026-03-20 10:11:14 +0000 UTC" firstStartedPulling="2026-03-20 10:11:16.884748289 +0000 UTC m=+12098.864622427" lastFinishedPulling="2026-03-20 10:11:22.382653634 +0000 UTC m=+12104.362527772" observedRunningTime="2026-03-20 10:11:22.961547175 +0000 UTC m=+12104.941421313" watchObservedRunningTime="2026-03-20 10:11:22.969913062 +0000 UTC m=+12104.949787200" Mar 20 10:11:25 crc kubenswrapper[4971]: I0320 10:11:25.275127 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:25 crc kubenswrapper[4971]: I0320 10:11:25.275546 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:26 crc kubenswrapper[4971]: I0320 10:11:26.322216 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s86rw" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="registry-server" probeResult="failure" output=< Mar 20 10:11:26 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 10:11:26 crc kubenswrapper[4971]: > Mar 20 10:11:28 crc kubenswrapper[4971]: I0320 10:11:28.742163 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:11:28 crc kubenswrapper[4971]: E0320 10:11:28.742829 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:11:35 crc kubenswrapper[4971]: I0320 10:11:35.320438 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:35 crc kubenswrapper[4971]: I0320 10:11:35.365915 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:35 crc kubenswrapper[4971]: I0320 10:11:35.555800 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s86rw"] Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.094739 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s86rw" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="registry-server" containerID="cri-o://08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e" gracePeriod=2 Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.624075 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.708547 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-catalog-content\") pod \"4615e143-1cf7-49e6-be71-3f508735cbcd\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.708785 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-utilities\") pod \"4615e143-1cf7-49e6-be71-3f508735cbcd\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.708873 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-584rd\" (UniqueName: \"kubernetes.io/projected/4615e143-1cf7-49e6-be71-3f508735cbcd-kube-api-access-584rd\") pod \"4615e143-1cf7-49e6-be71-3f508735cbcd\" (UID: \"4615e143-1cf7-49e6-be71-3f508735cbcd\") " Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.709490 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-utilities" (OuterVolumeSpecName: "utilities") pod "4615e143-1cf7-49e6-be71-3f508735cbcd" (UID: "4615e143-1cf7-49e6-be71-3f508735cbcd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.722204 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4615e143-1cf7-49e6-be71-3f508735cbcd-kube-api-access-584rd" (OuterVolumeSpecName: "kube-api-access-584rd") pod "4615e143-1cf7-49e6-be71-3f508735cbcd" (UID: "4615e143-1cf7-49e6-be71-3f508735cbcd"). InnerVolumeSpecName "kube-api-access-584rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.811285 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.811320 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-584rd\" (UniqueName: \"kubernetes.io/projected/4615e143-1cf7-49e6-be71-3f508735cbcd-kube-api-access-584rd\") on node \"crc\" DevicePath \"\"" Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.840345 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4615e143-1cf7-49e6-be71-3f508735cbcd" (UID: "4615e143-1cf7-49e6-be71-3f508735cbcd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:11:37 crc kubenswrapper[4971]: I0320 10:11:37.913084 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4615e143-1cf7-49e6-be71-3f508735cbcd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.109103 4971 generic.go:334] "Generic (PLEG): container finished" podID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerID="08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e" exitCode=0 Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.109150 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s86rw" event={"ID":"4615e143-1cf7-49e6-be71-3f508735cbcd","Type":"ContainerDied","Data":"08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e"} Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.109214 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-s86rw" event={"ID":"4615e143-1cf7-49e6-be71-3f508735cbcd","Type":"ContainerDied","Data":"347aecd8b57182f9072534277b741f4109308c582ae2cda60b851b6c02df7c37"} Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.109238 4971 scope.go:117] "RemoveContainer" containerID="08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.109174 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s86rw" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.143629 4971 scope.go:117] "RemoveContainer" containerID="6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.147707 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s86rw"] Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.159534 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s86rw"] Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.178804 4971 scope.go:117] "RemoveContainer" containerID="cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.225169 4971 scope.go:117] "RemoveContainer" containerID="08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e" Mar 20 10:11:38 crc kubenswrapper[4971]: E0320 10:11:38.225653 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e\": container with ID starting with 08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e not found: ID does not exist" containerID="08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.225688 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e"} err="failed to get container status \"08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e\": rpc error: code = NotFound desc = could not find container \"08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e\": container with ID starting with 08bcddf6a619e4be1a1a916c045feb1df4906e110378b4b5b98c9808944a4c5e not found: ID does not exist" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.225715 4971 scope.go:117] "RemoveContainer" containerID="6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327" Mar 20 10:11:38 crc kubenswrapper[4971]: E0320 10:11:38.226026 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327\": container with ID starting with 6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327 not found: ID does not exist" containerID="6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.226078 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327"} err="failed to get container status \"6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327\": rpc error: code = NotFound desc = could not find container \"6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327\": container with ID starting with 6ae93d20ce5dd87797175adfd8ddc40302ab7500544c1e7c640e7f99e14de327 not found: ID does not exist" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.226113 4971 scope.go:117] "RemoveContainer" containerID="cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e" Mar 20 10:11:38 crc kubenswrapper[4971]: E0320 
10:11:38.226641 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e\": container with ID starting with cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e not found: ID does not exist" containerID="cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.226670 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e"} err="failed to get container status \"cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e\": rpc error: code = NotFound desc = could not find container \"cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e\": container with ID starting with cb1f6407921b852319e71b0e996e773155d9ff96b65688cfc0f640a0fbb4de8e not found: ID does not exist" Mar 20 10:11:38 crc kubenswrapper[4971]: I0320 10:11:38.750779 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" path="/var/lib/kubelet/pods/4615e143-1cf7-49e6-be71-3f508735cbcd/volumes" Mar 20 10:11:43 crc kubenswrapper[4971]: I0320 10:11:43.732519 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:11:43 crc kubenswrapper[4971]: E0320 10:11:43.733370 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:11:44 crc kubenswrapper[4971]: I0320 10:11:44.908523 
4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b42j9"] Mar 20 10:11:44 crc kubenswrapper[4971]: E0320 10:11:44.909349 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="extract-content" Mar 20 10:11:44 crc kubenswrapper[4971]: I0320 10:11:44.909364 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="extract-content" Mar 20 10:11:44 crc kubenswrapper[4971]: E0320 10:11:44.909377 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="registry-server" Mar 20 10:11:44 crc kubenswrapper[4971]: I0320 10:11:44.909383 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="registry-server" Mar 20 10:11:44 crc kubenswrapper[4971]: E0320 10:11:44.909423 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="extract-utilities" Mar 20 10:11:44 crc kubenswrapper[4971]: I0320 10:11:44.909430 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="extract-utilities" Mar 20 10:11:44 crc kubenswrapper[4971]: I0320 10:11:44.909667 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4615e143-1cf7-49e6-be71-3f508735cbcd" containerName="registry-server" Mar 20 10:11:44 crc kubenswrapper[4971]: I0320 10:11:44.911359 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:44 crc kubenswrapper[4971]: I0320 10:11:44.922183 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b42j9"] Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.065478 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-catalog-content\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.065575 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs84r\" (UniqueName: \"kubernetes.io/projected/75d71852-e6d6-40c4-82a2-062794a50067-kube-api-access-qs84r\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.065765 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-utilities\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.167264 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-utilities\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.167351 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-catalog-content\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.167400 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs84r\" (UniqueName: \"kubernetes.io/projected/75d71852-e6d6-40c4-82a2-062794a50067-kube-api-access-qs84r\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.167744 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-utilities\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.167865 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-catalog-content\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.186581 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs84r\" (UniqueName: \"kubernetes.io/projected/75d71852-e6d6-40c4-82a2-062794a50067-kube-api-access-qs84r\") pod \"redhat-marketplace-b42j9\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.239807 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:45 crc kubenswrapper[4971]: I0320 10:11:45.695061 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b42j9"] Mar 20 10:11:46 crc kubenswrapper[4971]: I0320 10:11:46.203267 4971 generic.go:334] "Generic (PLEG): container finished" podID="75d71852-e6d6-40c4-82a2-062794a50067" containerID="59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6" exitCode=0 Mar 20 10:11:46 crc kubenswrapper[4971]: I0320 10:11:46.203377 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b42j9" event={"ID":"75d71852-e6d6-40c4-82a2-062794a50067","Type":"ContainerDied","Data":"59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6"} Mar 20 10:11:46 crc kubenswrapper[4971]: I0320 10:11:46.203558 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b42j9" event={"ID":"75d71852-e6d6-40c4-82a2-062794a50067","Type":"ContainerStarted","Data":"4dd45e490568138b89bd30facce3cbb549f3873e4e08302be21bce4dd3d6097f"} Mar 20 10:11:47 crc kubenswrapper[4971]: I0320 10:11:47.215817 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b42j9" event={"ID":"75d71852-e6d6-40c4-82a2-062794a50067","Type":"ContainerStarted","Data":"73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d"} Mar 20 10:11:48 crc kubenswrapper[4971]: I0320 10:11:48.226331 4971 generic.go:334] "Generic (PLEG): container finished" podID="75d71852-e6d6-40c4-82a2-062794a50067" containerID="73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d" exitCode=0 Mar 20 10:11:48 crc kubenswrapper[4971]: I0320 10:11:48.226445 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b42j9" 
event={"ID":"75d71852-e6d6-40c4-82a2-062794a50067","Type":"ContainerDied","Data":"73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d"} Mar 20 10:11:49 crc kubenswrapper[4971]: I0320 10:11:49.239528 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b42j9" event={"ID":"75d71852-e6d6-40c4-82a2-062794a50067","Type":"ContainerStarted","Data":"adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d"} Mar 20 10:11:49 crc kubenswrapper[4971]: I0320 10:11:49.261927 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b42j9" podStartSLOduration=2.807605252 podStartE2EDuration="5.261911786s" podCreationTimestamp="2026-03-20 10:11:44 +0000 UTC" firstStartedPulling="2026-03-20 10:11:46.206024754 +0000 UTC m=+12128.185898882" lastFinishedPulling="2026-03-20 10:11:48.660331268 +0000 UTC m=+12130.640205416" observedRunningTime="2026-03-20 10:11:49.256858139 +0000 UTC m=+12131.236732277" watchObservedRunningTime="2026-03-20 10:11:49.261911786 +0000 UTC m=+12131.241785924" Mar 20 10:11:55 crc kubenswrapper[4971]: I0320 10:11:55.240078 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:55 crc kubenswrapper[4971]: I0320 10:11:55.240729 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:55 crc kubenswrapper[4971]: I0320 10:11:55.282498 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:55 crc kubenswrapper[4971]: I0320 10:11:55.352961 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:55 crc kubenswrapper[4971]: I0320 10:11:55.519439 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-b42j9"] Mar 20 10:11:55 crc kubenswrapper[4971]: I0320 10:11:55.732023 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:11:56 crc kubenswrapper[4971]: I0320 10:11:56.308458 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"f75848c8286f55cc3d920b08f7a89dc5054b5048e3e2418181291bb7c5c3716f"} Mar 20 10:11:57 crc kubenswrapper[4971]: I0320 10:11:57.317859 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b42j9" podUID="75d71852-e6d6-40c4-82a2-062794a50067" containerName="registry-server" containerID="cri-o://adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d" gracePeriod=2 Mar 20 10:11:57 crc kubenswrapper[4971]: I0320 10:11:57.825691 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:57 crc kubenswrapper[4971]: I0320 10:11:57.920335 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs84r\" (UniqueName: \"kubernetes.io/projected/75d71852-e6d6-40c4-82a2-062794a50067-kube-api-access-qs84r\") pod \"75d71852-e6d6-40c4-82a2-062794a50067\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " Mar 20 10:11:57 crc kubenswrapper[4971]: I0320 10:11:57.920407 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-catalog-content\") pod \"75d71852-e6d6-40c4-82a2-062794a50067\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " Mar 20 10:11:57 crc kubenswrapper[4971]: I0320 10:11:57.920458 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-utilities\") pod \"75d71852-e6d6-40c4-82a2-062794a50067\" (UID: \"75d71852-e6d6-40c4-82a2-062794a50067\") " Mar 20 10:11:57 crc kubenswrapper[4971]: I0320 10:11:57.921961 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-utilities" (OuterVolumeSpecName: "utilities") pod "75d71852-e6d6-40c4-82a2-062794a50067" (UID: "75d71852-e6d6-40c4-82a2-062794a50067"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:11:57 crc kubenswrapper[4971]: I0320 10:11:57.923069 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:11:57 crc kubenswrapper[4971]: I0320 10:11:57.934068 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d71852-e6d6-40c4-82a2-062794a50067-kube-api-access-qs84r" (OuterVolumeSpecName: "kube-api-access-qs84r") pod "75d71852-e6d6-40c4-82a2-062794a50067" (UID: "75d71852-e6d6-40c4-82a2-062794a50067"). InnerVolumeSpecName "kube-api-access-qs84r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.027491 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs84r\" (UniqueName: \"kubernetes.io/projected/75d71852-e6d6-40c4-82a2-062794a50067-kube-api-access-qs84r\") on node \"crc\" DevicePath \"\"" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.027485 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75d71852-e6d6-40c4-82a2-062794a50067" (UID: "75d71852-e6d6-40c4-82a2-062794a50067"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.128789 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d71852-e6d6-40c4-82a2-062794a50067-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.329032 4971 generic.go:334] "Generic (PLEG): container finished" podID="75d71852-e6d6-40c4-82a2-062794a50067" containerID="adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d" exitCode=0 Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.329102 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b42j9" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.329150 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b42j9" event={"ID":"75d71852-e6d6-40c4-82a2-062794a50067","Type":"ContainerDied","Data":"adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d"} Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.329562 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b42j9" event={"ID":"75d71852-e6d6-40c4-82a2-062794a50067","Type":"ContainerDied","Data":"4dd45e490568138b89bd30facce3cbb549f3873e4e08302be21bce4dd3d6097f"} Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.329675 4971 scope.go:117] "RemoveContainer" containerID="adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.363793 4971 scope.go:117] "RemoveContainer" containerID="73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.372810 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b42j9"] Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 
10:11:58.400548 4971 scope.go:117] "RemoveContainer" containerID="59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.402597 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b42j9"] Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.426336 4971 scope.go:117] "RemoveContainer" containerID="adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d" Mar 20 10:11:58 crc kubenswrapper[4971]: E0320 10:11:58.426932 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d\": container with ID starting with adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d not found: ID does not exist" containerID="adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.426972 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d"} err="failed to get container status \"adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d\": rpc error: code = NotFound desc = could not find container \"adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d\": container with ID starting with adb36ef98bde9d2fe62b4cac1db762581555b2c818da569bd67af763b0345e1d not found: ID does not exist" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.427001 4971 scope.go:117] "RemoveContainer" containerID="73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d" Mar 20 10:11:58 crc kubenswrapper[4971]: E0320 10:11:58.427396 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d\": container with ID 
starting with 73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d not found: ID does not exist" containerID="73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.427425 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d"} err="failed to get container status \"73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d\": rpc error: code = NotFound desc = could not find container \"73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d\": container with ID starting with 73345c109d6cf8c6fa762737de4d746e14d166630bcdf9df5772966be8dce91d not found: ID does not exist" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.427443 4971 scope.go:117] "RemoveContainer" containerID="59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6" Mar 20 10:11:58 crc kubenswrapper[4971]: E0320 10:11:58.427986 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6\": container with ID starting with 59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6 not found: ID does not exist" containerID="59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.428016 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6"} err="failed to get container status \"59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6\": rpc error: code = NotFound desc = could not find container \"59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6\": container with ID starting with 59fcdae4755c37455d78f7392e8971e8ab502b2f242b14f62760015f13a88fa6 not found: 
ID does not exist" Mar 20 10:11:58 crc kubenswrapper[4971]: I0320 10:11:58.744585 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d71852-e6d6-40c4-82a2-062794a50067" path="/var/lib/kubelet/pods/75d71852-e6d6-40c4-82a2-062794a50067/volumes" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.154316 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566692-blv8c"] Mar 20 10:12:00 crc kubenswrapper[4971]: E0320 10:12:00.155230 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d71852-e6d6-40c4-82a2-062794a50067" containerName="registry-server" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.155246 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d71852-e6d6-40c4-82a2-062794a50067" containerName="registry-server" Mar 20 10:12:00 crc kubenswrapper[4971]: E0320 10:12:00.155274 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d71852-e6d6-40c4-82a2-062794a50067" containerName="extract-utilities" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.155283 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d71852-e6d6-40c4-82a2-062794a50067" containerName="extract-utilities" Mar 20 10:12:00 crc kubenswrapper[4971]: E0320 10:12:00.155317 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d71852-e6d6-40c4-82a2-062794a50067" containerName="extract-content" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.155325 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d71852-e6d6-40c4-82a2-062794a50067" containerName="extract-content" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.155547 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d71852-e6d6-40c4-82a2-062794a50067" containerName="registry-server" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.156468 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566692-blv8c" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.160313 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.161166 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.166443 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.173165 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566692-blv8c"] Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.264782 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m28gd\" (UniqueName: \"kubernetes.io/projected/4855e97d-1722-4b16-9cbe-205e12794105-kube-api-access-m28gd\") pod \"auto-csr-approver-29566692-blv8c\" (UID: \"4855e97d-1722-4b16-9cbe-205e12794105\") " pod="openshift-infra/auto-csr-approver-29566692-blv8c" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.366651 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m28gd\" (UniqueName: \"kubernetes.io/projected/4855e97d-1722-4b16-9cbe-205e12794105-kube-api-access-m28gd\") pod \"auto-csr-approver-29566692-blv8c\" (UID: \"4855e97d-1722-4b16-9cbe-205e12794105\") " pod="openshift-infra/auto-csr-approver-29566692-blv8c" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.399773 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m28gd\" (UniqueName: \"kubernetes.io/projected/4855e97d-1722-4b16-9cbe-205e12794105-kube-api-access-m28gd\") pod \"auto-csr-approver-29566692-blv8c\" (UID: \"4855e97d-1722-4b16-9cbe-205e12794105\") " 
pod="openshift-infra/auto-csr-approver-29566692-blv8c" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.484953 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566692-blv8c" Mar 20 10:12:00 crc kubenswrapper[4971]: I0320 10:12:00.934264 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566692-blv8c"] Mar 20 10:12:00 crc kubenswrapper[4971]: W0320 10:12:00.937132 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4855e97d_1722_4b16_9cbe_205e12794105.slice/crio-950ad42eec06f64ea92b3d10b054c77fbfd46ea7b849a43f63b09905094e634c WatchSource:0}: Error finding container 950ad42eec06f64ea92b3d10b054c77fbfd46ea7b849a43f63b09905094e634c: Status 404 returned error can't find the container with id 950ad42eec06f64ea92b3d10b054c77fbfd46ea7b849a43f63b09905094e634c Mar 20 10:12:01 crc kubenswrapper[4971]: I0320 10:12:01.366160 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566692-blv8c" event={"ID":"4855e97d-1722-4b16-9cbe-205e12794105","Type":"ContainerStarted","Data":"950ad42eec06f64ea92b3d10b054c77fbfd46ea7b849a43f63b09905094e634c"} Mar 20 10:12:02 crc kubenswrapper[4971]: I0320 10:12:02.375571 4971 generic.go:334] "Generic (PLEG): container finished" podID="4855e97d-1722-4b16-9cbe-205e12794105" containerID="4ec08c24006c1eef334602e576bf79eda93230827073da4637d9c80cf47fe1e1" exitCode=0 Mar 20 10:12:02 crc kubenswrapper[4971]: I0320 10:12:02.375777 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566692-blv8c" event={"ID":"4855e97d-1722-4b16-9cbe-205e12794105","Type":"ContainerDied","Data":"4ec08c24006c1eef334602e576bf79eda93230827073da4637d9c80cf47fe1e1"} Mar 20 10:12:03 crc kubenswrapper[4971]: I0320 10:12:03.755048 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566692-blv8c" Mar 20 10:12:03 crc kubenswrapper[4971]: I0320 10:12:03.855917 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m28gd\" (UniqueName: \"kubernetes.io/projected/4855e97d-1722-4b16-9cbe-205e12794105-kube-api-access-m28gd\") pod \"4855e97d-1722-4b16-9cbe-205e12794105\" (UID: \"4855e97d-1722-4b16-9cbe-205e12794105\") " Mar 20 10:12:03 crc kubenswrapper[4971]: I0320 10:12:03.871553 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4855e97d-1722-4b16-9cbe-205e12794105-kube-api-access-m28gd" (OuterVolumeSpecName: "kube-api-access-m28gd") pod "4855e97d-1722-4b16-9cbe-205e12794105" (UID: "4855e97d-1722-4b16-9cbe-205e12794105"). InnerVolumeSpecName "kube-api-access-m28gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:12:03 crc kubenswrapper[4971]: I0320 10:12:03.958871 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m28gd\" (UniqueName: \"kubernetes.io/projected/4855e97d-1722-4b16-9cbe-205e12794105-kube-api-access-m28gd\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:04 crc kubenswrapper[4971]: I0320 10:12:04.402402 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566692-blv8c" event={"ID":"4855e97d-1722-4b16-9cbe-205e12794105","Type":"ContainerDied","Data":"950ad42eec06f64ea92b3d10b054c77fbfd46ea7b849a43f63b09905094e634c"} Mar 20 10:12:04 crc kubenswrapper[4971]: I0320 10:12:04.402448 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566692-blv8c" Mar 20 10:12:04 crc kubenswrapper[4971]: I0320 10:12:04.402457 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950ad42eec06f64ea92b3d10b054c77fbfd46ea7b849a43f63b09905094e634c" Mar 20 10:12:04 crc kubenswrapper[4971]: I0320 10:12:04.837635 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566686-94sgt"] Mar 20 10:12:04 crc kubenswrapper[4971]: I0320 10:12:04.846902 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566686-94sgt"] Mar 20 10:12:06 crc kubenswrapper[4971]: I0320 10:12:06.762511 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec35264e-6c4f-457f-afa0-3e785a7143a9" path="/var/lib/kubelet/pods/ec35264e-6c4f-457f-afa0-3e785a7143a9/volumes" Mar 20 10:12:17 crc kubenswrapper[4971]: I0320 10:12:17.657837 4971 scope.go:117] "RemoveContainer" containerID="cea13be598b12ee4768259f2b424cc5b3d06edfe842b7fe4044cedb746b300ea" Mar 20 10:12:34 crc kubenswrapper[4971]: I0320 10:12:34.699670 4971 generic.go:334] "Generic (PLEG): container finished" podID="14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" containerID="d9aafa23a5084cab3c20da0ea2ed997b50274d56a7e701ac054c46df968a260d" exitCode=0 Mar 20 10:12:34 crc kubenswrapper[4971]: I0320 10:12:34.699837 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c","Type":"ContainerDied","Data":"d9aafa23a5084cab3c20da0ea2ed997b50274d56a7e701ac054c46df968a260d"} Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.162085 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.316552 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ssh-key\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.316772 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-workdir\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.316815 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-temporary\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.316851 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config-secret\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.316932 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkqnp\" (UniqueName: \"kubernetes.io/projected/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-kube-api-access-bkqnp\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.316973 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.317028 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-config-data\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.317061 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ca-certs\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.317158 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\" (UID: \"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c\") " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.317923 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.319669 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-config-data" (OuterVolumeSpecName: "config-data") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.323331 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-kube-api-access-bkqnp" (OuterVolumeSpecName: "kube-api-access-bkqnp") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "kube-api-access-bkqnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.332933 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.334138 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.350772 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.367272 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.372290 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.417799 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" (UID: "14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.419942 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkqnp\" (UniqueName: \"kubernetes.io/projected/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-kube-api-access-bkqnp\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.420104 4971 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.420207 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.420288 4971 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.420400 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.420483 4971 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.420562 4971 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.421039 4971 
reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.421158 4971 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.446984 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.523412 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.726401 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c","Type":"ContainerDied","Data":"2983a733871bb4943308d258e91a4b529f9a98eac45b1d8e1dbdbca699f9a60a"} Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.726683 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2983a733871bb4943308d258e91a4b529f9a98eac45b1d8e1dbdbca699f9a60a" Mar 20 10:12:36 crc kubenswrapper[4971]: I0320 10:12:36.726529 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.201414 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 10:12:44 crc kubenswrapper[4971]: E0320 10:12:44.202690 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4855e97d-1722-4b16-9cbe-205e12794105" containerName="oc" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.202712 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4855e97d-1722-4b16-9cbe-205e12794105" containerName="oc" Mar 20 10:12:44 crc kubenswrapper[4971]: E0320 10:12:44.202747 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" containerName="tempest-tests-tempest-tests-runner" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.202760 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" containerName="tempest-tests-tempest-tests-runner" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.203172 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4855e97d-1722-4b16-9cbe-205e12794105" containerName="oc" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.203228 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c" containerName="tempest-tests-tempest-tests-runner" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.204447 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.208764 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cjphw" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.215269 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.276316 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7d649566-4edc-4e27-91cd-b98e8c74ceaf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.276412 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkdqw\" (UniqueName: \"kubernetes.io/projected/7d649566-4edc-4e27-91cd-b98e8c74ceaf-kube-api-access-nkdqw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7d649566-4edc-4e27-91cd-b98e8c74ceaf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.377988 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7d649566-4edc-4e27-91cd-b98e8c74ceaf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.378086 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkdqw\" (UniqueName: 
\"kubernetes.io/projected/7d649566-4edc-4e27-91cd-b98e8c74ceaf-kube-api-access-nkdqw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7d649566-4edc-4e27-91cd-b98e8c74ceaf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.378545 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7d649566-4edc-4e27-91cd-b98e8c74ceaf\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.414408 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkdqw\" (UniqueName: \"kubernetes.io/projected/7d649566-4edc-4e27-91cd-b98e8c74ceaf-kube-api-access-nkdqw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7d649566-4edc-4e27-91cd-b98e8c74ceaf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.416745 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7d649566-4edc-4e27-91cd-b98e8c74ceaf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.533101 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 10:12:44 crc kubenswrapper[4971]: I0320 10:12:44.959181 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 10:12:45 crc kubenswrapper[4971]: I0320 10:12:45.829067 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7d649566-4edc-4e27-91cd-b98e8c74ceaf","Type":"ContainerStarted","Data":"4068b17fcf8459dcfd6cbbaa7374a10ac96406f21012a3f11d17260dcb1f0583"} Mar 20 10:12:46 crc kubenswrapper[4971]: I0320 10:12:46.845667 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7d649566-4edc-4e27-91cd-b98e8c74ceaf","Type":"ContainerStarted","Data":"5c94b4e5a282177c17bfea8bf12ede6ace6334b860c925b759ebc5fe06a87e66"} Mar 20 10:12:46 crc kubenswrapper[4971]: I0320 10:12:46.883437 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.146063437 podStartE2EDuration="2.883405609s" podCreationTimestamp="2026-03-20 10:12:44 +0000 UTC" firstStartedPulling="2026-03-20 10:12:44.965787062 +0000 UTC m=+12186.945661190" lastFinishedPulling="2026-03-20 10:12:45.703129224 +0000 UTC m=+12187.683003362" observedRunningTime="2026-03-20 10:12:46.865636827 +0000 UTC m=+12188.845510985" watchObservedRunningTime="2026-03-20 10:12:46.883405609 +0000 UTC m=+12188.863279787" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.151065 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566694-f94kc"] Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.152887 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566694-f94kc" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.156743 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.156893 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.157017 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.190424 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566694-f94kc"] Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.303028 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgk88\" (UniqueName: \"kubernetes.io/projected/333d7fba-8587-41ea-a463-a4c9866ec202-kube-api-access-xgk88\") pod \"auto-csr-approver-29566694-f94kc\" (UID: \"333d7fba-8587-41ea-a463-a4c9866ec202\") " pod="openshift-infra/auto-csr-approver-29566694-f94kc" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.406076 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgk88\" (UniqueName: \"kubernetes.io/projected/333d7fba-8587-41ea-a463-a4c9866ec202-kube-api-access-xgk88\") pod \"auto-csr-approver-29566694-f94kc\" (UID: \"333d7fba-8587-41ea-a463-a4c9866ec202\") " pod="openshift-infra/auto-csr-approver-29566694-f94kc" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.437272 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgk88\" (UniqueName: \"kubernetes.io/projected/333d7fba-8587-41ea-a463-a4c9866ec202-kube-api-access-xgk88\") pod \"auto-csr-approver-29566694-f94kc\" (UID: \"333d7fba-8587-41ea-a463-a4c9866ec202\") " 
pod="openshift-infra/auto-csr-approver-29566694-f94kc" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.482954 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566694-f94kc" Mar 20 10:14:00 crc kubenswrapper[4971]: I0320 10:14:00.918394 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566694-f94kc"] Mar 20 10:14:01 crc kubenswrapper[4971]: I0320 10:14:01.613886 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566694-f94kc" event={"ID":"333d7fba-8587-41ea-a463-a4c9866ec202","Type":"ContainerStarted","Data":"287dd3872178311b307654563f034a9de8284fa5f4275c028fe6c6385fcbdc98"} Mar 20 10:14:02 crc kubenswrapper[4971]: I0320 10:14:02.626578 4971 generic.go:334] "Generic (PLEG): container finished" podID="333d7fba-8587-41ea-a463-a4c9866ec202" containerID="8aee91ac246f248e4ac38f489111d1d5ab57b061b31d678e50f4dc99ae6a4bc8" exitCode=0 Mar 20 10:14:02 crc kubenswrapper[4971]: I0320 10:14:02.626702 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566694-f94kc" event={"ID":"333d7fba-8587-41ea-a463-a4c9866ec202","Type":"ContainerDied","Data":"8aee91ac246f248e4ac38f489111d1d5ab57b061b31d678e50f4dc99ae6a4bc8"} Mar 20 10:14:03 crc kubenswrapper[4971]: I0320 10:14:03.973141 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566694-f94kc" Mar 20 10:14:04 crc kubenswrapper[4971]: I0320 10:14:04.078990 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgk88\" (UniqueName: \"kubernetes.io/projected/333d7fba-8587-41ea-a463-a4c9866ec202-kube-api-access-xgk88\") pod \"333d7fba-8587-41ea-a463-a4c9866ec202\" (UID: \"333d7fba-8587-41ea-a463-a4c9866ec202\") " Mar 20 10:14:04 crc kubenswrapper[4971]: I0320 10:14:04.085022 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333d7fba-8587-41ea-a463-a4c9866ec202-kube-api-access-xgk88" (OuterVolumeSpecName: "kube-api-access-xgk88") pod "333d7fba-8587-41ea-a463-a4c9866ec202" (UID: "333d7fba-8587-41ea-a463-a4c9866ec202"). InnerVolumeSpecName "kube-api-access-xgk88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:14:04 crc kubenswrapper[4971]: I0320 10:14:04.181941 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgk88\" (UniqueName: \"kubernetes.io/projected/333d7fba-8587-41ea-a463-a4c9866ec202-kube-api-access-xgk88\") on node \"crc\" DevicePath \"\"" Mar 20 10:14:04 crc kubenswrapper[4971]: I0320 10:14:04.664311 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566694-f94kc" event={"ID":"333d7fba-8587-41ea-a463-a4c9866ec202","Type":"ContainerDied","Data":"287dd3872178311b307654563f034a9de8284fa5f4275c028fe6c6385fcbdc98"} Mar 20 10:14:04 crc kubenswrapper[4971]: I0320 10:14:04.664365 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="287dd3872178311b307654563f034a9de8284fa5f4275c028fe6c6385fcbdc98" Mar 20 10:14:04 crc kubenswrapper[4971]: I0320 10:14:04.664369 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566694-f94kc" Mar 20 10:14:05 crc kubenswrapper[4971]: I0320 10:14:05.045622 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566688-d9w58"] Mar 20 10:14:05 crc kubenswrapper[4971]: I0320 10:14:05.057366 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566688-d9w58"] Mar 20 10:14:06 crc kubenswrapper[4971]: I0320 10:14:06.746986 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2" path="/var/lib/kubelet/pods/a2fd33b6-6269-4dc3-9d7f-aad9fa0e70b2/volumes" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.479582 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7xk7/must-gather-n27tq"] Mar 20 10:14:10 crc kubenswrapper[4971]: E0320 10:14:10.480656 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333d7fba-8587-41ea-a463-a4c9866ec202" containerName="oc" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.480672 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="333d7fba-8587-41ea-a463-a4c9866ec202" containerName="oc" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.480957 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="333d7fba-8587-41ea-a463-a4c9866ec202" containerName="oc" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.482412 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.484874 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x7xk7"/"kube-root-ca.crt" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.485201 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-x7xk7"/"default-dockercfg-js4s5" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.485346 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x7xk7"/"openshift-service-ca.crt" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.492771 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7xk7/must-gather-n27tq"] Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.620252 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-must-gather-output\") pod \"must-gather-n27tq\" (UID: \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\") " pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.620351 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js7st\" (UniqueName: \"kubernetes.io/projected/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-kube-api-access-js7st\") pod \"must-gather-n27tq\" (UID: \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\") " pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.721600 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-must-gather-output\") pod \"must-gather-n27tq\" (UID: \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\") " 
pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.721717 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js7st\" (UniqueName: \"kubernetes.io/projected/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-kube-api-access-js7st\") pod \"must-gather-n27tq\" (UID: \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\") " pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.722248 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-must-gather-output\") pod \"must-gather-n27tq\" (UID: \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\") " pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.739260 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js7st\" (UniqueName: \"kubernetes.io/projected/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-kube-api-access-js7st\") pod \"must-gather-n27tq\" (UID: \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\") " pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:14:10 crc kubenswrapper[4971]: I0320 10:14:10.808420 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:14:11 crc kubenswrapper[4971]: I0320 10:14:11.272124 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7xk7/must-gather-n27tq"] Mar 20 10:14:11 crc kubenswrapper[4971]: I0320 10:14:11.735096 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/must-gather-n27tq" event={"ID":"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a","Type":"ContainerStarted","Data":"b04d18d1d38d9f9be4e094ef55a77a059e32483673b4fc1169b14f092fc6f46f"} Mar 20 10:14:17 crc kubenswrapper[4971]: I0320 10:14:17.804523 4971 scope.go:117] "RemoveContainer" containerID="7d04f59c860fbe4e3f92becf5f99467ddd22c3e0f99f70bcb981dc6f71150c55" Mar 20 10:14:17 crc kubenswrapper[4971]: I0320 10:14:17.812630 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/must-gather-n27tq" event={"ID":"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a","Type":"ContainerStarted","Data":"48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c"} Mar 20 10:14:18 crc kubenswrapper[4971]: I0320 10:14:18.824935 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/must-gather-n27tq" event={"ID":"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a","Type":"ContainerStarted","Data":"9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008"} Mar 20 10:14:18 crc kubenswrapper[4971]: I0320 10:14:18.845917 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7xk7/must-gather-n27tq" podStartSLOduration=2.575532001 podStartE2EDuration="8.845899051s" podCreationTimestamp="2026-03-20 10:14:10 +0000 UTC" firstStartedPulling="2026-03-20 10:14:11.276577525 +0000 UTC m=+12273.256451663" lastFinishedPulling="2026-03-20 10:14:17.546944575 +0000 UTC m=+12279.526818713" observedRunningTime="2026-03-20 10:14:18.839175418 +0000 UTC m=+12280.819049576" watchObservedRunningTime="2026-03-20 10:14:18.845899051 
+0000 UTC m=+12280.825773189" Mar 20 10:14:20 crc kubenswrapper[4971]: I0320 10:14:20.161887 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:14:20 crc kubenswrapper[4971]: I0320 10:14:20.162851 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:14:22 crc kubenswrapper[4971]: E0320 10:14:22.569394 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.119:50700->38.102.83.119:38499: write tcp 38.102.83.119:50700->38.102.83.119:38499: write: broken pipe Mar 20 10:14:23 crc kubenswrapper[4971]: I0320 10:14:23.662360 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-rnccp"] Mar 20 10:14:23 crc kubenswrapper[4971]: I0320 10:14:23.663864 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:14:23 crc kubenswrapper[4971]: I0320 10:14:23.713426 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8253a94d-d7b8-4be9-a020-30f53fb000de-host\") pod \"crc-debug-rnccp\" (UID: \"8253a94d-d7b8-4be9-a020-30f53fb000de\") " pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:14:23 crc kubenswrapper[4971]: I0320 10:14:23.713735 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62zn8\" (UniqueName: \"kubernetes.io/projected/8253a94d-d7b8-4be9-a020-30f53fb000de-kube-api-access-62zn8\") pod \"crc-debug-rnccp\" (UID: \"8253a94d-d7b8-4be9-a020-30f53fb000de\") " pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:14:23 crc kubenswrapper[4971]: I0320 10:14:23.816320 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8253a94d-d7b8-4be9-a020-30f53fb000de-host\") pod \"crc-debug-rnccp\" (UID: \"8253a94d-d7b8-4be9-a020-30f53fb000de\") " pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:14:23 crc kubenswrapper[4971]: I0320 10:14:23.816453 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62zn8\" (UniqueName: \"kubernetes.io/projected/8253a94d-d7b8-4be9-a020-30f53fb000de-kube-api-access-62zn8\") pod \"crc-debug-rnccp\" (UID: \"8253a94d-d7b8-4be9-a020-30f53fb000de\") " pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:14:23 crc kubenswrapper[4971]: I0320 10:14:23.816558 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8253a94d-d7b8-4be9-a020-30f53fb000de-host\") pod \"crc-debug-rnccp\" (UID: \"8253a94d-d7b8-4be9-a020-30f53fb000de\") " pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:14:23 crc 
kubenswrapper[4971]: I0320 10:14:23.847285 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62zn8\" (UniqueName: \"kubernetes.io/projected/8253a94d-d7b8-4be9-a020-30f53fb000de-kube-api-access-62zn8\") pod \"crc-debug-rnccp\" (UID: \"8253a94d-d7b8-4be9-a020-30f53fb000de\") " pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:14:23 crc kubenswrapper[4971]: I0320 10:14:23.983145 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:14:24 crc kubenswrapper[4971]: I0320 10:14:24.906712 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/crc-debug-rnccp" event={"ID":"8253a94d-d7b8-4be9-a020-30f53fb000de","Type":"ContainerStarted","Data":"4621ac6432d095c97fbabeaec2a1cbd814159be3bc03e96d2fa237d379f9eef2"} Mar 20 10:14:33 crc kubenswrapper[4971]: I0320 10:14:33.036883 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/crc-debug-rnccp" event={"ID":"8253a94d-d7b8-4be9-a020-30f53fb000de","Type":"ContainerStarted","Data":"9af1c6967b4533902bbb948d64cad22dcadab4be753ebbd56e6f044501f7b05b"} Mar 20 10:14:33 crc kubenswrapper[4971]: I0320 10:14:33.058871 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7xk7/crc-debug-rnccp" podStartSLOduration=1.935792793 podStartE2EDuration="10.058849366s" podCreationTimestamp="2026-03-20 10:14:23 +0000 UTC" firstStartedPulling="2026-03-20 10:14:24.027350046 +0000 UTC m=+12286.007224194" lastFinishedPulling="2026-03-20 10:14:32.150406629 +0000 UTC m=+12294.130280767" observedRunningTime="2026-03-20 10:14:33.050912227 +0000 UTC m=+12295.030786365" watchObservedRunningTime="2026-03-20 10:14:33.058849366 +0000 UTC m=+12295.038723504" Mar 20 10:14:50 crc kubenswrapper[4971]: I0320 10:14:50.161782 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:14:50 crc kubenswrapper[4971]: I0320 10:14:50.162240 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.158071 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56"] Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.160139 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.162994 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.163246 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.173925 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56"] Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.273129 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8ld\" (UniqueName: \"kubernetes.io/projected/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-kube-api-access-mr8ld\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.273328 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-config-volume\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.273376 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-secret-volume\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.375255 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8ld\" (UniqueName: \"kubernetes.io/projected/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-kube-api-access-mr8ld\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.375395 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-config-volume\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.375520 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-secret-volume\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.376402 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-config-volume\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.392901 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8ld\" (UniqueName: \"kubernetes.io/projected/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-kube-api-access-mr8ld\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.393216 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-secret-volume\") pod \"collect-profiles-29566695-l4b56\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:00 crc kubenswrapper[4971]: I0320 10:15:00.500733 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:01 crc kubenswrapper[4971]: I0320 10:15:01.070756 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56"] Mar 20 10:15:01 crc kubenswrapper[4971]: I0320 10:15:01.354323 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" event={"ID":"611e0071-9890-4bd6-bb8d-26d2b8dc73c1","Type":"ContainerStarted","Data":"ce8aca519755b83da22414ae150c1fd13ff7cb9a9cc14ad20d6959d083f351e9"} Mar 20 10:15:01 crc kubenswrapper[4971]: I0320 10:15:01.354597 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" event={"ID":"611e0071-9890-4bd6-bb8d-26d2b8dc73c1","Type":"ContainerStarted","Data":"ed5a26de2fee953f8d642930ea80627a6b71530a3b6f439515c8490f3505aa8f"} Mar 20 10:15:01 crc kubenswrapper[4971]: I0320 10:15:01.380805 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" podStartSLOduration=1.380773842 podStartE2EDuration="1.380773842s" podCreationTimestamp="2026-03-20 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:15:01.367564675 +0000 UTC m=+12323.347438833" watchObservedRunningTime="2026-03-20 10:15:01.380773842 +0000 UTC m=+12323.360647990" Mar 20 10:15:02 crc kubenswrapper[4971]: I0320 10:15:02.367818 4971 generic.go:334] "Generic (PLEG): container finished" podID="611e0071-9890-4bd6-bb8d-26d2b8dc73c1" containerID="ce8aca519755b83da22414ae150c1fd13ff7cb9a9cc14ad20d6959d083f351e9" exitCode=0 Mar 20 10:15:02 crc kubenswrapper[4971]: I0320 10:15:02.367859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" event={"ID":"611e0071-9890-4bd6-bb8d-26d2b8dc73c1","Type":"ContainerDied","Data":"ce8aca519755b83da22414ae150c1fd13ff7cb9a9cc14ad20d6959d083f351e9"} Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.758684 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.862159 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-config-volume\") pod \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.862272 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-secret-volume\") pod \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.862626 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr8ld\" (UniqueName: \"kubernetes.io/projected/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-kube-api-access-mr8ld\") pod \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\" (UID: \"611e0071-9890-4bd6-bb8d-26d2b8dc73c1\") " Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.865363 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "611e0071-9890-4bd6-bb8d-26d2b8dc73c1" (UID: "611e0071-9890-4bd6-bb8d-26d2b8dc73c1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.879314 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "611e0071-9890-4bd6-bb8d-26d2b8dc73c1" (UID: "611e0071-9890-4bd6-bb8d-26d2b8dc73c1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.879668 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-kube-api-access-mr8ld" (OuterVolumeSpecName: "kube-api-access-mr8ld") pod "611e0071-9890-4bd6-bb8d-26d2b8dc73c1" (UID: "611e0071-9890-4bd6-bb8d-26d2b8dc73c1"). InnerVolumeSpecName "kube-api-access-mr8ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.965558 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr8ld\" (UniqueName: \"kubernetes.io/projected/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-kube-api-access-mr8ld\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.965596 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:03 crc kubenswrapper[4971]: I0320 10:15:03.965674 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/611e0071-9890-4bd6-bb8d-26d2b8dc73c1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:04 crc kubenswrapper[4971]: I0320 10:15:04.419894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" 
event={"ID":"611e0071-9890-4bd6-bb8d-26d2b8dc73c1","Type":"ContainerDied","Data":"ed5a26de2fee953f8d642930ea80627a6b71530a3b6f439515c8490f3505aa8f"} Mar 20 10:15:04 crc kubenswrapper[4971]: I0320 10:15:04.419932 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed5a26de2fee953f8d642930ea80627a6b71530a3b6f439515c8490f3505aa8f" Mar 20 10:15:04 crc kubenswrapper[4971]: I0320 10:15:04.420001 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566695-l4b56" Mar 20 10:15:04 crc kubenswrapper[4971]: I0320 10:15:04.464885 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj"] Mar 20 10:15:04 crc kubenswrapper[4971]: I0320 10:15:04.479431 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566650-h5hmj"] Mar 20 10:15:04 crc kubenswrapper[4971]: E0320 10:15:04.684231 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611e0071_9890_4bd6_bb8d_26d2b8dc73c1.slice/crio-ed5a26de2fee953f8d642930ea80627a6b71530a3b6f439515c8490f3505aa8f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611e0071_9890_4bd6_bb8d_26d2b8dc73c1.slice\": RecentStats: unable to find data in memory cache]" Mar 20 10:15:04 crc kubenswrapper[4971]: I0320 10:15:04.748384 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fd9f38-7215-427b-bc7e-6e1efd12170b" path="/var/lib/kubelet/pods/04fd9f38-7215-427b-bc7e-6e1efd12170b/volumes" Mar 20 10:15:09 crc kubenswrapper[4971]: I0320 10:15:09.466734 4971 generic.go:334] "Generic (PLEG): container finished" podID="8253a94d-d7b8-4be9-a020-30f53fb000de" 
containerID="9af1c6967b4533902bbb948d64cad22dcadab4be753ebbd56e6f044501f7b05b" exitCode=0 Mar 20 10:15:09 crc kubenswrapper[4971]: I0320 10:15:09.466832 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/crc-debug-rnccp" event={"ID":"8253a94d-d7b8-4be9-a020-30f53fb000de","Type":"ContainerDied","Data":"9af1c6967b4533902bbb948d64cad22dcadab4be753ebbd56e6f044501f7b05b"} Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.585681 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.620683 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-rnccp"] Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.630805 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-rnccp"] Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.705075 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62zn8\" (UniqueName: \"kubernetes.io/projected/8253a94d-d7b8-4be9-a020-30f53fb000de-kube-api-access-62zn8\") pod \"8253a94d-d7b8-4be9-a020-30f53fb000de\" (UID: \"8253a94d-d7b8-4be9-a020-30f53fb000de\") " Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.705515 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8253a94d-d7b8-4be9-a020-30f53fb000de-host\") pod \"8253a94d-d7b8-4be9-a020-30f53fb000de\" (UID: \"8253a94d-d7b8-4be9-a020-30f53fb000de\") " Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.705581 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8253a94d-d7b8-4be9-a020-30f53fb000de-host" (OuterVolumeSpecName: "host") pod "8253a94d-d7b8-4be9-a020-30f53fb000de" (UID: "8253a94d-d7b8-4be9-a020-30f53fb000de"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.706332 4971 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8253a94d-d7b8-4be9-a020-30f53fb000de-host\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.711948 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8253a94d-d7b8-4be9-a020-30f53fb000de-kube-api-access-62zn8" (OuterVolumeSpecName: "kube-api-access-62zn8") pod "8253a94d-d7b8-4be9-a020-30f53fb000de" (UID: "8253a94d-d7b8-4be9-a020-30f53fb000de"). InnerVolumeSpecName "kube-api-access-62zn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.745712 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8253a94d-d7b8-4be9-a020-30f53fb000de" path="/var/lib/kubelet/pods/8253a94d-d7b8-4be9-a020-30f53fb000de/volumes" Mar 20 10:15:10 crc kubenswrapper[4971]: I0320 10:15:10.808662 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62zn8\" (UniqueName: \"kubernetes.io/projected/8253a94d-d7b8-4be9-a020-30f53fb000de-kube-api-access-62zn8\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.488275 4971 scope.go:117] "RemoveContainer" containerID="9af1c6967b4533902bbb948d64cad22dcadab4be753ebbd56e6f044501f7b05b" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.488671 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-rnccp" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.770353 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-srxb5"] Mar 20 10:15:11 crc kubenswrapper[4971]: E0320 10:15:11.770802 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611e0071-9890-4bd6-bb8d-26d2b8dc73c1" containerName="collect-profiles" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.770814 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="611e0071-9890-4bd6-bb8d-26d2b8dc73c1" containerName="collect-profiles" Mar 20 10:15:11 crc kubenswrapper[4971]: E0320 10:15:11.770829 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8253a94d-d7b8-4be9-a020-30f53fb000de" containerName="container-00" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.770835 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8253a94d-d7b8-4be9-a020-30f53fb000de" containerName="container-00" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.771054 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8253a94d-d7b8-4be9-a020-30f53fb000de" containerName="container-00" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.771086 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="611e0071-9890-4bd6-bb8d-26d2b8dc73c1" containerName="collect-profiles" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.771814 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.826556 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5hfz\" (UniqueName: \"kubernetes.io/projected/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-kube-api-access-w5hfz\") pod \"crc-debug-srxb5\" (UID: \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\") " pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.826647 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-host\") pod \"crc-debug-srxb5\" (UID: \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\") " pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.929201 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5hfz\" (UniqueName: \"kubernetes.io/projected/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-kube-api-access-w5hfz\") pod \"crc-debug-srxb5\" (UID: \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\") " pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.929249 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-host\") pod \"crc-debug-srxb5\" (UID: \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\") " pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:11 crc kubenswrapper[4971]: I0320 10:15:11.929440 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-host\") pod \"crc-debug-srxb5\" (UID: \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\") " pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:11 crc 
kubenswrapper[4971]: I0320 10:15:11.949445 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5hfz\" (UniqueName: \"kubernetes.io/projected/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-kube-api-access-w5hfz\") pod \"crc-debug-srxb5\" (UID: \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\") " pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:12 crc kubenswrapper[4971]: I0320 10:15:12.090027 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:12 crc kubenswrapper[4971]: I0320 10:15:12.501020 4971 generic.go:334] "Generic (PLEG): container finished" podID="c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190" containerID="4e6829767ffabadd1afefac17198818bbf3b380b3fda5a46a74561c94bbc67f3" exitCode=0 Mar 20 10:15:12 crc kubenswrapper[4971]: I0320 10:15:12.501100 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/crc-debug-srxb5" event={"ID":"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190","Type":"ContainerDied","Data":"4e6829767ffabadd1afefac17198818bbf3b380b3fda5a46a74561c94bbc67f3"} Mar 20 10:15:12 crc kubenswrapper[4971]: I0320 10:15:12.501284 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/crc-debug-srxb5" event={"ID":"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190","Type":"ContainerStarted","Data":"c890d52894d594c6d3bd16593b168da945ee07a98e0b5ad6bb2dc2f59daddb99"} Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.143719 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-srxb5"] Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.153424 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-srxb5"] Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.612447 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.657544 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5hfz\" (UniqueName: \"kubernetes.io/projected/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-kube-api-access-w5hfz\") pod \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\" (UID: \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\") " Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.657624 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-host\") pod \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\" (UID: \"c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190\") " Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.657805 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-host" (OuterVolumeSpecName: "host") pod "c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190" (UID: "c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.658127 4971 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-host\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.665832 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-kube-api-access-w5hfz" (OuterVolumeSpecName: "kube-api-access-w5hfz") pod "c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190" (UID: "c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190"). InnerVolumeSpecName "kube-api-access-w5hfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:15:13 crc kubenswrapper[4971]: I0320 10:15:13.761005 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5hfz\" (UniqueName: \"kubernetes.io/projected/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190-kube-api-access-w5hfz\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.298195 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-fkbdt"] Mar 20 10:15:14 crc kubenswrapper[4971]: E0320 10:15:14.298900 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190" containerName="container-00" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.298913 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190" containerName="container-00" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.299157 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190" containerName="container-00" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.299863 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.474513 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3284b97c-0394-4c51-9775-e7715c02eb4d-host\") pod \"crc-debug-fkbdt\" (UID: \"3284b97c-0394-4c51-9775-e7715c02eb4d\") " pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.475043 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b256x\" (UniqueName: \"kubernetes.io/projected/3284b97c-0394-4c51-9775-e7715c02eb4d-kube-api-access-b256x\") pod \"crc-debug-fkbdt\" (UID: \"3284b97c-0394-4c51-9775-e7715c02eb4d\") " pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.521576 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c890d52894d594c6d3bd16593b168da945ee07a98e0b5ad6bb2dc2f59daddb99" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.521646 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-srxb5" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.576749 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b256x\" (UniqueName: \"kubernetes.io/projected/3284b97c-0394-4c51-9775-e7715c02eb4d-kube-api-access-b256x\") pod \"crc-debug-fkbdt\" (UID: \"3284b97c-0394-4c51-9775-e7715c02eb4d\") " pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.576851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3284b97c-0394-4c51-9775-e7715c02eb4d-host\") pod \"crc-debug-fkbdt\" (UID: \"3284b97c-0394-4c51-9775-e7715c02eb4d\") " pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.576935 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3284b97c-0394-4c51-9775-e7715c02eb4d-host\") pod \"crc-debug-fkbdt\" (UID: \"3284b97c-0394-4c51-9775-e7715c02eb4d\") " pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.594381 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b256x\" (UniqueName: \"kubernetes.io/projected/3284b97c-0394-4c51-9775-e7715c02eb4d-kube-api-access-b256x\") pod \"crc-debug-fkbdt\" (UID: \"3284b97c-0394-4c51-9775-e7715c02eb4d\") " pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.614687 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:14 crc kubenswrapper[4971]: W0320 10:15:14.640816 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3284b97c_0394_4c51_9775_e7715c02eb4d.slice/crio-a9632a1a50e48ad8efb57931471410ab9a7af55cc338b018b49c68da40ad4166 WatchSource:0}: Error finding container a9632a1a50e48ad8efb57931471410ab9a7af55cc338b018b49c68da40ad4166: Status 404 returned error can't find the container with id a9632a1a50e48ad8efb57931471410ab9a7af55cc338b018b49c68da40ad4166 Mar 20 10:15:14 crc kubenswrapper[4971]: I0320 10:15:14.747646 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190" path="/var/lib/kubelet/pods/c4c8fd7f-cb8a-4dfc-bd20-47a5df15e190/volumes" Mar 20 10:15:15 crc kubenswrapper[4971]: I0320 10:15:15.539184 4971 generic.go:334] "Generic (PLEG): container finished" podID="3284b97c-0394-4c51-9775-e7715c02eb4d" containerID="4be2fca0996e26a4e6e0cd83c6e468dde14880b0866c680481257061f52557ab" exitCode=0 Mar 20 10:15:15 crc kubenswrapper[4971]: I0320 10:15:15.539306 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" event={"ID":"3284b97c-0394-4c51-9775-e7715c02eb4d","Type":"ContainerDied","Data":"4be2fca0996e26a4e6e0cd83c6e468dde14880b0866c680481257061f52557ab"} Mar 20 10:15:15 crc kubenswrapper[4971]: I0320 10:15:15.539540 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" event={"ID":"3284b97c-0394-4c51-9775-e7715c02eb4d","Type":"ContainerStarted","Data":"a9632a1a50e48ad8efb57931471410ab9a7af55cc338b018b49c68da40ad4166"} Mar 20 10:15:15 crc kubenswrapper[4971]: I0320 10:15:15.592934 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-fkbdt"] Mar 20 10:15:15 crc kubenswrapper[4971]: I0320 10:15:15.603952 4971 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7xk7/crc-debug-fkbdt"] Mar 20 10:15:16 crc kubenswrapper[4971]: I0320 10:15:16.662497 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:16 crc kubenswrapper[4971]: I0320 10:15:16.821173 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3284b97c-0394-4c51-9775-e7715c02eb4d-host\") pod \"3284b97c-0394-4c51-9775-e7715c02eb4d\" (UID: \"3284b97c-0394-4c51-9775-e7715c02eb4d\") " Mar 20 10:15:16 crc kubenswrapper[4971]: I0320 10:15:16.821287 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3284b97c-0394-4c51-9775-e7715c02eb4d-host" (OuterVolumeSpecName: "host") pod "3284b97c-0394-4c51-9775-e7715c02eb4d" (UID: "3284b97c-0394-4c51-9775-e7715c02eb4d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:15:16 crc kubenswrapper[4971]: I0320 10:15:16.821351 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b256x\" (UniqueName: \"kubernetes.io/projected/3284b97c-0394-4c51-9775-e7715c02eb4d-kube-api-access-b256x\") pod \"3284b97c-0394-4c51-9775-e7715c02eb4d\" (UID: \"3284b97c-0394-4c51-9775-e7715c02eb4d\") " Mar 20 10:15:16 crc kubenswrapper[4971]: I0320 10:15:16.821935 4971 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3284b97c-0394-4c51-9775-e7715c02eb4d-host\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:16 crc kubenswrapper[4971]: I0320 10:15:16.827351 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3284b97c-0394-4c51-9775-e7715c02eb4d-kube-api-access-b256x" (OuterVolumeSpecName: "kube-api-access-b256x") pod "3284b97c-0394-4c51-9775-e7715c02eb4d" (UID: 
"3284b97c-0394-4c51-9775-e7715c02eb4d"). InnerVolumeSpecName "kube-api-access-b256x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:15:16 crc kubenswrapper[4971]: I0320 10:15:16.924373 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b256x\" (UniqueName: \"kubernetes.io/projected/3284b97c-0394-4c51-9775-e7715c02eb4d-kube-api-access-b256x\") on node \"crc\" DevicePath \"\"" Mar 20 10:15:17 crc kubenswrapper[4971]: I0320 10:15:17.573576 4971 scope.go:117] "RemoveContainer" containerID="4be2fca0996e26a4e6e0cd83c6e468dde14880b0866c680481257061f52557ab" Mar 20 10:15:17 crc kubenswrapper[4971]: I0320 10:15:17.573661 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7xk7/crc-debug-fkbdt" Mar 20 10:15:17 crc kubenswrapper[4971]: I0320 10:15:17.918518 4971 scope.go:117] "RemoveContainer" containerID="0ec3b42827fc6071a8b90aaa61acaefc3be25cefe8ff555d8666fb0cc5bd0633" Mar 20 10:15:18 crc kubenswrapper[4971]: I0320 10:15:18.742553 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3284b97c-0394-4c51-9775-e7715c02eb4d" path="/var/lib/kubelet/pods/3284b97c-0394-4c51-9775-e7715c02eb4d/volumes" Mar 20 10:15:20 crc kubenswrapper[4971]: I0320 10:15:20.162742 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:15:20 crc kubenswrapper[4971]: I0320 10:15:20.162798 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:15:20 crc 
kubenswrapper[4971]: I0320 10:15:20.162839 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 10:15:20 crc kubenswrapper[4971]: I0320 10:15:20.163648 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f75848c8286f55cc3d920b08f7a89dc5054b5048e3e2418181291bb7c5c3716f"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:15:20 crc kubenswrapper[4971]: I0320 10:15:20.163707 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://f75848c8286f55cc3d920b08f7a89dc5054b5048e3e2418181291bb7c5c3716f" gracePeriod=600 Mar 20 10:15:20 crc kubenswrapper[4971]: I0320 10:15:20.609771 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="f75848c8286f55cc3d920b08f7a89dc5054b5048e3e2418181291bb7c5c3716f" exitCode=0 Mar 20 10:15:20 crc kubenswrapper[4971]: I0320 10:15:20.609825 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"f75848c8286f55cc3d920b08f7a89dc5054b5048e3e2418181291bb7c5c3716f"} Mar 20 10:15:20 crc kubenswrapper[4971]: I0320 10:15:20.610079 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f"} Mar 20 10:15:20 crc kubenswrapper[4971]: I0320 
10:15:20.610101 4971 scope.go:117] "RemoveContainer" containerID="15b22a46addd719b87e39803e0676833c1a1e5fce264f62fdd63b9866ad52788" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.145922 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566696-kdqxj"] Mar 20 10:16:00 crc kubenswrapper[4971]: E0320 10:16:00.148082 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3284b97c-0394-4c51-9775-e7715c02eb4d" containerName="container-00" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.148224 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3284b97c-0394-4c51-9775-e7715c02eb4d" containerName="container-00" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.148536 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3284b97c-0394-4c51-9775-e7715c02eb4d" containerName="container-00" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.149434 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566696-kdqxj" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.151878 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.152457 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.152824 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.159255 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566696-kdqxj"] Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.195793 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc69m\" (UniqueName: \"kubernetes.io/projected/9a5c30ae-e5aa-4e94-8a67-2f2d0088db08-kube-api-access-sc69m\") pod \"auto-csr-approver-29566696-kdqxj\" (UID: \"9a5c30ae-e5aa-4e94-8a67-2f2d0088db08\") " pod="openshift-infra/auto-csr-approver-29566696-kdqxj" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.298434 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc69m\" (UniqueName: \"kubernetes.io/projected/9a5c30ae-e5aa-4e94-8a67-2f2d0088db08-kube-api-access-sc69m\") pod \"auto-csr-approver-29566696-kdqxj\" (UID: \"9a5c30ae-e5aa-4e94-8a67-2f2d0088db08\") " pod="openshift-infra/auto-csr-approver-29566696-kdqxj" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.316746 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc69m\" (UniqueName: \"kubernetes.io/projected/9a5c30ae-e5aa-4e94-8a67-2f2d0088db08-kube-api-access-sc69m\") pod \"auto-csr-approver-29566696-kdqxj\" (UID: \"9a5c30ae-e5aa-4e94-8a67-2f2d0088db08\") " 
pod="openshift-infra/auto-csr-approver-29566696-kdqxj" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.475337 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566696-kdqxj" Mar 20 10:16:00 crc kubenswrapper[4971]: I0320 10:16:00.985051 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566696-kdqxj"] Mar 20 10:16:01 crc kubenswrapper[4971]: I0320 10:16:01.020356 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566696-kdqxj" event={"ID":"9a5c30ae-e5aa-4e94-8a67-2f2d0088db08","Type":"ContainerStarted","Data":"2c2379e079aec653d19cd47b98fc916043b3ccb3272653396d23f40a1c7755b6"} Mar 20 10:16:03 crc kubenswrapper[4971]: I0320 10:16:03.043677 4971 generic.go:334] "Generic (PLEG): container finished" podID="9a5c30ae-e5aa-4e94-8a67-2f2d0088db08" containerID="e34bccb653b30075c80106ab681bb6500037d54e31d9a4c5f1f84666a9851f61" exitCode=0 Mar 20 10:16:03 crc kubenswrapper[4971]: I0320 10:16:03.043769 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566696-kdqxj" event={"ID":"9a5c30ae-e5aa-4e94-8a67-2f2d0088db08","Type":"ContainerDied","Data":"e34bccb653b30075c80106ab681bb6500037d54e31d9a4c5f1f84666a9851f61"} Mar 20 10:16:04 crc kubenswrapper[4971]: I0320 10:16:04.457189 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566696-kdqxj" Mar 20 10:16:04 crc kubenswrapper[4971]: I0320 10:16:04.493998 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc69m\" (UniqueName: \"kubernetes.io/projected/9a5c30ae-e5aa-4e94-8a67-2f2d0088db08-kube-api-access-sc69m\") pod \"9a5c30ae-e5aa-4e94-8a67-2f2d0088db08\" (UID: \"9a5c30ae-e5aa-4e94-8a67-2f2d0088db08\") " Mar 20 10:16:04 crc kubenswrapper[4971]: I0320 10:16:04.499759 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5c30ae-e5aa-4e94-8a67-2f2d0088db08-kube-api-access-sc69m" (OuterVolumeSpecName: "kube-api-access-sc69m") pod "9a5c30ae-e5aa-4e94-8a67-2f2d0088db08" (UID: "9a5c30ae-e5aa-4e94-8a67-2f2d0088db08"). InnerVolumeSpecName "kube-api-access-sc69m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:16:04 crc kubenswrapper[4971]: I0320 10:16:04.596295 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc69m\" (UniqueName: \"kubernetes.io/projected/9a5c30ae-e5aa-4e94-8a67-2f2d0088db08-kube-api-access-sc69m\") on node \"crc\" DevicePath \"\"" Mar 20 10:16:05 crc kubenswrapper[4971]: I0320 10:16:05.062441 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566696-kdqxj" event={"ID":"9a5c30ae-e5aa-4e94-8a67-2f2d0088db08","Type":"ContainerDied","Data":"2c2379e079aec653d19cd47b98fc916043b3ccb3272653396d23f40a1c7755b6"} Mar 20 10:16:05 crc kubenswrapper[4971]: I0320 10:16:05.062487 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2379e079aec653d19cd47b98fc916043b3ccb3272653396d23f40a1c7755b6" Mar 20 10:16:05 crc kubenswrapper[4971]: I0320 10:16:05.062545 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566696-kdqxj" Mar 20 10:16:05 crc kubenswrapper[4971]: I0320 10:16:05.529187 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566690-p2jfv"] Mar 20 10:16:05 crc kubenswrapper[4971]: I0320 10:16:05.540302 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566690-p2jfv"] Mar 20 10:16:06 crc kubenswrapper[4971]: I0320 10:16:06.750156 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b8c056-36f7-4851-88bb-ce7d8496ef1f" path="/var/lib/kubelet/pods/97b8c056-36f7-4851-88bb-ce7d8496ef1f/volumes" Mar 20 10:16:18 crc kubenswrapper[4971]: I0320 10:16:18.026489 4971 scope.go:117] "RemoveContainer" containerID="067c472db539144441caa852bc4aa81fadf9af091c2af95cd56f0848aa91ee84" Mar 20 10:17:20 crc kubenswrapper[4971]: I0320 10:17:20.162563 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:17:20 crc kubenswrapper[4971]: I0320 10:17:20.163527 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:17:50 crc kubenswrapper[4971]: I0320 10:17:50.162104 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:17:50 crc kubenswrapper[4971]: 
I0320 10:17:50.162604 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.146314 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566698-cwb8n"] Mar 20 10:18:00 crc kubenswrapper[4971]: E0320 10:18:00.147422 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5c30ae-e5aa-4e94-8a67-2f2d0088db08" containerName="oc" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.147445 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5c30ae-e5aa-4e94-8a67-2f2d0088db08" containerName="oc" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.147802 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5c30ae-e5aa-4e94-8a67-2f2d0088db08" containerName="oc" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.148934 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566698-cwb8n" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.154641 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.154695 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.159021 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.174861 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566698-cwb8n"] Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.355949 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77j94\" (UniqueName: \"kubernetes.io/projected/d4208792-3a1c-40e3-a708-e65cf9660096-kube-api-access-77j94\") pod \"auto-csr-approver-29566698-cwb8n\" (UID: \"d4208792-3a1c-40e3-a708-e65cf9660096\") " pod="openshift-infra/auto-csr-approver-29566698-cwb8n" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.457847 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77j94\" (UniqueName: \"kubernetes.io/projected/d4208792-3a1c-40e3-a708-e65cf9660096-kube-api-access-77j94\") pod \"auto-csr-approver-29566698-cwb8n\" (UID: \"d4208792-3a1c-40e3-a708-e65cf9660096\") " pod="openshift-infra/auto-csr-approver-29566698-cwb8n" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.476621 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77j94\" (UniqueName: \"kubernetes.io/projected/d4208792-3a1c-40e3-a708-e65cf9660096-kube-api-access-77j94\") pod \"auto-csr-approver-29566698-cwb8n\" (UID: \"d4208792-3a1c-40e3-a708-e65cf9660096\") " 
pod="openshift-infra/auto-csr-approver-29566698-cwb8n" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.479685 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566698-cwb8n" Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.919697 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566698-cwb8n"] Mar 20 10:18:00 crc kubenswrapper[4971]: I0320 10:18:00.920265 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:18:01 crc kubenswrapper[4971]: I0320 10:18:01.276423 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566698-cwb8n" event={"ID":"d4208792-3a1c-40e3-a708-e65cf9660096","Type":"ContainerStarted","Data":"a3d8a7b4a2e4e076620fd027753e15a6c3056754ad81ee97e4ad37c7eeb8d7af"} Mar 20 10:18:02 crc kubenswrapper[4971]: I0320 10:18:02.286577 4971 generic.go:334] "Generic (PLEG): container finished" podID="d4208792-3a1c-40e3-a708-e65cf9660096" containerID="373bf68012a436de380cde7b284fd79d5a8e635168448f791af598596b08c6a5" exitCode=0 Mar 20 10:18:02 crc kubenswrapper[4971]: I0320 10:18:02.286657 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566698-cwb8n" event={"ID":"d4208792-3a1c-40e3-a708-e65cf9660096","Type":"ContainerDied","Data":"373bf68012a436de380cde7b284fd79d5a8e635168448f791af598596b08c6a5"} Mar 20 10:18:03 crc kubenswrapper[4971]: I0320 10:18:03.702203 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566698-cwb8n" Mar 20 10:18:03 crc kubenswrapper[4971]: I0320 10:18:03.827149 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77j94\" (UniqueName: \"kubernetes.io/projected/d4208792-3a1c-40e3-a708-e65cf9660096-kube-api-access-77j94\") pod \"d4208792-3a1c-40e3-a708-e65cf9660096\" (UID: \"d4208792-3a1c-40e3-a708-e65cf9660096\") " Mar 20 10:18:03 crc kubenswrapper[4971]: I0320 10:18:03.832153 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4208792-3a1c-40e3-a708-e65cf9660096-kube-api-access-77j94" (OuterVolumeSpecName: "kube-api-access-77j94") pod "d4208792-3a1c-40e3-a708-e65cf9660096" (UID: "d4208792-3a1c-40e3-a708-e65cf9660096"). InnerVolumeSpecName "kube-api-access-77j94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:18:03 crc kubenswrapper[4971]: I0320 10:18:03.930363 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77j94\" (UniqueName: \"kubernetes.io/projected/d4208792-3a1c-40e3-a708-e65cf9660096-kube-api-access-77j94\") on node \"crc\" DevicePath \"\"" Mar 20 10:18:04 crc kubenswrapper[4971]: I0320 10:18:04.311168 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566698-cwb8n" event={"ID":"d4208792-3a1c-40e3-a708-e65cf9660096","Type":"ContainerDied","Data":"a3d8a7b4a2e4e076620fd027753e15a6c3056754ad81ee97e4ad37c7eeb8d7af"} Mar 20 10:18:04 crc kubenswrapper[4971]: I0320 10:18:04.311213 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3d8a7b4a2e4e076620fd027753e15a6c3056754ad81ee97e4ad37c7eeb8d7af" Mar 20 10:18:04 crc kubenswrapper[4971]: I0320 10:18:04.311274 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566698-cwb8n" Mar 20 10:18:04 crc kubenswrapper[4971]: I0320 10:18:04.788333 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566692-blv8c"] Mar 20 10:18:04 crc kubenswrapper[4971]: I0320 10:18:04.801551 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566692-blv8c"] Mar 20 10:18:06 crc kubenswrapper[4971]: I0320 10:18:06.746067 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4855e97d-1722-4b16-9cbe-205e12794105" path="/var/lib/kubelet/pods/4855e97d-1722-4b16-9cbe-205e12794105/volumes" Mar 20 10:18:15 crc kubenswrapper[4971]: I0320 10:18:15.902021 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5dcbx"] Mar 20 10:18:15 crc kubenswrapper[4971]: E0320 10:18:15.903767 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4208792-3a1c-40e3-a708-e65cf9660096" containerName="oc" Mar 20 10:18:15 crc kubenswrapper[4971]: I0320 10:18:15.903789 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4208792-3a1c-40e3-a708-e65cf9660096" containerName="oc" Mar 20 10:18:15 crc kubenswrapper[4971]: I0320 10:18:15.904122 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4208792-3a1c-40e3-a708-e65cf9660096" containerName="oc" Mar 20 10:18:15 crc kubenswrapper[4971]: I0320 10:18:15.907369 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:15 crc kubenswrapper[4971]: I0320 10:18:15.920096 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5dcbx"] Mar 20 10:18:15 crc kubenswrapper[4971]: I0320 10:18:15.987483 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-catalog-content\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:15 crc kubenswrapper[4971]: I0320 10:18:15.987685 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz29q\" (UniqueName: \"kubernetes.io/projected/e1291caf-f069-4fbb-be0a-e92154f76f15-kube-api-access-lz29q\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:15 crc kubenswrapper[4971]: I0320 10:18:15.987727 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-utilities\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:16 crc kubenswrapper[4971]: I0320 10:18:16.090040 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-catalog-content\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:16 crc kubenswrapper[4971]: I0320 10:18:16.090544 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lz29q\" (UniqueName: \"kubernetes.io/projected/e1291caf-f069-4fbb-be0a-e92154f76f15-kube-api-access-lz29q\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:16 crc kubenswrapper[4971]: I0320 10:18:16.091106 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-utilities\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:16 crc kubenswrapper[4971]: I0320 10:18:16.090987 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-catalog-content\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:16 crc kubenswrapper[4971]: I0320 10:18:16.091632 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-utilities\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:16 crc kubenswrapper[4971]: I0320 10:18:16.116784 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz29q\" (UniqueName: \"kubernetes.io/projected/e1291caf-f069-4fbb-be0a-e92154f76f15-kube-api-access-lz29q\") pod \"community-operators-5dcbx\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:16 crc kubenswrapper[4971]: I0320 10:18:16.239873 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:16 crc kubenswrapper[4971]: I0320 10:18:16.855256 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5dcbx"] Mar 20 10:18:17 crc kubenswrapper[4971]: I0320 10:18:17.471697 4971 generic.go:334] "Generic (PLEG): container finished" podID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerID="9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec" exitCode=0 Mar 20 10:18:17 crc kubenswrapper[4971]: I0320 10:18:17.471735 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dcbx" event={"ID":"e1291caf-f069-4fbb-be0a-e92154f76f15","Type":"ContainerDied","Data":"9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec"} Mar 20 10:18:17 crc kubenswrapper[4971]: I0320 10:18:17.471758 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dcbx" event={"ID":"e1291caf-f069-4fbb-be0a-e92154f76f15","Type":"ContainerStarted","Data":"3a56d2457126ed6e5008592eebc789852c152a840d7016d66bbfd3321156eb03"} Mar 20 10:18:18 crc kubenswrapper[4971]: I0320 10:18:18.128209 4971 scope.go:117] "RemoveContainer" containerID="4ec08c24006c1eef334602e576bf79eda93230827073da4637d9c80cf47fe1e1" Mar 20 10:18:18 crc kubenswrapper[4971]: I0320 10:18:18.484991 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dcbx" event={"ID":"e1291caf-f069-4fbb-be0a-e92154f76f15","Type":"ContainerStarted","Data":"597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1"} Mar 20 10:18:19 crc kubenswrapper[4971]: I0320 10:18:19.494313 4971 generic.go:334] "Generic (PLEG): container finished" podID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerID="597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1" exitCode=0 Mar 20 10:18:19 crc kubenswrapper[4971]: I0320 10:18:19.494644 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dcbx" event={"ID":"e1291caf-f069-4fbb-be0a-e92154f76f15","Type":"ContainerDied","Data":"597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1"} Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.162038 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.162823 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.162867 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.163648 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.163708 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" 
containerID="cri-o://919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" gracePeriod=600 Mar 20 10:18:20 crc kubenswrapper[4971]: E0320 10:18:20.296188 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.519767 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dcbx" event={"ID":"e1291caf-f069-4fbb-be0a-e92154f76f15","Type":"ContainerStarted","Data":"469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0"} Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.522777 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" exitCode=0 Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.522826 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f"} Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.522860 4971 scope.go:117] "RemoveContainer" containerID="f75848c8286f55cc3d920b08f7a89dc5054b5048e3e2418181291bb7c5c3716f" Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.523426 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:18:20 crc kubenswrapper[4971]: E0320 10:18:20.523969 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:18:20 crc kubenswrapper[4971]: I0320 10:18:20.544112 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5dcbx" podStartSLOduration=3.102646641 podStartE2EDuration="5.544090927s" podCreationTimestamp="2026-03-20 10:18:15 +0000 UTC" firstStartedPulling="2026-03-20 10:18:17.47467206 +0000 UTC m=+12519.454546198" lastFinishedPulling="2026-03-20 10:18:19.916116346 +0000 UTC m=+12521.895990484" observedRunningTime="2026-03-20 10:18:20.539963459 +0000 UTC m=+12522.519837597" watchObservedRunningTime="2026-03-20 10:18:20.544090927 +0000 UTC m=+12522.523965065" Mar 20 10:18:26 crc kubenswrapper[4971]: I0320 10:18:26.240460 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:26 crc kubenswrapper[4971]: I0320 10:18:26.241156 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:26 crc kubenswrapper[4971]: I0320 10:18:26.286170 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:26 crc kubenswrapper[4971]: I0320 10:18:26.633872 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:26 crc kubenswrapper[4971]: I0320 10:18:26.684319 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5dcbx"] Mar 20 10:18:28 crc kubenswrapper[4971]: I0320 10:18:28.604505 
4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5dcbx" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerName="registry-server" containerID="cri-o://469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0" gracePeriod=2 Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.087699 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.159957 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz29q\" (UniqueName: \"kubernetes.io/projected/e1291caf-f069-4fbb-be0a-e92154f76f15-kube-api-access-lz29q\") pod \"e1291caf-f069-4fbb-be0a-e92154f76f15\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.160033 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-utilities\") pod \"e1291caf-f069-4fbb-be0a-e92154f76f15\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.160108 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-catalog-content\") pod \"e1291caf-f069-4fbb-be0a-e92154f76f15\" (UID: \"e1291caf-f069-4fbb-be0a-e92154f76f15\") " Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.161091 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-utilities" (OuterVolumeSpecName: "utilities") pod "e1291caf-f069-4fbb-be0a-e92154f76f15" (UID: "e1291caf-f069-4fbb-be0a-e92154f76f15"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.172589 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1291caf-f069-4fbb-be0a-e92154f76f15-kube-api-access-lz29q" (OuterVolumeSpecName: "kube-api-access-lz29q") pod "e1291caf-f069-4fbb-be0a-e92154f76f15" (UID: "e1291caf-f069-4fbb-be0a-e92154f76f15"). InnerVolumeSpecName "kube-api-access-lz29q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.213752 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1291caf-f069-4fbb-be0a-e92154f76f15" (UID: "e1291caf-f069-4fbb-be0a-e92154f76f15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.263177 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.263225 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz29q\" (UniqueName: \"kubernetes.io/projected/e1291caf-f069-4fbb-be0a-e92154f76f15-kube-api-access-lz29q\") on node \"crc\" DevicePath \"\"" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.263239 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1291caf-f069-4fbb-be0a-e92154f76f15-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.620588 4971 generic.go:334] "Generic (PLEG): container finished" podID="e1291caf-f069-4fbb-be0a-e92154f76f15" 
containerID="469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0" exitCode=0 Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.620653 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dcbx" event={"ID":"e1291caf-f069-4fbb-be0a-e92154f76f15","Type":"ContainerDied","Data":"469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0"} Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.620693 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5dcbx" event={"ID":"e1291caf-f069-4fbb-be0a-e92154f76f15","Type":"ContainerDied","Data":"3a56d2457126ed6e5008592eebc789852c152a840d7016d66bbfd3321156eb03"} Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.620715 4971 scope.go:117] "RemoveContainer" containerID="469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.620744 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5dcbx" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.655718 4971 scope.go:117] "RemoveContainer" containerID="597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.667089 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5dcbx"] Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.683483 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5dcbx"] Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.716550 4971 scope.go:117] "RemoveContainer" containerID="9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.741475 4971 scope.go:117] "RemoveContainer" containerID="469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0" Mar 20 10:18:29 crc kubenswrapper[4971]: E0320 10:18:29.742080 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0\": container with ID starting with 469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0 not found: ID does not exist" containerID="469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.742127 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0"} err="failed to get container status \"469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0\": rpc error: code = NotFound desc = could not find container \"469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0\": container with ID starting with 469a674dda3a2e70068b3bfe191e31013cdffb9e078e35bf48dcf4db7e0525a0 not 
found: ID does not exist" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.742154 4971 scope.go:117] "RemoveContainer" containerID="597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1" Mar 20 10:18:29 crc kubenswrapper[4971]: E0320 10:18:29.742756 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1\": container with ID starting with 597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1 not found: ID does not exist" containerID="597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.742819 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1"} err="failed to get container status \"597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1\": rpc error: code = NotFound desc = could not find container \"597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1\": container with ID starting with 597266d38042aa7edbc77cef5318337abbbe76d705a2337797d74781261820c1 not found: ID does not exist" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.742871 4971 scope.go:117] "RemoveContainer" containerID="9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec" Mar 20 10:18:29 crc kubenswrapper[4971]: E0320 10:18:29.743203 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec\": container with ID starting with 9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec not found: ID does not exist" containerID="9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec" Mar 20 10:18:29 crc kubenswrapper[4971]: I0320 10:18:29.743236 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec"} err="failed to get container status \"9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec\": rpc error: code = NotFound desc = could not find container \"9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec\": container with ID starting with 9c7fb74b6faa2240b386f1623c56730558c65d9d6d5123059ad119cb8f7f2fec not found: ID does not exist" Mar 20 10:18:29 crc kubenswrapper[4971]: E0320 10:18:29.867632 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1291caf_f069_4fbb_be0a_e92154f76f15.slice/crio-3a56d2457126ed6e5008592eebc789852c152a840d7016d66bbfd3321156eb03\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1291caf_f069_4fbb_be0a_e92154f76f15.slice\": RecentStats: unable to find data in memory cache]" Mar 20 10:18:30 crc kubenswrapper[4971]: I0320 10:18:30.742051 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" path="/var/lib/kubelet/pods/e1291caf-f069-4fbb-be0a-e92154f76f15/volumes" Mar 20 10:18:33 crc kubenswrapper[4971]: I0320 10:18:33.732539 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:18:33 crc kubenswrapper[4971]: E0320 10:18:33.733220 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" 
podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:18:48 crc kubenswrapper[4971]: I0320 10:18:48.743979 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:18:48 crc kubenswrapper[4971]: E0320 10:18:48.744756 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:18:51 crc kubenswrapper[4971]: I0320 10:18:51.513715 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3c261d6c-ad35-4558-a676-58606d0c78a5/init-config-reloader/0.log" Mar 20 10:18:51 crc kubenswrapper[4971]: I0320 10:18:51.768515 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3c261d6c-ad35-4558-a676-58606d0c78a5/config-reloader/0.log" Mar 20 10:18:51 crc kubenswrapper[4971]: I0320 10:18:51.790287 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3c261d6c-ad35-4558-a676-58606d0c78a5/init-config-reloader/0.log" Mar 20 10:18:51 crc kubenswrapper[4971]: I0320 10:18:51.798618 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_3c261d6c-ad35-4558-a676-58606d0c78a5/alertmanager/0.log" Mar 20 10:18:51 crc kubenswrapper[4971]: I0320 10:18:51.959218 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_05f6ce5b-1f18-46c8-92f8-984e7d98872e/aodh-api/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.001303 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-0_05f6ce5b-1f18-46c8-92f8-984e7d98872e/aodh-evaluator/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.013767 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_05f6ce5b-1f18-46c8-92f8-984e7d98872e/aodh-listener/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.090514 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_05f6ce5b-1f18-46c8-92f8-984e7d98872e/aodh-notifier/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.595903 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_454e3092-59bc-4593-becc-e3aad3af2f78/ceilometer-central-agent/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.661711 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-vmk68_477b98cd-b9c9-4411-8a82-73a1153e0cc2/bootstrap-openstack-openstack-cell1/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.731209 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_454e3092-59bc-4593-becc-e3aad3af2f78/ceilometer-notification-agent/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.816496 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-networker-wpznj_6631a763-6625-4978-a9dd-5ed4c4421ecc/bootstrap-openstack-openstack-networker/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.826445 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_454e3092-59bc-4593-becc-e3aad3af2f78/proxy-httpd/0.log" Mar 20 10:18:52 crc kubenswrapper[4971]: I0320 10:18:52.926899 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_454e3092-59bc-4593-becc-e3aad3af2f78/sg-core/0.log" Mar 20 10:18:53 crc kubenswrapper[4971]: I0320 10:18:53.014731 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-z6h6v_661512ef-b7a3-40be-820e-7d9670853666/ceph-client-openstack-openstack-cell1/0.log" Mar 20 10:18:53 crc kubenswrapper[4971]: I0320 10:18:53.409632 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc/cinder-api-log/0.log" Mar 20 10:18:53 crc kubenswrapper[4971]: I0320 10:18:53.445082 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fafe0d4a-a07e-4d27-a7d3-c951a2ef82bc/cinder-api/0.log" Mar 20 10:18:53 crc kubenswrapper[4971]: I0320 10:18:53.501270 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_747155f7-75a9-43bc-9726-f4fa4f618ee6/cinder-backup/0.log" Mar 20 10:18:53 crc kubenswrapper[4971]: I0320 10:18:53.806468 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_747155f7-75a9-43bc-9726-f4fa4f618ee6/probe/0.log" Mar 20 10:18:53 crc kubenswrapper[4971]: I0320 10:18:53.926782 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_40678394-304b-4d4d-8d8a-4f399a896a2a/cinder-scheduler/0.log" Mar 20 10:18:53 crc kubenswrapper[4971]: I0320 10:18:53.977360 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_40678394-304b-4d4d-8d8a-4f399a896a2a/probe/0.log" Mar 20 10:18:54 crc kubenswrapper[4971]: I0320 10:18:54.197290 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_82e8fffd-a0f2-4591-9a54-ffc5227fac62/probe/0.log" Mar 20 10:18:54 crc kubenswrapper[4971]: I0320 10:18:54.602015 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-x6q7m_1af0481a-a5e1-42e4-aa6e-ac3d63407b66/configure-network-openstack-openstack-cell1/0.log" Mar 20 10:18:54 crc kubenswrapper[4971]: I0320 10:18:54.800089 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-openstack-openstack-networker-lgllz_243abc57-20e2-431d-8714-e3ffd3a27ea6/configure-network-openstack-openstack-networker/0.log" Mar 20 10:18:55 crc kubenswrapper[4971]: I0320 10:18:55.360959 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-qhvnn_5e4faacf-b8fc-428e-8e94-cd873f7bbea8/configure-os-openstack-openstack-cell1/0.log" Mar 20 10:18:55 crc kubenswrapper[4971]: I0320 10:18:55.541514 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d56f8b6fc-rxkhz_11e35d68-eda6-4edb-a201-b267628ce07f/init/0.log" Mar 20 10:18:55 crc kubenswrapper[4971]: I0320 10:18:55.552518 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-networker-9vwnm_10e60012-e649-4b87-bf60-98f44a54ef4c/configure-os-openstack-openstack-networker/0.log" Mar 20 10:18:55 crc kubenswrapper[4971]: I0320 10:18:55.677760 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d56f8b6fc-rxkhz_11e35d68-eda6-4edb-a201-b267628ce07f/init/0.log" Mar 20 10:18:55 crc kubenswrapper[4971]: I0320 10:18:55.883148 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d56f8b6fc-rxkhz_11e35d68-eda6-4edb-a201-b267628ce07f/dnsmasq-dns/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.234250 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-grhlv_efb74ee9-ca4d-4087-a070-4b3f17683800/download-cache-openstack-openstack-cell1/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.257946 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_051896ec-c7dc-48da-acab-6b65bf92d80e/glance-httpd/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.278287 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-openstack-openstack-networker-jcgvt_b6855ca9-d129-4c73-a2ed-cc7059287a06/download-cache-openstack-openstack-networker/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.421785 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_051896ec-c7dc-48da-acab-6b65bf92d80e/glance-log/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.507028 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_82e8fffd-a0f2-4591-9a54-ffc5227fac62/cinder-volume/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.554705 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fb8fbce1-2aa7-445f-aed3-09344c7e1f30/glance-httpd/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.569662 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fb8fbce1-2aa7-445f-aed3-09344c7e1f30/glance-log/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.892270 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-68cbbccf66-npd8p_4caadd7e-7246-481f-8093-627230427f16/heat-api/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.911786 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-66bdd8fc4b-czgn7_4893772c-593c-4f8b-9f44-8a786adeadb9/heat-engine/0.log" Mar 20 10:18:56 crc kubenswrapper[4971]: I0320 10:18:56.963219 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-769d45db65-qj7ql_dcba81ea-8871-4cdd-aeae-61f58557c44c/heat-cfnapi/0.log" Mar 20 10:18:57 crc kubenswrapper[4971]: I0320 10:18:57.160726 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7c6cddf495-pndtg_d5207d51-a124-4d40-bfef-2dcc09b9fade/horizon/0.log" Mar 20 10:18:57 crc kubenswrapper[4971]: I0320 10:18:57.178778 4971 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7c6cddf495-pndtg_d5207d51-a124-4d40-bfef-2dcc09b9fade/horizon-log/0.log" Mar 20 10:18:57 crc kubenswrapper[4971]: I0320 10:18:57.225651 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-gdhcx_aaabaf42-dd65-48db-a6da-18e2ba99b975/install-certs-openstack-openstack-cell1/0.log" Mar 20 10:18:57 crc kubenswrapper[4971]: I0320 10:18:57.401060 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-networker-9w5xv_e7dbaa85-bc3c-47de-b48d-7d72c7d8f406/install-certs-openstack-openstack-networker/0.log" Mar 20 10:18:57 crc kubenswrapper[4971]: I0320 10:18:57.884895 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29566621-kzfm9_4231087a-ce1e-4bde-b686-0f7045eeb5d8/keystone-cron/0.log" Mar 20 10:18:58 crc kubenswrapper[4971]: I0320 10:18:58.085102 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29566681-p2q5h_acd620ca-33e4-482a-8105-31e3355d012e/keystone-cron/0.log" Mar 20 10:18:58 crc kubenswrapper[4971]: I0320 10:18:58.304149 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3663f86e-c27b-4585-8201-8115f0e04501/kube-state-metrics/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.026945 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-8z2kz_024091ee-f909-41ef-8ca6-0894aed949cd/install-os-openstack-openstack-cell1/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.102255 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-networker-n7gwd_e8634c18-a188-491c-96ee-8c23c7ec8a67/install-os-openstack-openstack-networker/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.379471 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-api-0_ea8432ff-5123-4390-99af-d909291447f0/manila-api/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.485393 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7948b77f69-j5269_97c1ece7-9e4a-46a6-a86c-bdde47c1ab39/keystone-api/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.607125 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_6f089194-8fcc-4c1f-9e2f-61103e076753/probe/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.693361 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_6f089194-8fcc-4c1f-9e2f-61103e076753/manila-scheduler/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.743383 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_ea8432ff-5123-4390-99af-d909291447f0/manila-api-log/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.920922 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_22ef81f8-ccbc-4cbc-b638-cf3a998e2eca/probe/0.log" Mar 20 10:18:59 crc kubenswrapper[4971]: I0320 10:18:59.928242 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_22ef81f8-ccbc-4cbc-b638-cf3a998e2eca/manila-share/0.log" Mar 20 10:19:00 crc kubenswrapper[4971]: I0320 10:19:00.554357 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c7c478cfc-2twl9_1bf12dad-fd1a-4a84-9f23-b9b9ca622268/neutron-httpd/0.log" Mar 20 10:19:00 crc kubenswrapper[4971]: I0320 10:19:00.665357 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-98nwv_30ac44cf-4f9c-43b0-974e-8e86f25451e8/libvirt-openstack-openstack-cell1/0.log" Mar 20 10:19:01 crc kubenswrapper[4971]: I0320 10:19:01.016707 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-jqdkh_42809274-e4e2-41bf-bac0-163763e81df4/neutron-dhcp-openstack-openstack-cell1/0.log" Mar 20 10:19:01 crc kubenswrapper[4971]: I0320 10:19:01.144669 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c7c478cfc-2twl9_1bf12dad-fd1a-4a84-9f23-b9b9ca622268/neutron-api/0.log" Mar 20 10:19:01 crc kubenswrapper[4971]: I0320 10:19:01.226670 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-v4l72_921173cb-9fd9-4b2a-bb4c-ea22aaa0f565/neutron-metadata-openstack-openstack-cell1/0.log" Mar 20 10:19:01 crc kubenswrapper[4971]: I0320 10:19:01.534548 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-networker-kl245_90cbcdec-a80c-4c34-9130-ecbaa12a36b0/neutron-metadata-openstack-openstack-networker/0.log" Mar 20 10:19:01 crc kubenswrapper[4971]: I0320 10:19:01.633729 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-zdbn9_da19c2aa-8814-45f3-8f80-f9aebff030fd/neutron-sriov-openstack-openstack-cell1/0.log" Mar 20 10:19:01 crc kubenswrapper[4971]: I0320 10:19:01.662703 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a288064d-50aa-42c6-8944-3a6c3b9d6c77/nova-api-api/0.log" Mar 20 10:19:02 crc kubenswrapper[4971]: I0320 10:19:02.047148 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a288064d-50aa-42c6-8944-3a6c3b9d6c77/nova-api-log/0.log" Mar 20 10:19:02 crc kubenswrapper[4971]: I0320 10:19:02.082626 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_be09f583-3216-4725-abc2-bb13ba7585cd/nova-cell0-conductor-conductor/0.log" Mar 20 10:19:02 crc kubenswrapper[4971]: I0320 10:19:02.116704 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_b0e45f2f-b018-4874-b9cc-a933274eee0c/nova-cell1-conductor-conductor/0.log" Mar 20 10:19:02 crc kubenswrapper[4971]: I0320 10:19:02.386774 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d4573bfc-d861-400d-bd97-475733d58617/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 10:19:02 crc kubenswrapper[4971]: I0320 10:19:02.734833 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:19:02 crc kubenswrapper[4971]: E0320 10:19:02.735840 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:19:03 crc kubenswrapper[4971]: I0320 10:19:03.005049 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_073652bb-5847-4af9-9bca-9d350eacf6fc/nova-metadata-log/0.log" Mar 20 10:19:03 crc kubenswrapper[4971]: I0320 10:19:03.202780 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_073652bb-5847-4af9-9bca-9d350eacf6fc/nova-metadata-metadata/0.log" Mar 20 10:19:03 crc kubenswrapper[4971]: I0320 10:19:03.643322 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f51e9be1-1904-4886-967b-87f6e898d67d/nova-scheduler-scheduler/0.log" Mar 20 10:19:03 crc kubenswrapper[4971]: I0320 10:19:03.802996 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_52099a51-54ac-4611-89cf-191426ae31d7/mysql-bootstrap/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.018110 4971 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_52099a51-54ac-4611-89cf-191426ae31d7/mysql-bootstrap/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.127240 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_52099a51-54ac-4611-89cf-191426ae31d7/galera/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.372156 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d6c9df9-b2df-4747-a0c7-0a6684689438/mysql-bootstrap/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.380182 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell7lj9f_a8a4d9ea-5399-4aa0-9c74-5c6013261268/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.611637 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d6c9df9-b2df-4747-a0c7-0a6684689438/mysql-bootstrap/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.612396 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-4ng6s_41f11d07-6bdd-455e-be27-f2b7a23c7f7a/nova-cell1-openstack-openstack-cell1/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.700958 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d6c9df9-b2df-4747-a0c7-0a6684689438/galera/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.795311 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0a492260-faae-40fb-887d-46a5be1c4c5e/openstackclient/0.log" Mar 20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.828865 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_331e05d9-b918-47df-94c6-ad8f16ad4f05/openstack-network-exporter/0.log" Mar 
20 10:19:04 crc kubenswrapper[4971]: I0320 10:19:04.973257 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_331e05d9-b918-47df-94c6-ad8f16ad4f05/ovn-northd/0.log" Mar 20 10:19:05 crc kubenswrapper[4971]: I0320 10:19:05.377886 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e2bebfec-671e-453b-855a-0a1c4b6a3109/openstack-network-exporter/0.log" Mar 20 10:19:05 crc kubenswrapper[4971]: I0320 10:19:05.401459 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-bcpc6_f74aa1f9-4acb-48cd-90a1-10ff27996a3a/ovn-openstack-openstack-cell1/0.log" Mar 20 10:19:05 crc kubenswrapper[4971]: I0320 10:19:05.527748 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-networker-kn8xb_49dff616-ace8-40f1-a666-5fdf6e5ab3d6/ovn-openstack-openstack-networker/0.log" Mar 20 10:19:05 crc kubenswrapper[4971]: I0320 10:19:05.546903 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e2bebfec-671e-453b-855a-0a1c4b6a3109/ovsdbserver-nb/0.log" Mar 20 10:19:05 crc kubenswrapper[4971]: I0320 10:19:05.728637 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_69ef94eb-93b4-42b8-8f8b-4ad69843586d/openstack-network-exporter/0.log" Mar 20 10:19:05 crc kubenswrapper[4971]: I0320 10:19:05.798158 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_69ef94eb-93b4-42b8-8f8b-4ad69843586d/ovsdbserver-nb/0.log" Mar 20 10:19:05 crc kubenswrapper[4971]: I0320 10:19:05.887722 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_66a65a08-59f0-4d4f-bde1-db66ae04d2ff/openstack-network-exporter/0.log" Mar 20 10:19:05 crc kubenswrapper[4971]: I0320 10:19:05.953691 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-2_66a65a08-59f0-4d4f-bde1-db66ae04d2ff/ovsdbserver-nb/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 10:19:06.145479 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f19df11e-015e-4c86-8369-c2bafe67a087/openstack-network-exporter/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 10:19:06.163681 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f19df11e-015e-4c86-8369-c2bafe67a087/ovsdbserver-sb/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 10:19:06.323741 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_10014eeb-477f-4f79-ad5d-6412a0ba2dfd/openstack-network-exporter/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 10:19:06.348244 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_10014eeb-477f-4f79-ad5d-6412a0ba2dfd/ovsdbserver-sb/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 10:19:06.448958 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_de027a38-7366-4724-b961-dd311e4cfd46/openstack-network-exporter/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 10:19:06.588750 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_de027a38-7366-4724-b961-dd311e4cfd46/ovsdbserver-sb/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 10:19:06.936564 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-opensta-6afnxblg_9373f38e-bed2-4e2e-b4de-0ec24546a79c/pre-adoption-validation-openstack-pre-adoption-opensta-6af7d7e4/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 10:19:06.971154 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5c9b55c9fb-k6gqg_21e946e7-e1e2-469c-b61a-b6bf7c3e9543/placement-api/0.log" Mar 20 10:19:06 crc kubenswrapper[4971]: I0320 
10:19:06.980182 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5c9b55c9fb-k6gqg_21e946e7-e1e2-469c-b61a-b6bf7c3e9543/placement-log/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.202780 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_903b0df1-cef0-4077-9033-c534831e6ca8/init-config-reloader/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.221645 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cvf2hd_d8021a6e-4275-43c1-9054-97946bdec61f/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.405760 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_903b0df1-cef0-4077-9033-c534831e6ca8/config-reloader/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.451167 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_903b0df1-cef0-4077-9033-c534831e6ca8/prometheus/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.469350 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_903b0df1-cef0-4077-9033-c534831e6ca8/thanos-sidecar/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.474677 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_903b0df1-cef0-4077-9033-c534831e6ca8/init-config-reloader/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.697123 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e020080-54ed-44d0-ad96-8da6f1aa9bf0/setup-container/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.849126 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e020080-54ed-44d0-ad96-8da6f1aa9bf0/setup-container/0.log" Mar 20 10:19:07 crc kubenswrapper[4971]: I0320 10:19:07.934088 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e020080-54ed-44d0-ad96-8da6f1aa9bf0/rabbitmq/0.log" Mar 20 10:19:08 crc kubenswrapper[4971]: I0320 10:19:08.006744 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_afb0da72-3952-4b60-8cd1-00b036d211d4/setup-container/0.log" Mar 20 10:19:08 crc kubenswrapper[4971]: I0320 10:19:08.287858 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_afb0da72-3952-4b60-8cd1-00b036d211d4/setup-container/0.log" Mar 20 10:19:08 crc kubenswrapper[4971]: I0320 10:19:08.345801 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-9n2zb_57d329f8-1d09-47d2-9574-b86286f74c11/reboot-os-openstack-openstack-cell1/0.log" Mar 20 10:19:08 crc kubenswrapper[4971]: I0320 10:19:08.381099 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_afb0da72-3952-4b60-8cd1-00b036d211d4/rabbitmq/0.log" Mar 20 10:19:08 crc kubenswrapper[4971]: I0320 10:19:08.511335 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-networker-qkk4w_193efd84-721d-48f2-95fd-714abfe9bd94/reboot-os-openstack-openstack-networker/0.log" Mar 20 10:19:08 crc kubenswrapper[4971]: I0320 10:19:08.710936 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-rvwvh_3f4b21df-b676-4340-8510-54195b3f2be5/run-os-openstack-openstack-cell1/0.log" Mar 20 10:19:08 crc kubenswrapper[4971]: I0320 10:19:08.899001 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-openstack-openstack-networker-ljn9j_d1385615-c486-46fe-889f-950bbb3699b7/run-os-openstack-openstack-networker/0.log" Mar 20 10:19:08 crc kubenswrapper[4971]: I0320 10:19:08.962827 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-4kbvw_a0305b41-b831-4275-8e13-f5855d8f4f73/ssh-known-hosts-openstack/0.log" Mar 20 10:19:09 crc kubenswrapper[4971]: I0320 10:19:09.304267 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_14c39a8b-cdb6-4be3-a3cc-57c5b8ddfc3c/tempest-tests-tempest-tests-runner/0.log" Mar 20 10:19:09 crc kubenswrapper[4971]: I0320 10:19:09.390189 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_7d649566-4edc-4e27-91cd-b98e8c74ceaf/test-operator-logs-container/0.log" Mar 20 10:19:10 crc kubenswrapper[4971]: I0320 10:19:10.114513 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-zl54b_5992b06a-eb27-4e13-9086-b1cc8a80cca7/validate-network-openstack-openstack-cell1/0.log" Mar 20 10:19:10 crc kubenswrapper[4971]: I0320 10:19:10.456232 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-cjr5v_4fc44776-0838-4b1a-844c-a16ffd83af56/telemetry-openstack-openstack-cell1/0.log" Mar 20 10:19:10 crc kubenswrapper[4971]: I0320 10:19:10.520969 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-networker-255gb_d17f937d-4808-496f-94ba-f85e03c371ff/validate-network-openstack-openstack-networker/0.log" Mar 20 10:19:12 crc kubenswrapper[4971]: I0320 10:19:12.368473 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-5ptjb_3f14a6df-ffd2-484e-9097-4e30054d8f42/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Mar 20 
10:19:13 crc kubenswrapper[4971]: I0320 10:19:13.732121 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:19:13 crc kubenswrapper[4971]: E0320 10:19:13.733416 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:19:14 crc kubenswrapper[4971]: I0320 10:19:14.975105 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-networker-7ljrd_004105ec-05ab-464c-ac28-5a3f5b8ef32c/tripleo-cleanup-tripleo-cleanup-openstack-networker/0.log" Mar 20 10:19:25 crc kubenswrapper[4971]: I0320 10:19:25.731797 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:19:25 crc kubenswrapper[4971]: E0320 10:19:25.732766 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:19:26 crc kubenswrapper[4971]: I0320 10:19:26.098826 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d0878a99-9a41-4556-a72f-21fb18a5761d/memcached/0.log" Mar 20 10:19:35 crc kubenswrapper[4971]: I0320 10:19:35.767669 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl_8be29c57-53f2-4dcd-955c-9597cda034bd/util/0.log" Mar 20 10:19:35 crc kubenswrapper[4971]: I0320 10:19:35.974480 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl_8be29c57-53f2-4dcd-955c-9597cda034bd/pull/0.log" Mar 20 10:19:36 crc kubenswrapper[4971]: I0320 10:19:36.010843 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl_8be29c57-53f2-4dcd-955c-9597cda034bd/pull/0.log" Mar 20 10:19:36 crc kubenswrapper[4971]: I0320 10:19:36.036870 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl_8be29c57-53f2-4dcd-955c-9597cda034bd/util/0.log" Mar 20 10:19:36 crc kubenswrapper[4971]: I0320 10:19:36.216475 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl_8be29c57-53f2-4dcd-955c-9597cda034bd/extract/0.log" Mar 20 10:19:36 crc kubenswrapper[4971]: I0320 10:19:36.217325 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl_8be29c57-53f2-4dcd-955c-9597cda034bd/pull/0.log" Mar 20 10:19:36 crc kubenswrapper[4971]: I0320 10:19:36.238289 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c7v2tl_8be29c57-53f2-4dcd-955c-9597cda034bd/util/0.log" Mar 20 10:19:36 crc kubenswrapper[4971]: I0320 10:19:36.509509 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-tzs7r_f478a4fa-f1af-4829-84e7-d42e3518b311/manager/0.log" Mar 20 10:19:36 crc 
kubenswrapper[4971]: I0320 10:19:36.691429 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-z4rvv_f03668cf-d39b-496d-84cf-2f1c80162661/manager/0.log" Mar 20 10:19:37 crc kubenswrapper[4971]: I0320 10:19:37.008472 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-6z7mk_23b236ba-b8a6-4025-a093-cd3e7ccce1a2/manager/0.log" Mar 20 10:19:37 crc kubenswrapper[4971]: I0320 10:19:37.066987 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-lsfqr_f899ae6a-129e-4909-83b2-64f4a270d3aa/manager/0.log" Mar 20 10:19:37 crc kubenswrapper[4971]: I0320 10:19:37.226178 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-zx4c5_8d8e9e17-63d7-4de7-83d4-114caa076970/manager/0.log" Mar 20 10:19:37 crc kubenswrapper[4971]: I0320 10:19:37.646929 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-fkl5f_161e42ba-c8e5-40a0-be42-8fba7ea0428c/manager/0.log" Mar 20 10:19:37 crc kubenswrapper[4971]: I0320 10:19:37.733968 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:19:37 crc kubenswrapper[4971]: E0320 10:19:37.735194 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:19:38 crc kubenswrapper[4971]: I0320 10:19:38.093853 4971 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-6xt47_524c42fa-40bb-435d-a5ef-b1cd805b5fdb/manager/0.log" Mar 20 10:19:38 crc kubenswrapper[4971]: I0320 10:19:38.149360 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-srgtw_42b39042-7dfe-4946-bcfa-7bd424edb636/manager/0.log" Mar 20 10:19:38 crc kubenswrapper[4971]: I0320 10:19:38.327919 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-hm96v_14a1436b-84d2-4739-b16f-1f0cd580a0dd/manager/0.log" Mar 20 10:19:38 crc kubenswrapper[4971]: I0320 10:19:38.482584 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-wnjvf_804e22b3-ff42-4280-b6b2-fb431b9311b0/manager/0.log" Mar 20 10:19:38 crc kubenswrapper[4971]: I0320 10:19:38.645489 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-25vg8_b3e37501-4241-4d3b-bd19-9764455a723c/manager/0.log" Mar 20 10:19:38 crc kubenswrapper[4971]: I0320 10:19:38.970006 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-j26ds_0fab6976-2703-4123-9188-3cb727ffe701/manager/0.log" Mar 20 10:19:39 crc kubenswrapper[4971]: I0320 10:19:39.148200 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-6jc7r_c2cf6754-c345-4bd4-81de-d652e506f0cf/manager/0.log" Mar 20 10:19:39 crc kubenswrapper[4971]: I0320 10:19:39.239214 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-74c4796899pz5xg_13ed1180-8ed1-4fc9-9dd4-4fb3d9b23f3a/manager/0.log" Mar 20 10:19:39 crc kubenswrapper[4971]: I0320 10:19:39.508934 
4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-b85c4d696-9jzcg_e879631c-0237-44c7-8682-b615034de536/operator/0.log" Mar 20 10:19:40 crc kubenswrapper[4971]: I0320 10:19:40.049586 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-k8spn_956174ac-8d4d-4dcf-90ad-79ee8322ebdb/registry-server/0.log" Mar 20 10:19:40 crc kubenswrapper[4971]: I0320 10:19:40.325267 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-d9nz2_b9bc9bf8-93d8-4b02-b563-ed18ef89944d/manager/0.log" Mar 20 10:19:40 crc kubenswrapper[4971]: I0320 10:19:40.570288 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-lfdpc_1fc90bc6-1bb1-4357-a357-45bd8795ad02/manager/0.log" Mar 20 10:19:40 crc kubenswrapper[4971]: I0320 10:19:40.731042 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-n2tv6_10892b2b-3d7a-4994-8779-8a22ac48ca70/operator/0.log" Mar 20 10:19:40 crc kubenswrapper[4971]: I0320 10:19:40.831529 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-nnng5_d0bcdd3a-90b0-42ce-aaa3-c42379e95e2b/manager/0.log" Mar 20 10:19:41 crc kubenswrapper[4971]: I0320 10:19:41.043037 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-2g4lc_1725120f-c2b6-4438-8dc0-0758d75b0ead/manager/0.log" Mar 20 10:19:41 crc kubenswrapper[4971]: I0320 10:19:41.193008 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-8k5hb_a7c0469d-7b51-403e-b190-f42fd2668900/manager/0.log" Mar 20 10:19:41 crc kubenswrapper[4971]: I0320 10:19:41.206154 4971 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-pjg97_d05cc1d7-7da4-4950-93c0-84fd9ea90403/manager/0.log" Mar 20 10:19:41 crc kubenswrapper[4971]: I0320 10:19:41.364191 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-ssscj_f4b444d6-b24e-4055-b020-c367f5c5a144/manager/0.log" Mar 20 10:19:42 crc kubenswrapper[4971]: I0320 10:19:42.507026 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86bd8996f6-ktmn5_1be5b0dc-cddc-44fb-bb47-d92e3295312c/manager/0.log" Mar 20 10:19:51 crc kubenswrapper[4971]: I0320 10:19:51.733137 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:19:51 crc kubenswrapper[4971]: E0320 10:19:51.734087 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.161770 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566700-dp87m"] Mar 20 10:20:00 crc kubenswrapper[4971]: E0320 10:20:00.162844 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerName="registry-server" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.162859 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerName="registry-server" Mar 20 10:20:00 crc kubenswrapper[4971]: E0320 10:20:00.162880 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerName="extract-content" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.162887 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerName="extract-content" Mar 20 10:20:00 crc kubenswrapper[4971]: E0320 10:20:00.162935 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerName="extract-utilities" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.162944 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerName="extract-utilities" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.163184 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1291caf-f069-4fbb-be0a-e92154f76f15" containerName="registry-server" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.164056 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566700-dp87m" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.167838 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.168962 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.169137 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.177190 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566700-dp87m"] Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.177456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wsps\" (UniqueName: \"kubernetes.io/projected/076c4058-a8ca-4c9a-a140-8d73445634a6-kube-api-access-9wsps\") pod \"auto-csr-approver-29566700-dp87m\" (UID: \"076c4058-a8ca-4c9a-a140-8d73445634a6\") " pod="openshift-infra/auto-csr-approver-29566700-dp87m" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.280067 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wsps\" (UniqueName: \"kubernetes.io/projected/076c4058-a8ca-4c9a-a140-8d73445634a6-kube-api-access-9wsps\") pod \"auto-csr-approver-29566700-dp87m\" (UID: \"076c4058-a8ca-4c9a-a140-8d73445634a6\") " pod="openshift-infra/auto-csr-approver-29566700-dp87m" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.306290 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wsps\" (UniqueName: \"kubernetes.io/projected/076c4058-a8ca-4c9a-a140-8d73445634a6-kube-api-access-9wsps\") pod \"auto-csr-approver-29566700-dp87m\" (UID: \"076c4058-a8ca-4c9a-a140-8d73445634a6\") " 
pod="openshift-infra/auto-csr-approver-29566700-dp87m" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.498298 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566700-dp87m" Mar 20 10:20:00 crc kubenswrapper[4971]: I0320 10:20:00.990216 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566700-dp87m"] Mar 20 10:20:01 crc kubenswrapper[4971]: I0320 10:20:01.186127 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fn7gp_c9ad26c3-6a8e-41df-9879-b7ff5f77fdea/control-plane-machine-set-operator/0.log" Mar 20 10:20:01 crc kubenswrapper[4971]: I0320 10:20:01.379654 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fw8fs_6f5bffa0-8b50-4aac-a108-edc093dfc2bf/kube-rbac-proxy/0.log" Mar 20 10:20:01 crc kubenswrapper[4971]: I0320 10:20:01.380814 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fw8fs_6f5bffa0-8b50-4aac-a108-edc093dfc2bf/machine-api-operator/0.log" Mar 20 10:20:01 crc kubenswrapper[4971]: I0320 10:20:01.645092 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566700-dp87m" event={"ID":"076c4058-a8ca-4c9a-a140-8d73445634a6","Type":"ContainerStarted","Data":"7c5ab2a65b63f368fe7c78051d2aca0fb2bcfadb685fab920b1c6f6a18827f8c"} Mar 20 10:20:02 crc kubenswrapper[4971]: I0320 10:20:02.732088 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:20:02 crc kubenswrapper[4971]: E0320 10:20:02.732667 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:20:03 crc kubenswrapper[4971]: I0320 10:20:03.683308 4971 generic.go:334] "Generic (PLEG): container finished" podID="076c4058-a8ca-4c9a-a140-8d73445634a6" containerID="596ed9447a6afcf3015746121e3448b433d94782eb2f456d988033f163907522" exitCode=0 Mar 20 10:20:03 crc kubenswrapper[4971]: I0320 10:20:03.683361 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566700-dp87m" event={"ID":"076c4058-a8ca-4c9a-a140-8d73445634a6","Type":"ContainerDied","Data":"596ed9447a6afcf3015746121e3448b433d94782eb2f456d988033f163907522"} Mar 20 10:20:05 crc kubenswrapper[4971]: I0320 10:20:05.092836 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566700-dp87m" Mar 20 10:20:05 crc kubenswrapper[4971]: I0320 10:20:05.173294 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wsps\" (UniqueName: \"kubernetes.io/projected/076c4058-a8ca-4c9a-a140-8d73445634a6-kube-api-access-9wsps\") pod \"076c4058-a8ca-4c9a-a140-8d73445634a6\" (UID: \"076c4058-a8ca-4c9a-a140-8d73445634a6\") " Mar 20 10:20:05 crc kubenswrapper[4971]: I0320 10:20:05.184968 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076c4058-a8ca-4c9a-a140-8d73445634a6-kube-api-access-9wsps" (OuterVolumeSpecName: "kube-api-access-9wsps") pod "076c4058-a8ca-4c9a-a140-8d73445634a6" (UID: "076c4058-a8ca-4c9a-a140-8d73445634a6"). InnerVolumeSpecName "kube-api-access-9wsps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:20:05 crc kubenswrapper[4971]: I0320 10:20:05.275892 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wsps\" (UniqueName: \"kubernetes.io/projected/076c4058-a8ca-4c9a-a140-8d73445634a6-kube-api-access-9wsps\") on node \"crc\" DevicePath \"\"" Mar 20 10:20:05 crc kubenswrapper[4971]: I0320 10:20:05.701732 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566700-dp87m" event={"ID":"076c4058-a8ca-4c9a-a140-8d73445634a6","Type":"ContainerDied","Data":"7c5ab2a65b63f368fe7c78051d2aca0fb2bcfadb685fab920b1c6f6a18827f8c"} Mar 20 10:20:05 crc kubenswrapper[4971]: I0320 10:20:05.701769 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c5ab2a65b63f368fe7c78051d2aca0fb2bcfadb685fab920b1c6f6a18827f8c" Mar 20 10:20:05 crc kubenswrapper[4971]: I0320 10:20:05.702177 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566700-dp87m" Mar 20 10:20:06 crc kubenswrapper[4971]: I0320 10:20:06.172527 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566694-f94kc"] Mar 20 10:20:06 crc kubenswrapper[4971]: I0320 10:20:06.182494 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566694-f94kc"] Mar 20 10:20:06 crc kubenswrapper[4971]: I0320 10:20:06.777075 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333d7fba-8587-41ea-a463-a4c9866ec202" path="/var/lib/kubelet/pods/333d7fba-8587-41ea-a463-a4c9866ec202/volumes" Mar 20 10:20:15 crc kubenswrapper[4971]: I0320 10:20:15.249486 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-kmsrl_fd1800e7-08ec-4ea1-80b2-2958aef7113b/cert-manager-controller/0.log" Mar 20 10:20:15 crc kubenswrapper[4971]: I0320 10:20:15.326580 4971 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-5hknf_fabc83f0-ade8-49c7-a868-3de1566e8280/cert-manager-cainjector/0.log" Mar 20 10:20:15 crc kubenswrapper[4971]: I0320 10:20:15.416172 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-6zp78_7e167de7-5293-49db-818d-813af89b693e/cert-manager-webhook/0.log" Mar 20 10:20:16 crc kubenswrapper[4971]: I0320 10:20:16.733057 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:20:16 crc kubenswrapper[4971]: E0320 10:20:16.733333 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:20:18 crc kubenswrapper[4971]: I0320 10:20:18.263502 4971 scope.go:117] "RemoveContainer" containerID="8aee91ac246f248e4ac38f489111d1d5ab57b061b31d678e50f4dc99ae6a4bc8" Mar 20 10:20:28 crc kubenswrapper[4971]: I0320 10:20:28.647462 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-zl9t9_0fc8c692-e8ea-4d20-91a2-41c340d9cb66/nmstate-console-plugin/0.log" Mar 20 10:20:28 crc kubenswrapper[4971]: I0320 10:20:28.741369 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:20:28 crc kubenswrapper[4971]: E0320 10:20:28.741674 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:20:28 crc kubenswrapper[4971]: I0320 10:20:28.810433 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-smvq8_b8bca7dc-fb19-45b9-8bad-b2ee90232d46/nmstate-handler/0.log" Mar 20 10:20:28 crc kubenswrapper[4971]: I0320 10:20:28.869435 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c8zms_ae395f28-6667-46bf-81c3-af54d7c1c743/kube-rbac-proxy/0.log" Mar 20 10:20:28 crc kubenswrapper[4971]: I0320 10:20:28.903545 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c8zms_ae395f28-6667-46bf-81c3-af54d7c1c743/nmstate-metrics/0.log" Mar 20 10:20:29 crc kubenswrapper[4971]: I0320 10:20:29.059045 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-j886c_2659746d-8c1c-475b-ad3b-f57197f14895/nmstate-operator/0.log" Mar 20 10:20:29 crc kubenswrapper[4971]: I0320 10:20:29.142916 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rtfrt_81500b95-85cb-41cc-bee8-d3d56d47ff4d/nmstate-webhook/0.log" Mar 20 10:20:40 crc kubenswrapper[4971]: I0320 10:20:40.733515 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:20:40 crc kubenswrapper[4971]: E0320 10:20:40.734686 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:20:42 crc kubenswrapper[4971]: I0320 10:20:42.819902 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-tqdrq_05730a75-1c9d-4199-bd62-176210c9207a/prometheus-operator/0.log" Mar 20 10:20:43 crc kubenswrapper[4971]: I0320 10:20:43.077774 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-657d9b77d4-b797l_4aeb8dc6-151a-40dc-871e-6145135c7fdf/prometheus-operator-admission-webhook/0.log" Mar 20 10:20:43 crc kubenswrapper[4971]: I0320 10:20:43.078133 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2_1be3085b-b74b-4086-ab50-b41ac948660c/prometheus-operator-admission-webhook/0.log" Mar 20 10:20:43 crc kubenswrapper[4971]: I0320 10:20:43.252780 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-jt588_a64188ef-f360-424b-b0f7-4b6f3801e47e/operator/0.log" Mar 20 10:20:43 crc kubenswrapper[4971]: I0320 10:20:43.275157 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-86dc4f68fc-bsgd2_7d563a7f-a865-46b9-b775-cb3c61eae815/perses-operator/0.log" Mar 20 10:20:53 crc kubenswrapper[4971]: I0320 10:20:53.732844 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:20:53 crc kubenswrapper[4971]: E0320 10:20:53.733616 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:20:57 crc kubenswrapper[4971]: I0320 10:20:57.389883 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-l7zzc_30af6211-fbfb-4319-8354-b2e08e781f2c/kube-rbac-proxy/0.log" Mar 20 10:20:57 crc kubenswrapper[4971]: I0320 10:20:57.622092 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-47mxg_0be71de9-783d-4467-bb0c-d8cb89f6bf38/frr-k8s-webhook-server/0.log" Mar 20 10:20:57 crc kubenswrapper[4971]: I0320 10:20:57.842912 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-l7zzc_30af6211-fbfb-4319-8354-b2e08e781f2c/controller/0.log" Mar 20 10:20:57 crc kubenswrapper[4971]: I0320 10:20:57.869040 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-frr-files/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.066772 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-frr-files/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.088233 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-metrics/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.105890 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-reloader/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.111506 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-reloader/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.309224 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-frr-files/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.320928 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-metrics/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.344205 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-reloader/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.346275 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-metrics/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.902199 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-metrics/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.918414 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-frr-files/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.931695 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/controller/0.log" Mar 20 10:20:58 crc kubenswrapper[4971]: I0320 10:20:58.965361 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/cp-reloader/0.log" Mar 20 10:20:59 crc kubenswrapper[4971]: I0320 10:20:59.107982 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/kube-rbac-proxy/0.log" Mar 20 10:20:59 crc kubenswrapper[4971]: I0320 10:20:59.109727 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/frr-metrics/0.log" Mar 20 10:20:59 crc kubenswrapper[4971]: I0320 10:20:59.212117 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/kube-rbac-proxy-frr/0.log" Mar 20 10:20:59 crc kubenswrapper[4971]: I0320 10:20:59.314725 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/reloader/0.log" Mar 20 10:20:59 crc kubenswrapper[4971]: I0320 10:20:59.524274 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fc6897bfd-j6pdm_19ab5c82-2f79-47be-a7ba-68cdbe00f4b7/manager/0.log" Mar 20 10:20:59 crc kubenswrapper[4971]: I0320 10:20:59.588309 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b54b5b68f-hq9kg_fe139df5-ee59-46ea-a07d-6a995dccdc8e/webhook-server/0.log" Mar 20 10:20:59 crc kubenswrapper[4971]: I0320 10:20:59.815084 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x8lwk_62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a/kube-rbac-proxy/0.log" Mar 20 10:21:00 crc kubenswrapper[4971]: I0320 10:21:00.655304 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-x8lwk_62dce9cb-516c-4d4b-b6cc-7d19b8cf5b7a/speaker/0.log" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.116078 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wtfjv"] Mar 20 10:21:01 crc kubenswrapper[4971]: E0320 10:21:01.116689 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076c4058-a8ca-4c9a-a140-8d73445634a6" containerName="oc" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.116712 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="076c4058-a8ca-4c9a-a140-8d73445634a6" containerName="oc" Mar 
20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.117020 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="076c4058-a8ca-4c9a-a140-8d73445634a6" containerName="oc" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.133847 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.166601 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wtfjv"] Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.224063 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-utilities\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.224171 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7j7g\" (UniqueName: \"kubernetes.io/projected/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-kube-api-access-c7j7g\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.224218 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-catalog-content\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.327827 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-utilities\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.328030 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7j7g\" (UniqueName: \"kubernetes.io/projected/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-kube-api-access-c7j7g\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.328107 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-catalog-content\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.328378 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-utilities\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.328438 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-catalog-content\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.361089 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7j7g\" (UniqueName: 
\"kubernetes.io/projected/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-kube-api-access-c7j7g\") pod \"certified-operators-wtfjv\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") " pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:01 crc kubenswrapper[4971]: I0320 10:21:01.464724 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:02 crc kubenswrapper[4971]: I0320 10:21:02.019829 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wtfjv"] Mar 20 10:21:02 crc kubenswrapper[4971]: I0320 10:21:02.282384 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtfjv" event={"ID":"afb4a5aa-dbfc-419c-acfb-6932095a8f8b","Type":"ContainerStarted","Data":"9af2cd8b20a32e267a30e338f8885e1163ed1ef1a8503d45b33823b4bcb6c787"} Mar 20 10:21:03 crc kubenswrapper[4971]: I0320 10:21:03.294293 4971 generic.go:334] "Generic (PLEG): container finished" podID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerID="d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57" exitCode=0 Mar 20 10:21:03 crc kubenswrapper[4971]: I0320 10:21:03.294386 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtfjv" event={"ID":"afb4a5aa-dbfc-419c-acfb-6932095a8f8b","Type":"ContainerDied","Data":"d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57"} Mar 20 10:21:03 crc kubenswrapper[4971]: I0320 10:21:03.385542 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zlrz9_2dc60c31-5a92-4ff2-b5b6-b87ae2ea1d40/frr/0.log" Mar 20 10:21:04 crc kubenswrapper[4971]: I0320 10:21:04.310474 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtfjv" 
event={"ID":"afb4a5aa-dbfc-419c-acfb-6932095a8f8b","Type":"ContainerStarted","Data":"31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23"} Mar 20 10:21:05 crc kubenswrapper[4971]: I0320 10:21:05.322244 4971 generic.go:334] "Generic (PLEG): container finished" podID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerID="31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23" exitCode=0 Mar 20 10:21:05 crc kubenswrapper[4971]: I0320 10:21:05.322287 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtfjv" event={"ID":"afb4a5aa-dbfc-419c-acfb-6932095a8f8b","Type":"ContainerDied","Data":"31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23"} Mar 20 10:21:06 crc kubenswrapper[4971]: I0320 10:21:06.335764 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtfjv" event={"ID":"afb4a5aa-dbfc-419c-acfb-6932095a8f8b","Type":"ContainerStarted","Data":"5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973"} Mar 20 10:21:06 crc kubenswrapper[4971]: I0320 10:21:06.732990 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:21:06 crc kubenswrapper[4971]: E0320 10:21:06.733414 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:21:11 crc kubenswrapper[4971]: I0320 10:21:11.465178 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:11 crc kubenswrapper[4971]: I0320 10:21:11.465733 4971 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:11 crc kubenswrapper[4971]: I0320 10:21:11.516345 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:11 crc kubenswrapper[4971]: I0320 10:21:11.547453 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wtfjv" podStartSLOduration=8.078395655 podStartE2EDuration="10.547435491s" podCreationTimestamp="2026-03-20 10:21:01 +0000 UTC" firstStartedPulling="2026-03-20 10:21:03.299155042 +0000 UTC m=+12685.279029180" lastFinishedPulling="2026-03-20 10:21:05.768194878 +0000 UTC m=+12687.748069016" observedRunningTime="2026-03-20 10:21:06.362525478 +0000 UTC m=+12688.342399626" watchObservedRunningTime="2026-03-20 10:21:11.547435491 +0000 UTC m=+12693.527309619" Mar 20 10:21:12 crc kubenswrapper[4971]: I0320 10:21:12.436829 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wtfjv" Mar 20 10:21:12 crc kubenswrapper[4971]: I0320 10:21:12.488283 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wtfjv"] Mar 20 10:21:14 crc kubenswrapper[4971]: I0320 10:21:14.408834 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wtfjv" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerName="registry-server" containerID="cri-o://5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973" gracePeriod=2 Mar 20 10:21:14 crc kubenswrapper[4971]: I0320 10:21:14.985830 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wtfjv"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.032621 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-catalog-content\") pod \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") "
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.032700 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-utilities\") pod \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") "
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.033010 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7j7g\" (UniqueName: \"kubernetes.io/projected/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-kube-api-access-c7j7g\") pod \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\" (UID: \"afb4a5aa-dbfc-419c-acfb-6932095a8f8b\") "
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.033675 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-utilities" (OuterVolumeSpecName: "utilities") pod "afb4a5aa-dbfc-419c-acfb-6932095a8f8b" (UID: "afb4a5aa-dbfc-419c-acfb-6932095a8f8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.034249 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.042779 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-kube-api-access-c7j7g" (OuterVolumeSpecName: "kube-api-access-c7j7g") pod "afb4a5aa-dbfc-419c-acfb-6932095a8f8b" (UID: "afb4a5aa-dbfc-419c-acfb-6932095a8f8b"). InnerVolumeSpecName "kube-api-access-c7j7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.123454 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afb4a5aa-dbfc-419c-acfb-6932095a8f8b" (UID: "afb4a5aa-dbfc-419c-acfb-6932095a8f8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.147095 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7j7g\" (UniqueName: \"kubernetes.io/projected/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-kube-api-access-c7j7g\") on node \"crc\" DevicePath \"\""
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.147137 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb4a5aa-dbfc-419c-acfb-6932095a8f8b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.420927 4971 generic.go:334] "Generic (PLEG): container finished" podID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerID="5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973" exitCode=0
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.420970 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtfjv" event={"ID":"afb4a5aa-dbfc-419c-acfb-6932095a8f8b","Type":"ContainerDied","Data":"5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973"}
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.420979 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wtfjv"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.420998 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtfjv" event={"ID":"afb4a5aa-dbfc-419c-acfb-6932095a8f8b","Type":"ContainerDied","Data":"9af2cd8b20a32e267a30e338f8885e1163ed1ef1a8503d45b33823b4bcb6c787"}
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.421016 4971 scope.go:117] "RemoveContainer" containerID="5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.450551 4971 scope.go:117] "RemoveContainer" containerID="31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.471969 4971 scope.go:117] "RemoveContainer" containerID="d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.472113 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wtfjv"]
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.494002 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wtfjv"]
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.516659 4971 scope.go:117] "RemoveContainer" containerID="5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973"
Mar 20 10:21:15 crc kubenswrapper[4971]: E0320 10:21:15.517352 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973\": container with ID starting with 5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973 not found: ID does not exist" containerID="5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.517392 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973"} err="failed to get container status \"5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973\": rpc error: code = NotFound desc = could not find container \"5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973\": container with ID starting with 5a1922f5470fe47b30419033d6670cf72ce147c413c7ee7156a90eb539d82973 not found: ID does not exist"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.517418 4971 scope.go:117] "RemoveContainer" containerID="31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23"
Mar 20 10:21:15 crc kubenswrapper[4971]: E0320 10:21:15.518067 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23\": container with ID starting with 31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23 not found: ID does not exist" containerID="31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.518093 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23"} err="failed to get container status \"31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23\": rpc error: code = NotFound desc = could not find container \"31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23\": container with ID starting with 31c74963a22de8f52b4a60d9e175fa939be3e6d3154ea77509871ba55d971d23 not found: ID does not exist"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.518105 4971 scope.go:117] "RemoveContainer" containerID="d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57"
Mar 20 10:21:15 crc kubenswrapper[4971]: E0320 10:21:15.518347 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57\": container with ID starting with d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57 not found: ID does not exist" containerID="d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.518367 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57"} err="failed to get container status \"d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57\": rpc error: code = NotFound desc = could not find container \"d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57\": container with ID starting with d0327f295b37727f2e8239e2dabab08da570ecbc893821f08636a5470df4ef57 not found: ID does not exist"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.563527 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6_cd481b29-3182-4bea-bc4d-60f01f877aa3/util/0.log"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.653115 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6_cd481b29-3182-4bea-bc4d-60f01f877aa3/util/0.log"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.727221 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6_cd481b29-3182-4bea-bc4d-60f01f877aa3/pull/0.log"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.742265 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6_cd481b29-3182-4bea-bc4d-60f01f877aa3/pull/0.log"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.908750 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6_cd481b29-3182-4bea-bc4d-60f01f877aa3/util/0.log"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.928113 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6_cd481b29-3182-4bea-bc4d-60f01f877aa3/pull/0.log"
Mar 20 10:21:15 crc kubenswrapper[4971]: I0320 10:21:15.953902 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874822m6_cd481b29-3182-4bea-bc4d-60f01f877aa3/extract/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.089909 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r_32deec4a-495c-4499-b249-5affc8063c20/util/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.330964 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r_32deec4a-495c-4499-b249-5affc8063c20/util/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.339068 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r_32deec4a-495c-4499-b249-5affc8063c20/pull/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.352883 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r_32deec4a-495c-4499-b249-5affc8063c20/pull/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.504073 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r_32deec4a-495c-4499-b249-5affc8063c20/util/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.566120 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r_32deec4a-495c-4499-b249-5affc8063c20/pull/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.597217 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lc28r_32deec4a-495c-4499-b249-5affc8063c20/extract/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.703466 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc_9f7b38cc-03d4-40e5-9678-f8e7a589392f/util/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.743992 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" path="/var/lib/kubelet/pods/afb4a5aa-dbfc-419c-acfb-6932095a8f8b/volumes"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.952280 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc_9f7b38cc-03d4-40e5-9678-f8e7a589392f/util/0.log"
Mar 20 10:21:16 crc kubenswrapper[4971]: I0320 10:21:16.971699 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc_9f7b38cc-03d4-40e5-9678-f8e7a589392f/pull/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.013902 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc_9f7b38cc-03d4-40e5-9678-f8e7a589392f/pull/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.176759 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc_9f7b38cc-03d4-40e5-9678-f8e7a589392f/extract/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.197155 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc_9f7b38cc-03d4-40e5-9678-f8e7a589392f/pull/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.223724 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lssfc_9f7b38cc-03d4-40e5-9678-f8e7a589392f/util/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.363629 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x_cad0849d-53e9-4b65-a400-5eeeba9c533d/util/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.542699 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x_cad0849d-53e9-4b65-a400-5eeeba9c533d/util/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.588511 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x_cad0849d-53e9-4b65-a400-5eeeba9c533d/pull/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.654455 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x_cad0849d-53e9-4b65-a400-5eeeba9c533d/pull/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.733200 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f"
Mar 20 10:21:17 crc kubenswrapper[4971]: E0320 10:21:17.733516 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.769171 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x_cad0849d-53e9-4b65-a400-5eeeba9c533d/util/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.776095 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x_cad0849d-53e9-4b65-a400-5eeeba9c533d/pull/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.800528 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g2c5x_cad0849d-53e9-4b65-a400-5eeeba9c533d/extract/0.log"
Mar 20 10:21:17 crc kubenswrapper[4971]: I0320 10:21:17.967439 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m2l48_ca4e2982-1f40-434d-b05f-baaa0c805845/extract-utilities/0.log"
Mar 20 10:21:18 crc kubenswrapper[4971]: I0320 10:21:18.138356 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m2l48_ca4e2982-1f40-434d-b05f-baaa0c805845/extract-content/0.log"
Mar 20 10:21:18 crc kubenswrapper[4971]: I0320 10:21:18.170929 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m2l48_ca4e2982-1f40-434d-b05f-baaa0c805845/extract-content/0.log"
Mar 20 10:21:18 crc kubenswrapper[4971]: I0320 10:21:18.176192 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m2l48_ca4e2982-1f40-434d-b05f-baaa0c805845/extract-utilities/0.log"
Mar 20 10:21:18 crc kubenswrapper[4971]: I0320 10:21:18.311068 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m2l48_ca4e2982-1f40-434d-b05f-baaa0c805845/extract-content/0.log"
Mar 20 10:21:18 crc kubenswrapper[4971]: I0320 10:21:18.346217 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m2l48_ca4e2982-1f40-434d-b05f-baaa0c805845/extract-utilities/0.log"
Mar 20 10:21:18 crc kubenswrapper[4971]: I0320 10:21:18.367208 4971 scope.go:117] "RemoveContainer" containerID="4e6829767ffabadd1afefac17198818bbf3b380b3fda5a46a74561c94bbc67f3"
Mar 20 10:21:19 crc kubenswrapper[4971]: I0320 10:21:19.214523 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9jr2k_6220fde4-2977-4020-9cf5-0122cf43c634/extract-utilities/0.log"
Mar 20 10:21:19 crc kubenswrapper[4971]: I0320 10:21:19.362722 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9jr2k_6220fde4-2977-4020-9cf5-0122cf43c634/extract-utilities/0.log"
Mar 20 10:21:19 crc kubenswrapper[4971]: I0320 10:21:19.425357 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9jr2k_6220fde4-2977-4020-9cf5-0122cf43c634/extract-content/0.log"
Mar 20 10:21:19 crc kubenswrapper[4971]: I0320 10:21:19.497512 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9jr2k_6220fde4-2977-4020-9cf5-0122cf43c634/extract-content/0.log"
Mar 20 10:21:19 crc kubenswrapper[4971]: I0320 10:21:19.636785 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9jr2k_6220fde4-2977-4020-9cf5-0122cf43c634/extract-content/0.log"
Mar 20 10:21:19 crc kubenswrapper[4971]: I0320 10:21:19.663284 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9jr2k_6220fde4-2977-4020-9cf5-0122cf43c634/extract-utilities/0.log"
Mar 20 10:21:19 crc kubenswrapper[4971]: I0320 10:21:19.852784 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q4lsj_c34c021e-b961-4f87-a749-9e023f6b8c93/marketplace-operator/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.041839 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wztzb_44282ec6-bf9c-4091-bb86-2e491e577076/extract-utilities/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.124685 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m2l48_ca4e2982-1f40-434d-b05f-baaa0c805845/registry-server/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.205709 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9jr2k_6220fde4-2977-4020-9cf5-0122cf43c634/registry-server/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.469895 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wztzb_44282ec6-bf9c-4091-bb86-2e491e577076/extract-content/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.482465 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wztzb_44282ec6-bf9c-4091-bb86-2e491e577076/extract-utilities/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.505981 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wztzb_44282ec6-bf9c-4091-bb86-2e491e577076/extract-content/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.709065 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wztzb_44282ec6-bf9c-4091-bb86-2e491e577076/extract-content/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.728902 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wztzb_44282ec6-bf9c-4091-bb86-2e491e577076/extract-utilities/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.768881 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xp27t_d1cb53af-ca9d-494f-a994-dbe7117881db/extract-utilities/0.log"
Mar 20 10:21:20 crc kubenswrapper[4971]: I0320 10:21:20.958961 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xp27t_d1cb53af-ca9d-494f-a994-dbe7117881db/extract-utilities/0.log"
Mar 20 10:21:21 crc kubenswrapper[4971]: I0320 10:21:21.006154 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xp27t_d1cb53af-ca9d-494f-a994-dbe7117881db/extract-content/0.log"
Mar 20 10:21:21 crc kubenswrapper[4971]: I0320 10:21:21.012345 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xp27t_d1cb53af-ca9d-494f-a994-dbe7117881db/extract-content/0.log"
Mar 20 10:21:21 crc kubenswrapper[4971]: I0320 10:21:21.130148 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wztzb_44282ec6-bf9c-4091-bb86-2e491e577076/registry-server/0.log"
Mar 20 10:21:21 crc kubenswrapper[4971]: I0320 10:21:21.190334 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xp27t_d1cb53af-ca9d-494f-a994-dbe7117881db/extract-utilities/0.log"
Mar 20 10:21:21 crc kubenswrapper[4971]: I0320 10:21:21.190848 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xp27t_d1cb53af-ca9d-494f-a994-dbe7117881db/extract-content/0.log"
Mar 20 10:21:22 crc kubenswrapper[4971]: I0320 10:21:22.662374 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xp27t_d1cb53af-ca9d-494f-a994-dbe7117881db/registry-server/0.log"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.224630 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bvsqn"]
Mar 20 10:21:29 crc kubenswrapper[4971]: E0320 10:21:29.225737 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerName="registry-server"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.225754 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerName="registry-server"
Mar 20 10:21:29 crc kubenswrapper[4971]: E0320 10:21:29.225773 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerName="extract-content"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.225782 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerName="extract-content"
Mar 20 10:21:29 crc kubenswrapper[4971]: E0320 10:21:29.225823 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerName="extract-utilities"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.225832 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerName="extract-utilities"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.226093 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb4a5aa-dbfc-419c-acfb-6932095a8f8b" containerName="registry-server"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.256263 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.260392 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvsqn"]
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.379917 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62q9j\" (UniqueName: \"kubernetes.io/projected/c0179af2-d7fb-475f-ab9a-e6ed401e1952-kube-api-access-62q9j\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.380041 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-catalog-content\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.380226 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-utilities\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.481735 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-utilities\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.481788 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62q9j\" (UniqueName: \"kubernetes.io/projected/c0179af2-d7fb-475f-ab9a-e6ed401e1952-kube-api-access-62q9j\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.481849 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-catalog-content\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.482284 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-utilities\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.482313 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-catalog-content\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.500025 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62q9j\" (UniqueName: \"kubernetes.io/projected/c0179af2-d7fb-475f-ab9a-e6ed401e1952-kube-api-access-62q9j\") pod \"redhat-operators-bvsqn\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.582315 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:29 crc kubenswrapper[4971]: I0320 10:21:29.733439 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f"
Mar 20 10:21:29 crc kubenswrapper[4971]: E0320 10:21:29.733999 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 10:21:30 crc kubenswrapper[4971]: I0320 10:21:30.053345 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvsqn"]
Mar 20 10:21:30 crc kubenswrapper[4971]: I0320 10:21:30.553298 4971 generic.go:334] "Generic (PLEG): container finished" podID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerID="e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b" exitCode=0
Mar 20 10:21:30 crc kubenswrapper[4971]: I0320 10:21:30.553401 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvsqn" event={"ID":"c0179af2-d7fb-475f-ab9a-e6ed401e1952","Type":"ContainerDied","Data":"e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b"}
Mar 20 10:21:30 crc kubenswrapper[4971]: I0320 10:21:30.553589 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvsqn" event={"ID":"c0179af2-d7fb-475f-ab9a-e6ed401e1952","Type":"ContainerStarted","Data":"3b44be652f5d1cf44f7aef326ee9de21232607453a4b43eb4c43af0bf2a9f5ff"}
Mar 20 10:21:32 crc kubenswrapper[4971]: I0320 10:21:32.580360 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvsqn" event={"ID":"c0179af2-d7fb-475f-ab9a-e6ed401e1952","Type":"ContainerStarted","Data":"8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b"}
Mar 20 10:21:34 crc kubenswrapper[4971]: I0320 10:21:34.505199 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-657d9b77d4-b797l_4aeb8dc6-151a-40dc-871e-6145135c7fdf/prometheus-operator-admission-webhook/0.log"
Mar 20 10:21:34 crc kubenswrapper[4971]: I0320 10:21:34.580978 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-657d9b77d4-wsqv2_1be3085b-b74b-4086-ab50-b41ac948660c/prometheus-operator-admission-webhook/0.log"
Mar 20 10:21:34 crc kubenswrapper[4971]: I0320 10:21:34.605978 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-tqdrq_05730a75-1c9d-4199-bd62-176210c9207a/prometheus-operator/0.log"
Mar 20 10:21:34 crc kubenswrapper[4971]: I0320 10:21:34.728812 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-jt588_a64188ef-f360-424b-b0f7-4b6f3801e47e/operator/0.log"
Mar 20 10:21:34 crc kubenswrapper[4971]: I0320 10:21:34.785003 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-86dc4f68fc-bsgd2_7d563a7f-a865-46b9-b775-cb3c61eae815/perses-operator/0.log"
Mar 20 10:21:36 crc kubenswrapper[4971]: I0320 10:21:36.618007 4971 generic.go:334] "Generic (PLEG): container finished" podID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerID="8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b" exitCode=0
Mar 20 10:21:36 crc kubenswrapper[4971]: I0320 10:21:36.618093 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvsqn" event={"ID":"c0179af2-d7fb-475f-ab9a-e6ed401e1952","Type":"ContainerDied","Data":"8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b"}
Mar 20 10:21:37 crc kubenswrapper[4971]: I0320 10:21:37.631060 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvsqn" event={"ID":"c0179af2-d7fb-475f-ab9a-e6ed401e1952","Type":"ContainerStarted","Data":"a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed"}
Mar 20 10:21:37 crc kubenswrapper[4971]: I0320 10:21:37.654666 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bvsqn" podStartSLOduration=2.210539008 podStartE2EDuration="8.654649604s" podCreationTimestamp="2026-03-20 10:21:29 +0000 UTC" firstStartedPulling="2026-03-20 10:21:30.555057114 +0000 UTC m=+12712.534931252" lastFinishedPulling="2026-03-20 10:21:36.99916771 +0000 UTC m=+12718.979041848" observedRunningTime="2026-03-20 10:21:37.647709891 +0000 UTC m=+12719.627584029" watchObservedRunningTime="2026-03-20 10:21:37.654649604 +0000 UTC m=+12719.634523742"
Mar 20 10:21:39 crc kubenswrapper[4971]: I0320 10:21:39.582544 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:39 crc kubenswrapper[4971]: I0320 10:21:39.583072 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bvsqn"
Mar 20 10:21:40 crc kubenswrapper[4971]: I0320 10:21:40.670792 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bvsqn" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="registry-server" probeResult="failure" output=<
Mar 20 10:21:40 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 20 10:21:40 crc kubenswrapper[4971]: >
Mar 20 10:21:41 crc kubenswrapper[4971]: I0320 10:21:41.732307 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f"
Mar 20 10:21:41 crc kubenswrapper[4971]: E0320 10:21:41.733036 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 10:21:50 crc kubenswrapper[4971]: I0320 10:21:50.622457 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bvsqn" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="registry-server" probeResult="failure" output=<
Mar 20 10:21:50 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 20 10:21:50 crc kubenswrapper[4971]: >
Mar 20 10:21:53 crc kubenswrapper[4971]: I0320 10:21:53.732571 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f"
Mar 20 10:21:53 crc kubenswrapper[4971]: E0320 10:21:53.733480 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c"
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.142603 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566702-rqtq4"]
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.144789 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566702-rqtq4"
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.152161 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566702-rqtq4"]
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.185509 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.185807 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.188120 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq"
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.309671 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlh4\" (UniqueName: \"kubernetes.io/projected/86f94daf-ce3e-408e-9f8b-2adcd85bcfb2-kube-api-access-8xlh4\") pod \"auto-csr-approver-29566702-rqtq4\" (UID: \"86f94daf-ce3e-408e-9f8b-2adcd85bcfb2\") " pod="openshift-infra/auto-csr-approver-29566702-rqtq4"
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.411291 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlh4\" (UniqueName: \"kubernetes.io/projected/86f94daf-ce3e-408e-9f8b-2adcd85bcfb2-kube-api-access-8xlh4\") pod \"auto-csr-approver-29566702-rqtq4\" (UID: \"86f94daf-ce3e-408e-9f8b-2adcd85bcfb2\") " pod="openshift-infra/auto-csr-approver-29566702-rqtq4"
Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.446842 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"kube-api-access-8xlh4\" (UniqueName: \"kubernetes.io/projected/86f94daf-ce3e-408e-9f8b-2adcd85bcfb2-kube-api-access-8xlh4\") pod \"auto-csr-approver-29566702-rqtq4\" (UID: \"86f94daf-ce3e-408e-9f8b-2adcd85bcfb2\") " pod="openshift-infra/auto-csr-approver-29566702-rqtq4" Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.507238 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566702-rqtq4" Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.640734 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bvsqn" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="registry-server" probeResult="failure" output=< Mar 20 10:22:00 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 20 10:22:00 crc kubenswrapper[4971]: > Mar 20 10:22:00 crc kubenswrapper[4971]: I0320 10:22:00.962084 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566702-rqtq4"] Mar 20 10:22:01 crc kubenswrapper[4971]: I0320 10:22:01.885599 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566702-rqtq4" event={"ID":"86f94daf-ce3e-408e-9f8b-2adcd85bcfb2","Type":"ContainerStarted","Data":"a22de96d051257a92664d4a7549614cef3a1e208b635f7849df640e1f429f0d2"} Mar 20 10:22:02 crc kubenswrapper[4971]: I0320 10:22:02.898440 4971 generic.go:334] "Generic (PLEG): container finished" podID="86f94daf-ce3e-408e-9f8b-2adcd85bcfb2" containerID="a208b939f2314073550bd766affb0c875214f6f4d512bd161c75379fdebd9956" exitCode=0 Mar 20 10:22:02 crc kubenswrapper[4971]: I0320 10:22:02.898498 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566702-rqtq4" event={"ID":"86f94daf-ce3e-408e-9f8b-2adcd85bcfb2","Type":"ContainerDied","Data":"a208b939f2314073550bd766affb0c875214f6f4d512bd161c75379fdebd9956"} Mar 20 10:22:04 crc 
kubenswrapper[4971]: I0320 10:22:04.284886 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566702-rqtq4" Mar 20 10:22:04 crc kubenswrapper[4971]: I0320 10:22:04.391551 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xlh4\" (UniqueName: \"kubernetes.io/projected/86f94daf-ce3e-408e-9f8b-2adcd85bcfb2-kube-api-access-8xlh4\") pod \"86f94daf-ce3e-408e-9f8b-2adcd85bcfb2\" (UID: \"86f94daf-ce3e-408e-9f8b-2adcd85bcfb2\") " Mar 20 10:22:04 crc kubenswrapper[4971]: I0320 10:22:04.397359 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f94daf-ce3e-408e-9f8b-2adcd85bcfb2-kube-api-access-8xlh4" (OuterVolumeSpecName: "kube-api-access-8xlh4") pod "86f94daf-ce3e-408e-9f8b-2adcd85bcfb2" (UID: "86f94daf-ce3e-408e-9f8b-2adcd85bcfb2"). InnerVolumeSpecName "kube-api-access-8xlh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:22:04 crc kubenswrapper[4971]: I0320 10:22:04.495088 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xlh4\" (UniqueName: \"kubernetes.io/projected/86f94daf-ce3e-408e-9f8b-2adcd85bcfb2-kube-api-access-8xlh4\") on node \"crc\" DevicePath \"\"" Mar 20 10:22:04 crc kubenswrapper[4971]: I0320 10:22:04.741045 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:22:04 crc kubenswrapper[4971]: E0320 10:22:04.741402 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:22:04 crc 
kubenswrapper[4971]: I0320 10:22:04.933953 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566702-rqtq4" event={"ID":"86f94daf-ce3e-408e-9f8b-2adcd85bcfb2","Type":"ContainerDied","Data":"a22de96d051257a92664d4a7549614cef3a1e208b635f7849df640e1f429f0d2"} Mar 20 10:22:04 crc kubenswrapper[4971]: I0320 10:22:04.933993 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22de96d051257a92664d4a7549614cef3a1e208b635f7849df640e1f429f0d2" Mar 20 10:22:04 crc kubenswrapper[4971]: I0320 10:22:04.934044 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566702-rqtq4" Mar 20 10:22:05 crc kubenswrapper[4971]: I0320 10:22:05.369161 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566696-kdqxj"] Mar 20 10:22:05 crc kubenswrapper[4971]: I0320 10:22:05.381683 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566696-kdqxj"] Mar 20 10:22:06 crc kubenswrapper[4971]: I0320 10:22:06.746846 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5c30ae-e5aa-4e94-8a67-2f2d0088db08" path="/var/lib/kubelet/pods/9a5c30ae-e5aa-4e94-8a67-2f2d0088db08/volumes" Mar 20 10:22:09 crc kubenswrapper[4971]: I0320 10:22:09.641042 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bvsqn" Mar 20 10:22:09 crc kubenswrapper[4971]: I0320 10:22:09.697427 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bvsqn" Mar 20 10:22:09 crc kubenswrapper[4971]: I0320 10:22:09.888423 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvsqn"] Mar 20 10:22:10 crc kubenswrapper[4971]: I0320 10:22:10.984884 4971 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-bvsqn" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="registry-server" containerID="cri-o://a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed" gracePeriod=2 Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.472958 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvsqn" Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.539463 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62q9j\" (UniqueName: \"kubernetes.io/projected/c0179af2-d7fb-475f-ab9a-e6ed401e1952-kube-api-access-62q9j\") pod \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.539713 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-catalog-content\") pod \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.539741 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-utilities\") pod \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\" (UID: \"c0179af2-d7fb-475f-ab9a-e6ed401e1952\") " Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.540528 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-utilities" (OuterVolumeSpecName: "utilities") pod "c0179af2-d7fb-475f-ab9a-e6ed401e1952" (UID: "c0179af2-d7fb-475f-ab9a-e6ed401e1952"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.546530 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0179af2-d7fb-475f-ab9a-e6ed401e1952-kube-api-access-62q9j" (OuterVolumeSpecName: "kube-api-access-62q9j") pod "c0179af2-d7fb-475f-ab9a-e6ed401e1952" (UID: "c0179af2-d7fb-475f-ab9a-e6ed401e1952"). InnerVolumeSpecName "kube-api-access-62q9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.643067 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.643111 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62q9j\" (UniqueName: \"kubernetes.io/projected/c0179af2-d7fb-475f-ab9a-e6ed401e1952-kube-api-access-62q9j\") on node \"crc\" DevicePath \"\"" Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.689064 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0179af2-d7fb-475f-ab9a-e6ed401e1952" (UID: "c0179af2-d7fb-475f-ab9a-e6ed401e1952"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.745262 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0179af2-d7fb-475f-ab9a-e6ed401e1952-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.998077 4971 generic.go:334] "Generic (PLEG): container finished" podID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerID="a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed" exitCode=0 Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.998124 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvsqn" event={"ID":"c0179af2-d7fb-475f-ab9a-e6ed401e1952","Type":"ContainerDied","Data":"a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed"} Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.998150 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvsqn" event={"ID":"c0179af2-d7fb-475f-ab9a-e6ed401e1952","Type":"ContainerDied","Data":"3b44be652f5d1cf44f7aef326ee9de21232607453a4b43eb4c43af0bf2a9f5ff"} Mar 20 10:22:11 crc kubenswrapper[4971]: I0320 10:22:11.998167 4971 scope.go:117] "RemoveContainer" containerID="a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:11.998315 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvsqn" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.041003 4971 scope.go:117] "RemoveContainer" containerID="8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.042687 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvsqn"] Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.054700 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bvsqn"] Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.086474 4971 scope.go:117] "RemoveContainer" containerID="e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.116549 4971 scope.go:117] "RemoveContainer" containerID="a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed" Mar 20 10:22:12 crc kubenswrapper[4971]: E0320 10:22:12.118990 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed\": container with ID starting with a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed not found: ID does not exist" containerID="a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.119037 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed"} err="failed to get container status \"a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed\": rpc error: code = NotFound desc = could not find container \"a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed\": container with ID starting with a5f6dc03e92bcc0b37b7012bab3f159356f5a131a5aef8c3505991ca8db744ed not found: ID does 
not exist" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.119064 4971 scope.go:117] "RemoveContainer" containerID="8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b" Mar 20 10:22:12 crc kubenswrapper[4971]: E0320 10:22:12.119564 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b\": container with ID starting with 8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b not found: ID does not exist" containerID="8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.119628 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b"} err="failed to get container status \"8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b\": rpc error: code = NotFound desc = could not find container \"8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b\": container with ID starting with 8bab81516516780a54472eeb8509b0d8c89ac48c22511ee88302a2d591651b4b not found: ID does not exist" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.119655 4971 scope.go:117] "RemoveContainer" containerID="e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b" Mar 20 10:22:12 crc kubenswrapper[4971]: E0320 10:22:12.119984 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b\": container with ID starting with e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b not found: ID does not exist" containerID="e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.120016 4971 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b"} err="failed to get container status \"e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b\": rpc error: code = NotFound desc = could not find container \"e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b\": container with ID starting with e3cff097550b034c58da9b9d50a84a520ca3284574185c3b46c5e646c0f8881b not found: ID does not exist" Mar 20 10:22:12 crc kubenswrapper[4971]: I0320 10:22:12.749063 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" path="/var/lib/kubelet/pods/c0179af2-d7fb-475f-ab9a-e6ed401e1952/volumes" Mar 20 10:22:18 crc kubenswrapper[4971]: I0320 10:22:18.435308 4971 scope.go:117] "RemoveContainer" containerID="e34bccb653b30075c80106ab681bb6500037d54e31d9a4c5f1f84666a9851f61" Mar 20 10:22:18 crc kubenswrapper[4971]: I0320 10:22:18.747858 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:22:18 crc kubenswrapper[4971]: E0320 10:22:18.748130 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.896375 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-72f4t"] Mar 20 10:22:26 crc kubenswrapper[4971]: E0320 10:22:26.897222 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="registry-server" Mar 20 10:22:26 crc 
kubenswrapper[4971]: I0320 10:22:26.897235 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="registry-server" Mar 20 10:22:26 crc kubenswrapper[4971]: E0320 10:22:26.897268 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="extract-utilities" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.897275 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="extract-utilities" Mar 20 10:22:26 crc kubenswrapper[4971]: E0320 10:22:26.897292 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="extract-content" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.897299 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="extract-content" Mar 20 10:22:26 crc kubenswrapper[4971]: E0320 10:22:26.897328 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f94daf-ce3e-408e-9f8b-2adcd85bcfb2" containerName="oc" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.897334 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f94daf-ce3e-408e-9f8b-2adcd85bcfb2" containerName="oc" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.897505 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0179af2-d7fb-475f-ab9a-e6ed401e1952" containerName="registry-server" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.897528 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f94daf-ce3e-408e-9f8b-2adcd85bcfb2" containerName="oc" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.898995 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.914767 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-72f4t"] Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.999409 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-utilities\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.999740 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-catalog-content\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:26 crc kubenswrapper[4971]: I0320 10:22:26.999869 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mrql\" (UniqueName: \"kubernetes.io/projected/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-kube-api-access-5mrql\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:27 crc kubenswrapper[4971]: I0320 10:22:27.101298 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-utilities\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:27 crc kubenswrapper[4971]: I0320 10:22:27.101358 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-catalog-content\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:27 crc kubenswrapper[4971]: I0320 10:22:27.101448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mrql\" (UniqueName: \"kubernetes.io/projected/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-kube-api-access-5mrql\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:27 crc kubenswrapper[4971]: I0320 10:22:27.101964 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-utilities\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:27 crc kubenswrapper[4971]: I0320 10:22:27.102031 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-catalog-content\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:27 crc kubenswrapper[4971]: I0320 10:22:27.122897 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mrql\" (UniqueName: \"kubernetes.io/projected/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-kube-api-access-5mrql\") pod \"redhat-marketplace-72f4t\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:27 crc kubenswrapper[4971]: I0320 10:22:27.226282 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:27 crc kubenswrapper[4971]: I0320 10:22:27.914309 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-72f4t"] Mar 20 10:22:28 crc kubenswrapper[4971]: I0320 10:22:28.253653 4971 generic.go:334] "Generic (PLEG): container finished" podID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerID="6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed" exitCode=0 Mar 20 10:22:28 crc kubenswrapper[4971]: I0320 10:22:28.253763 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72f4t" event={"ID":"c8ff13e5-3b53-4e98-8884-368ca85c2c9c","Type":"ContainerDied","Data":"6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed"} Mar 20 10:22:28 crc kubenswrapper[4971]: I0320 10:22:28.254068 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72f4t" event={"ID":"c8ff13e5-3b53-4e98-8884-368ca85c2c9c","Type":"ContainerStarted","Data":"33262be7cca12dea04fc765823d440af3fa0f3ff9624e545c6d5e04943891515"} Mar 20 10:22:29 crc kubenswrapper[4971]: I0320 10:22:29.265541 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72f4t" event={"ID":"c8ff13e5-3b53-4e98-8884-368ca85c2c9c","Type":"ContainerStarted","Data":"bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae"} Mar 20 10:22:30 crc kubenswrapper[4971]: I0320 10:22:30.282286 4971 generic.go:334] "Generic (PLEG): container finished" podID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerID="bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae" exitCode=0 Mar 20 10:22:30 crc kubenswrapper[4971]: I0320 10:22:30.282531 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72f4t" 
event={"ID":"c8ff13e5-3b53-4e98-8884-368ca85c2c9c","Type":"ContainerDied","Data":"bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae"} Mar 20 10:22:31 crc kubenswrapper[4971]: I0320 10:22:31.296745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72f4t" event={"ID":"c8ff13e5-3b53-4e98-8884-368ca85c2c9c","Type":"ContainerStarted","Data":"0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c"} Mar 20 10:22:31 crc kubenswrapper[4971]: I0320 10:22:31.324272 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-72f4t" podStartSLOduration=2.8740112140000003 podStartE2EDuration="5.324249824s" podCreationTimestamp="2026-03-20 10:22:26 +0000 UTC" firstStartedPulling="2026-03-20 10:22:28.259494251 +0000 UTC m=+12770.239368389" lastFinishedPulling="2026-03-20 10:22:30.709732861 +0000 UTC m=+12772.689606999" observedRunningTime="2026-03-20 10:22:31.318837552 +0000 UTC m=+12773.298711700" watchObservedRunningTime="2026-03-20 10:22:31.324249824 +0000 UTC m=+12773.304123962" Mar 20 10:22:31 crc kubenswrapper[4971]: I0320 10:22:31.732452 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:22:31 crc kubenswrapper[4971]: E0320 10:22:31.732737 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:22:37 crc kubenswrapper[4971]: I0320 10:22:37.226638 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:37 crc 
kubenswrapper[4971]: I0320 10:22:37.227165 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:37 crc kubenswrapper[4971]: I0320 10:22:37.273461 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:37 crc kubenswrapper[4971]: I0320 10:22:37.412277 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:37 crc kubenswrapper[4971]: I0320 10:22:37.513390 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-72f4t"] Mar 20 10:22:39 crc kubenswrapper[4971]: I0320 10:22:39.380592 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-72f4t" podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerName="registry-server" containerID="cri-o://0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c" gracePeriod=2 Mar 20 10:22:39 crc kubenswrapper[4971]: I0320 10:22:39.980963 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.102557 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-utilities\") pod \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.102768 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mrql\" (UniqueName: \"kubernetes.io/projected/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-kube-api-access-5mrql\") pod \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.102984 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-catalog-content\") pod \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\" (UID: \"c8ff13e5-3b53-4e98-8884-368ca85c2c9c\") " Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.103473 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-utilities" (OuterVolumeSpecName: "utilities") pod "c8ff13e5-3b53-4e98-8884-368ca85c2c9c" (UID: "c8ff13e5-3b53-4e98-8884-368ca85c2c9c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.103998 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.120783 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-kube-api-access-5mrql" (OuterVolumeSpecName: "kube-api-access-5mrql") pod "c8ff13e5-3b53-4e98-8884-368ca85c2c9c" (UID: "c8ff13e5-3b53-4e98-8884-368ca85c2c9c"). InnerVolumeSpecName "kube-api-access-5mrql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.145435 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8ff13e5-3b53-4e98-8884-368ca85c2c9c" (UID: "c8ff13e5-3b53-4e98-8884-368ca85c2c9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.207075 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mrql\" (UniqueName: \"kubernetes.io/projected/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-kube-api-access-5mrql\") on node \"crc\" DevicePath \"\"" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.207419 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ff13e5-3b53-4e98-8884-368ca85c2c9c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.393395 4971 generic.go:334] "Generic (PLEG): container finished" podID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerID="0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c" exitCode=0 Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.393491 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72f4t" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.393503 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72f4t" event={"ID":"c8ff13e5-3b53-4e98-8884-368ca85c2c9c","Type":"ContainerDied","Data":"0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c"} Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.395087 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72f4t" event={"ID":"c8ff13e5-3b53-4e98-8884-368ca85c2c9c","Type":"ContainerDied","Data":"33262be7cca12dea04fc765823d440af3fa0f3ff9624e545c6d5e04943891515"} Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.395169 4971 scope.go:117] "RemoveContainer" containerID="0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.420879 4971 scope.go:117] "RemoveContainer" 
containerID="bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.444489 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-72f4t"] Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.454420 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-72f4t"] Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.454453 4971 scope.go:117] "RemoveContainer" containerID="6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.511329 4971 scope.go:117] "RemoveContainer" containerID="0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c" Mar 20 10:22:40 crc kubenswrapper[4971]: E0320 10:22:40.511805 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c\": container with ID starting with 0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c not found: ID does not exist" containerID="0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.511904 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c"} err="failed to get container status \"0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c\": rpc error: code = NotFound desc = could not find container \"0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c\": container with ID starting with 0b5b3fb0a65d885f2306fea3b7bd3b7500b428a582831754be488ced6581873c not found: ID does not exist" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.511988 4971 scope.go:117] "RemoveContainer" 
containerID="bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae" Mar 20 10:22:40 crc kubenswrapper[4971]: E0320 10:22:40.512377 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae\": container with ID starting with bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae not found: ID does not exist" containerID="bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.512491 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae"} err="failed to get container status \"bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae\": rpc error: code = NotFound desc = could not find container \"bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae\": container with ID starting with bfc61b7ecbec17a5f7b2c3f5c36d2a643fd2b5003d958c7016869344a07fe2ae not found: ID does not exist" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.512562 4971 scope.go:117] "RemoveContainer" containerID="6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed" Mar 20 10:22:40 crc kubenswrapper[4971]: E0320 10:22:40.512953 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed\": container with ID starting with 6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed not found: ID does not exist" containerID="6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.513058 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed"} err="failed to get container status \"6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed\": rpc error: code = NotFound desc = could not find container \"6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed\": container with ID starting with 6dadcb62afeacdfc96bf6387014fcb7cabb26bfae44d1bda5a9346980ce8f9ed not found: ID does not exist" Mar 20 10:22:40 crc kubenswrapper[4971]: I0320 10:22:40.745039 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" path="/var/lib/kubelet/pods/c8ff13e5-3b53-4e98-8884-368ca85c2c9c/volumes" Mar 20 10:22:42 crc kubenswrapper[4971]: I0320 10:22:42.732749 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:22:42 crc kubenswrapper[4971]: E0320 10:22:42.733291 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:22:53 crc kubenswrapper[4971]: I0320 10:22:53.732593 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:22:53 crc kubenswrapper[4971]: E0320 10:22:53.734161 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:23:06 crc kubenswrapper[4971]: I0320 10:23:06.733085 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:23:06 crc kubenswrapper[4971]: E0320 10:23:06.733743 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m26zl_openshift-machine-config-operator(0c96bfbc-3a6a-44df-9bf6-9f78c587657c)\"" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" Mar 20 10:23:21 crc kubenswrapper[4971]: I0320 10:23:21.732840 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:23:22 crc kubenswrapper[4971]: I0320 10:23:22.896594 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"67879cb7fb1508fd84bfdb179e6919b555772e4bc6efefc714792967baf2ede9"} Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.158538 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566704-nfb2r"] Mar 20 10:24:00 crc kubenswrapper[4971]: E0320 10:24:00.159517 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerName="registry-server" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.159531 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerName="registry-server" Mar 20 10:24:00 crc kubenswrapper[4971]: E0320 10:24:00.159544 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerName="extract-utilities" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.159551 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerName="extract-utilities" Mar 20 10:24:00 crc kubenswrapper[4971]: E0320 10:24:00.159560 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerName="extract-content" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.159566 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerName="extract-content" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.159786 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ff13e5-3b53-4e98-8884-368ca85c2c9c" containerName="registry-server" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.160537 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566704-nfb2r" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.164230 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.164497 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.166060 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.171316 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566704-nfb2r"] Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.316006 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn8cw\" (UniqueName: 
\"kubernetes.io/projected/77602f55-b8a7-4ad9-810c-7388039d5552-kube-api-access-vn8cw\") pod \"auto-csr-approver-29566704-nfb2r\" (UID: \"77602f55-b8a7-4ad9-810c-7388039d5552\") " pod="openshift-infra/auto-csr-approver-29566704-nfb2r" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.418401 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn8cw\" (UniqueName: \"kubernetes.io/projected/77602f55-b8a7-4ad9-810c-7388039d5552-kube-api-access-vn8cw\") pod \"auto-csr-approver-29566704-nfb2r\" (UID: \"77602f55-b8a7-4ad9-810c-7388039d5552\") " pod="openshift-infra/auto-csr-approver-29566704-nfb2r" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.445788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn8cw\" (UniqueName: \"kubernetes.io/projected/77602f55-b8a7-4ad9-810c-7388039d5552-kube-api-access-vn8cw\") pod \"auto-csr-approver-29566704-nfb2r\" (UID: \"77602f55-b8a7-4ad9-810c-7388039d5552\") " pod="openshift-infra/auto-csr-approver-29566704-nfb2r" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.496880 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566704-nfb2r" Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.988306 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566704-nfb2r"] Mar 20 10:24:00 crc kubenswrapper[4971]: I0320 10:24:00.993286 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:24:01 crc kubenswrapper[4971]: I0320 10:24:01.329544 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566704-nfb2r" event={"ID":"77602f55-b8a7-4ad9-810c-7388039d5552","Type":"ContainerStarted","Data":"a97051d691a24edaa6c128d2f52ffe6bcd25e687f70d0c781c59b610e90dc3d5"} Mar 20 10:24:03 crc kubenswrapper[4971]: I0320 10:24:03.353953 4971 generic.go:334] "Generic (PLEG): container finished" podID="77602f55-b8a7-4ad9-810c-7388039d5552" containerID="185ede94fe0e94e715d7fb9d340583c36cc979b9d0705b70d165cffa8fc3f77f" exitCode=0 Mar 20 10:24:03 crc kubenswrapper[4971]: I0320 10:24:03.354140 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566704-nfb2r" event={"ID":"77602f55-b8a7-4ad9-810c-7388039d5552","Type":"ContainerDied","Data":"185ede94fe0e94e715d7fb9d340583c36cc979b9d0705b70d165cffa8fc3f77f"} Mar 20 10:24:04 crc kubenswrapper[4971]: I0320 10:24:04.731737 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566704-nfb2r" Mar 20 10:24:04 crc kubenswrapper[4971]: I0320 10:24:04.814956 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn8cw\" (UniqueName: \"kubernetes.io/projected/77602f55-b8a7-4ad9-810c-7388039d5552-kube-api-access-vn8cw\") pod \"77602f55-b8a7-4ad9-810c-7388039d5552\" (UID: \"77602f55-b8a7-4ad9-810c-7388039d5552\") " Mar 20 10:24:04 crc kubenswrapper[4971]: I0320 10:24:04.822858 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77602f55-b8a7-4ad9-810c-7388039d5552-kube-api-access-vn8cw" (OuterVolumeSpecName: "kube-api-access-vn8cw") pod "77602f55-b8a7-4ad9-810c-7388039d5552" (UID: "77602f55-b8a7-4ad9-810c-7388039d5552"). InnerVolumeSpecName "kube-api-access-vn8cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:24:04 crc kubenswrapper[4971]: I0320 10:24:04.919921 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn8cw\" (UniqueName: \"kubernetes.io/projected/77602f55-b8a7-4ad9-810c-7388039d5552-kube-api-access-vn8cw\") on node \"crc\" DevicePath \"\"" Mar 20 10:24:05 crc kubenswrapper[4971]: I0320 10:24:05.377225 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566704-nfb2r" event={"ID":"77602f55-b8a7-4ad9-810c-7388039d5552","Type":"ContainerDied","Data":"a97051d691a24edaa6c128d2f52ffe6bcd25e687f70d0c781c59b610e90dc3d5"} Mar 20 10:24:05 crc kubenswrapper[4971]: I0320 10:24:05.377267 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97051d691a24edaa6c128d2f52ffe6bcd25e687f70d0c781c59b610e90dc3d5" Mar 20 10:24:05 crc kubenswrapper[4971]: I0320 10:24:05.377273 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566704-nfb2r" Mar 20 10:24:05 crc kubenswrapper[4971]: I0320 10:24:05.812761 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566698-cwb8n"] Mar 20 10:24:05 crc kubenswrapper[4971]: I0320 10:24:05.824944 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566698-cwb8n"] Mar 20 10:24:06 crc kubenswrapper[4971]: I0320 10:24:06.746082 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4208792-3a1c-40e3-a708-e65cf9660096" path="/var/lib/kubelet/pods/d4208792-3a1c-40e3-a708-e65cf9660096/volumes" Mar 20 10:24:18 crc kubenswrapper[4971]: I0320 10:24:18.574858 4971 scope.go:117] "RemoveContainer" containerID="373bf68012a436de380cde7b284fd79d5a8e635168448f791af598596b08c6a5" Mar 20 10:25:27 crc kubenswrapper[4971]: I0320 10:25:27.394198 4971 generic.go:334] "Generic (PLEG): container finished" podID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerID="48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c" exitCode=0 Mar 20 10:25:27 crc kubenswrapper[4971]: I0320 10:25:27.394300 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7xk7/must-gather-n27tq" event={"ID":"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a","Type":"ContainerDied","Data":"48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c"} Mar 20 10:25:27 crc kubenswrapper[4971]: I0320 10:25:27.397818 4971 scope.go:117] "RemoveContainer" containerID="48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c" Mar 20 10:25:28 crc kubenswrapper[4971]: I0320 10:25:28.453054 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7xk7_must-gather-n27tq_74d57ac5-068a-41d4-ae06-ba8cca3b7c1a/gather/0.log" Mar 20 10:25:38 crc kubenswrapper[4971]: I0320 10:25:38.612748 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-x7xk7/must-gather-n27tq"] Mar 20 10:25:38 crc kubenswrapper[4971]: I0320 10:25:38.615228 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-x7xk7/must-gather-n27tq" podUID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerName="copy" containerID="cri-o://9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008" gracePeriod=2 Mar 20 10:25:38 crc kubenswrapper[4971]: I0320 10:25:38.630236 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7xk7/must-gather-n27tq"] Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.039786 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7xk7_must-gather-n27tq_74d57ac5-068a-41d4-ae06-ba8cca3b7c1a/copy/0.log" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.040494 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.206229 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js7st\" (UniqueName: \"kubernetes.io/projected/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-kube-api-access-js7st\") pod \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\" (UID: \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\") " Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.206386 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-must-gather-output\") pod \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\" (UID: \"74d57ac5-068a-41d4-ae06-ba8cca3b7c1a\") " Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.211762 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-kube-api-access-js7st" (OuterVolumeSpecName: 
"kube-api-access-js7st") pod "74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" (UID: "74d57ac5-068a-41d4-ae06-ba8cca3b7c1a"). InnerVolumeSpecName "kube-api-access-js7st". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.309385 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js7st\" (UniqueName: \"kubernetes.io/projected/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-kube-api-access-js7st\") on node \"crc\" DevicePath \"\"" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.480004 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" (UID: "74d57ac5-068a-41d4-ae06-ba8cca3b7c1a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.513363 4971 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.531112 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7xk7_must-gather-n27tq_74d57ac5-068a-41d4-ae06-ba8cca3b7c1a/copy/0.log" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.531390 4971 generic.go:334] "Generic (PLEG): container finished" podID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerID="9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008" exitCode=143 Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.531438 4971 scope.go:117] "RemoveContainer" containerID="9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.531444 4971 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-x7xk7/must-gather-n27tq" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.555202 4971 scope.go:117] "RemoveContainer" containerID="48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.649240 4971 scope.go:117] "RemoveContainer" containerID="9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008" Mar 20 10:25:39 crc kubenswrapper[4971]: E0320 10:25:39.651148 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008\": container with ID starting with 9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008 not found: ID does not exist" containerID="9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.651198 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008"} err="failed to get container status \"9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008\": rpc error: code = NotFound desc = could not find container \"9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008\": container with ID starting with 9a4781ef90ee8bca4c04e64533251b6fc2c121a314ea7b4803d57aea2b318008 not found: ID does not exist" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.651226 4971 scope.go:117] "RemoveContainer" containerID="48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c" Mar 20 10:25:39 crc kubenswrapper[4971]: E0320 10:25:39.651713 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c\": container with ID starting with 
48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c not found: ID does not exist" containerID="48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c" Mar 20 10:25:39 crc kubenswrapper[4971]: I0320 10:25:39.651767 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c"} err="failed to get container status \"48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c\": rpc error: code = NotFound desc = could not find container \"48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c\": container with ID starting with 48e68cc536c5c5d31cb8655a14ca3a279c90674c590cd00ce1a78488002e895c not found: ID does not exist" Mar 20 10:25:40 crc kubenswrapper[4971]: I0320 10:25:40.746485 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" path="/var/lib/kubelet/pods/74d57ac5-068a-41d4-ae06-ba8cca3b7c1a/volumes" Mar 20 10:25:50 crc kubenswrapper[4971]: I0320 10:25:50.174041 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:25:50 crc kubenswrapper[4971]: I0320 10:25:50.174801 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.146309 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566706-c6xpk"] Mar 20 10:26:00 crc kubenswrapper[4971]: E0320 10:26:00.147167 
4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerName="gather" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.147179 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerName="gather" Mar 20 10:26:00 crc kubenswrapper[4971]: E0320 10:26:00.147193 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerName="copy" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.147199 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerName="copy" Mar 20 10:26:00 crc kubenswrapper[4971]: E0320 10:26:00.147227 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77602f55-b8a7-4ad9-810c-7388039d5552" containerName="oc" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.147233 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="77602f55-b8a7-4ad9-810c-7388039d5552" containerName="oc" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.147422 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="77602f55-b8a7-4ad9-810c-7388039d5552" containerName="oc" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.147442 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerName="gather" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.147459 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d57ac5-068a-41d4-ae06-ba8cca3b7c1a" containerName="copy" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.148138 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566706-c6xpk" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.151273 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.151300 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.151341 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.157914 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566706-c6xpk"] Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.202450 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwjmc\" (UniqueName: \"kubernetes.io/projected/9187a152-9627-47d3-975a-f032ba256117-kube-api-access-fwjmc\") pod \"auto-csr-approver-29566706-c6xpk\" (UID: \"9187a152-9627-47d3-975a-f032ba256117\") " pod="openshift-infra/auto-csr-approver-29566706-c6xpk" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.304162 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwjmc\" (UniqueName: \"kubernetes.io/projected/9187a152-9627-47d3-975a-f032ba256117-kube-api-access-fwjmc\") pod \"auto-csr-approver-29566706-c6xpk\" (UID: \"9187a152-9627-47d3-975a-f032ba256117\") " pod="openshift-infra/auto-csr-approver-29566706-c6xpk" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.325244 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwjmc\" (UniqueName: \"kubernetes.io/projected/9187a152-9627-47d3-975a-f032ba256117-kube-api-access-fwjmc\") pod \"auto-csr-approver-29566706-c6xpk\" (UID: \"9187a152-9627-47d3-975a-f032ba256117\") " 
pod="openshift-infra/auto-csr-approver-29566706-c6xpk" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.469822 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566706-c6xpk" Mar 20 10:26:00 crc kubenswrapper[4971]: I0320 10:26:00.938752 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566706-c6xpk"] Mar 20 10:26:01 crc kubenswrapper[4971]: I0320 10:26:01.768453 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566706-c6xpk" event={"ID":"9187a152-9627-47d3-975a-f032ba256117","Type":"ContainerStarted","Data":"f13e0551ce1bdb42ae80a9bdc9812d3985ce583a5f776c6775cf8e7bc158c984"} Mar 20 10:26:02 crc kubenswrapper[4971]: I0320 10:26:02.783275 4971 generic.go:334] "Generic (PLEG): container finished" podID="9187a152-9627-47d3-975a-f032ba256117" containerID="55156c92e400c7671d0113cb15a776e9721daed14bbfb5574035535ff18bd898" exitCode=0 Mar 20 10:26:02 crc kubenswrapper[4971]: I0320 10:26:02.783404 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566706-c6xpk" event={"ID":"9187a152-9627-47d3-975a-f032ba256117","Type":"ContainerDied","Data":"55156c92e400c7671d0113cb15a776e9721daed14bbfb5574035535ff18bd898"} Mar 20 10:26:04 crc kubenswrapper[4971]: I0320 10:26:04.238467 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566706-c6xpk" Mar 20 10:26:04 crc kubenswrapper[4971]: I0320 10:26:04.347640 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwjmc\" (UniqueName: \"kubernetes.io/projected/9187a152-9627-47d3-975a-f032ba256117-kube-api-access-fwjmc\") pod \"9187a152-9627-47d3-975a-f032ba256117\" (UID: \"9187a152-9627-47d3-975a-f032ba256117\") " Mar 20 10:26:04 crc kubenswrapper[4971]: I0320 10:26:04.358131 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9187a152-9627-47d3-975a-f032ba256117-kube-api-access-fwjmc" (OuterVolumeSpecName: "kube-api-access-fwjmc") pod "9187a152-9627-47d3-975a-f032ba256117" (UID: "9187a152-9627-47d3-975a-f032ba256117"). InnerVolumeSpecName "kube-api-access-fwjmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:26:04 crc kubenswrapper[4971]: I0320 10:26:04.450521 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwjmc\" (UniqueName: \"kubernetes.io/projected/9187a152-9627-47d3-975a-f032ba256117-kube-api-access-fwjmc\") on node \"crc\" DevicePath \"\"" Mar 20 10:26:04 crc kubenswrapper[4971]: I0320 10:26:04.806580 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566706-c6xpk" event={"ID":"9187a152-9627-47d3-975a-f032ba256117","Type":"ContainerDied","Data":"f13e0551ce1bdb42ae80a9bdc9812d3985ce583a5f776c6775cf8e7bc158c984"} Mar 20 10:26:04 crc kubenswrapper[4971]: I0320 10:26:04.806655 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f13e0551ce1bdb42ae80a9bdc9812d3985ce583a5f776c6775cf8e7bc158c984" Mar 20 10:26:04 crc kubenswrapper[4971]: I0320 10:26:04.806690 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566706-c6xpk" Mar 20 10:26:05 crc kubenswrapper[4971]: I0320 10:26:05.314169 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566700-dp87m"] Mar 20 10:26:05 crc kubenswrapper[4971]: I0320 10:26:05.327911 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566700-dp87m"] Mar 20 10:26:06 crc kubenswrapper[4971]: I0320 10:26:06.744912 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076c4058-a8ca-4c9a-a140-8d73445634a6" path="/var/lib/kubelet/pods/076c4058-a8ca-4c9a-a140-8d73445634a6/volumes" Mar 20 10:26:18 crc kubenswrapper[4971]: I0320 10:26:18.698494 4971 scope.go:117] "RemoveContainer" containerID="596ed9447a6afcf3015746121e3448b433d94782eb2f456d988033f163907522" Mar 20 10:26:20 crc kubenswrapper[4971]: I0320 10:26:20.162516 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:26:20 crc kubenswrapper[4971]: I0320 10:26:20.163073 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:26:50 crc kubenswrapper[4971]: I0320 10:26:50.162137 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:26:50 crc kubenswrapper[4971]: 
I0320 10:26:50.163000 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:26:50 crc kubenswrapper[4971]: I0320 10:26:50.163090 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" Mar 20 10:26:50 crc kubenswrapper[4971]: I0320 10:26:50.164550 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67879cb7fb1508fd84bfdb179e6919b555772e4bc6efefc714792967baf2ede9"} pod="openshift-machine-config-operator/machine-config-daemon-m26zl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:26:50 crc kubenswrapper[4971]: I0320 10:26:50.164735 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" containerID="cri-o://67879cb7fb1508fd84bfdb179e6919b555772e4bc6efefc714792967baf2ede9" gracePeriod=600 Mar 20 10:26:50 crc kubenswrapper[4971]: I0320 10:26:50.365124 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerID="67879cb7fb1508fd84bfdb179e6919b555772e4bc6efefc714792967baf2ede9" exitCode=0 Mar 20 10:26:50 crc kubenswrapper[4971]: I0320 10:26:50.365173 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerDied","Data":"67879cb7fb1508fd84bfdb179e6919b555772e4bc6efefc714792967baf2ede9"} Mar 20 10:26:50 crc 
kubenswrapper[4971]: I0320 10:26:50.365208 4971 scope.go:117] "RemoveContainer" containerID="919bb2f73e7c6359debd4a73f45868c083b9339eb00f7714958d919a54c82d9f" Mar 20 10:26:51 crc kubenswrapper[4971]: I0320 10:26:51.379332 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" event={"ID":"0c96bfbc-3a6a-44df-9bf6-9f78c587657c","Type":"ContainerStarted","Data":"014280db9d03c7b18313a3bf93ba145481330e58661e9f7dcd597483668abb01"} Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.154197 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566708-4lhvw"] Mar 20 10:28:00 crc kubenswrapper[4971]: E0320 10:28:00.169305 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9187a152-9627-47d3-975a-f032ba256117" containerName="oc" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.169349 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9187a152-9627-47d3-975a-f032ba256117" containerName="oc" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.172767 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9187a152-9627-47d3-975a-f032ba256117" containerName="oc" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.175546 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566708-4lhvw" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.190255 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.190495 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.191000 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5q4xq" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.198954 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566708-4lhvw"] Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.363060 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shnb\" (UniqueName: \"kubernetes.io/projected/1747936e-8610-4ec3-ba07-9d1f594e949c-kube-api-access-8shnb\") pod \"auto-csr-approver-29566708-4lhvw\" (UID: \"1747936e-8610-4ec3-ba07-9d1f594e949c\") " pod="openshift-infra/auto-csr-approver-29566708-4lhvw" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.466863 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shnb\" (UniqueName: \"kubernetes.io/projected/1747936e-8610-4ec3-ba07-9d1f594e949c-kube-api-access-8shnb\") pod \"auto-csr-approver-29566708-4lhvw\" (UID: \"1747936e-8610-4ec3-ba07-9d1f594e949c\") " pod="openshift-infra/auto-csr-approver-29566708-4lhvw" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.489240 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shnb\" (UniqueName: \"kubernetes.io/projected/1747936e-8610-4ec3-ba07-9d1f594e949c-kube-api-access-8shnb\") pod \"auto-csr-approver-29566708-4lhvw\" (UID: \"1747936e-8610-4ec3-ba07-9d1f594e949c\") " 
pod="openshift-infra/auto-csr-approver-29566708-4lhvw" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.507728 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566708-4lhvw" Mar 20 10:28:00 crc kubenswrapper[4971]: I0320 10:28:00.964079 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566708-4lhvw"] Mar 20 10:28:01 crc kubenswrapper[4971]: I0320 10:28:01.138053 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566708-4lhvw" event={"ID":"1747936e-8610-4ec3-ba07-9d1f594e949c","Type":"ContainerStarted","Data":"ef341516b6e2b5b3dfa451834e572f8a67e117dfa10233ec5dc804dd7f514cf8"} Mar 20 10:28:03 crc kubenswrapper[4971]: I0320 10:28:03.155475 4971 generic.go:334] "Generic (PLEG): container finished" podID="1747936e-8610-4ec3-ba07-9d1f594e949c" containerID="78b5444a3e1c4e86e35fb49a738fce7bab7450dc2404fd56daaef1fc9d56a821" exitCode=0 Mar 20 10:28:03 crc kubenswrapper[4971]: I0320 10:28:03.155667 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566708-4lhvw" event={"ID":"1747936e-8610-4ec3-ba07-9d1f594e949c","Type":"ContainerDied","Data":"78b5444a3e1c4e86e35fb49a738fce7bab7450dc2404fd56daaef1fc9d56a821"} Mar 20 10:28:04 crc kubenswrapper[4971]: I0320 10:28:04.570391 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566708-4lhvw" Mar 20 10:28:04 crc kubenswrapper[4971]: I0320 10:28:04.761272 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8shnb\" (UniqueName: \"kubernetes.io/projected/1747936e-8610-4ec3-ba07-9d1f594e949c-kube-api-access-8shnb\") pod \"1747936e-8610-4ec3-ba07-9d1f594e949c\" (UID: \"1747936e-8610-4ec3-ba07-9d1f594e949c\") " Mar 20 10:28:04 crc kubenswrapper[4971]: I0320 10:28:04.770669 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1747936e-8610-4ec3-ba07-9d1f594e949c-kube-api-access-8shnb" (OuterVolumeSpecName: "kube-api-access-8shnb") pod "1747936e-8610-4ec3-ba07-9d1f594e949c" (UID: "1747936e-8610-4ec3-ba07-9d1f594e949c"). InnerVolumeSpecName "kube-api-access-8shnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:28:04 crc kubenswrapper[4971]: I0320 10:28:04.864581 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8shnb\" (UniqueName: \"kubernetes.io/projected/1747936e-8610-4ec3-ba07-9d1f594e949c-kube-api-access-8shnb\") on node \"crc\" DevicePath \"\"" Mar 20 10:28:05 crc kubenswrapper[4971]: I0320 10:28:05.178476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566708-4lhvw" event={"ID":"1747936e-8610-4ec3-ba07-9d1f594e949c","Type":"ContainerDied","Data":"ef341516b6e2b5b3dfa451834e572f8a67e117dfa10233ec5dc804dd7f514cf8"} Mar 20 10:28:05 crc kubenswrapper[4971]: I0320 10:28:05.178523 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef341516b6e2b5b3dfa451834e572f8a67e117dfa10233ec5dc804dd7f514cf8" Mar 20 10:28:05 crc kubenswrapper[4971]: I0320 10:28:05.178551 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566708-4lhvw" Mar 20 10:28:05 crc kubenswrapper[4971]: I0320 10:28:05.653199 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566702-rqtq4"] Mar 20 10:28:05 crc kubenswrapper[4971]: I0320 10:28:05.665503 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566702-rqtq4"] Mar 20 10:28:06 crc kubenswrapper[4971]: I0320 10:28:06.747750 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f94daf-ce3e-408e-9f8b-2adcd85bcfb2" path="/var/lib/kubelet/pods/86f94daf-ce3e-408e-9f8b-2adcd85bcfb2/volumes" Mar 20 10:28:18 crc kubenswrapper[4971]: I0320 10:28:18.844760 4971 scope.go:117] "RemoveContainer" containerID="a208b939f2314073550bd766affb0c875214f6f4d512bd161c75379fdebd9956" Mar 20 10:28:50 crc kubenswrapper[4971]: I0320 10:28:50.164735 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:28:50 crc kubenswrapper[4971]: I0320 10:28:50.165458 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:29:20 crc kubenswrapper[4971]: I0320 10:29:20.162720 4971 patch_prober.go:28] interesting pod/machine-config-daemon-m26zl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:29:20 crc kubenswrapper[4971]: 
I0320 10:29:20.163345 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m26zl" podUID="0c96bfbc-3a6a-44df-9bf6-9f78c587657c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.053900 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jfcj6"] Mar 20 10:29:30 crc kubenswrapper[4971]: E0320 10:29:30.055067 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1747936e-8610-4ec3-ba07-9d1f594e949c" containerName="oc" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.055084 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1747936e-8610-4ec3-ba07-9d1f594e949c" containerName="oc" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.055367 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1747936e-8610-4ec3-ba07-9d1f594e949c" containerName="oc" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.057431 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.065552 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jfcj6"] Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.140903 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7gnn\" (UniqueName: \"kubernetes.io/projected/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-kube-api-access-d7gnn\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.140959 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-utilities\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.141232 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-catalog-content\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.242103 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-catalog-content\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.242238 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d7gnn\" (UniqueName: \"kubernetes.io/projected/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-kube-api-access-d7gnn\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.242268 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-utilities\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.242952 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-utilities\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.242951 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-catalog-content\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.262381 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7gnn\" (UniqueName: \"kubernetes.io/projected/b50d5536-d2cc-4a14-b9f1-2ba27e47a546-kube-api-access-d7gnn\") pod \"community-operators-jfcj6\" (UID: \"b50d5536-d2cc-4a14-b9f1-2ba27e47a546\") " pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.382204 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:30 crc kubenswrapper[4971]: I0320 10:29:30.971427 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jfcj6"] Mar 20 10:29:31 crc kubenswrapper[4971]: I0320 10:29:31.108843 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfcj6" event={"ID":"b50d5536-d2cc-4a14-b9f1-2ba27e47a546","Type":"ContainerStarted","Data":"b5cb88478d04764b5dd2c1aea40c12bb4a1447740628d3ff9c17b0facf674491"} Mar 20 10:29:32 crc kubenswrapper[4971]: I0320 10:29:32.121825 4971 generic.go:334] "Generic (PLEG): container finished" podID="b50d5536-d2cc-4a14-b9f1-2ba27e47a546" containerID="af94ba4ff6cbb0457f0410a4e35b207bd3aab3d51ce223496a5f5928b70e1f9b" exitCode=0 Mar 20 10:29:32 crc kubenswrapper[4971]: I0320 10:29:32.121953 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfcj6" event={"ID":"b50d5536-d2cc-4a14-b9f1-2ba27e47a546","Type":"ContainerDied","Data":"af94ba4ff6cbb0457f0410a4e35b207bd3aab3d51ce223496a5f5928b70e1f9b"} Mar 20 10:29:32 crc kubenswrapper[4971]: I0320 10:29:32.124169 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:29:33 crc kubenswrapper[4971]: I0320 10:29:33.132670 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfcj6" event={"ID":"b50d5536-d2cc-4a14-b9f1-2ba27e47a546","Type":"ContainerStarted","Data":"4d7fc1d90c4ca99cfb6f55ba4397350456ecd504ce1a97268e4c859eb7894f40"} Mar 20 10:29:35 crc kubenswrapper[4971]: I0320 10:29:35.160587 4971 generic.go:334] "Generic (PLEG): container finished" podID="b50d5536-d2cc-4a14-b9f1-2ba27e47a546" containerID="4d7fc1d90c4ca99cfb6f55ba4397350456ecd504ce1a97268e4c859eb7894f40" exitCode=0 Mar 20 10:29:35 crc kubenswrapper[4971]: I0320 10:29:35.161159 4971 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-jfcj6" event={"ID":"b50d5536-d2cc-4a14-b9f1-2ba27e47a546","Type":"ContainerDied","Data":"4d7fc1d90c4ca99cfb6f55ba4397350456ecd504ce1a97268e4c859eb7894f40"} Mar 20 10:29:36 crc kubenswrapper[4971]: I0320 10:29:36.179010 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jfcj6" event={"ID":"b50d5536-d2cc-4a14-b9f1-2ba27e47a546","Type":"ContainerStarted","Data":"019a65858278fcfd231d49618a4ecb4fe3c2b1ee51b69d8789d02a6db28e8e14"} Mar 20 10:29:36 crc kubenswrapper[4971]: I0320 10:29:36.215854 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jfcj6" podStartSLOduration=2.773050639 podStartE2EDuration="6.215832826s" podCreationTimestamp="2026-03-20 10:29:30 +0000 UTC" firstStartedPulling="2026-03-20 10:29:32.123940992 +0000 UTC m=+13194.103815130" lastFinishedPulling="2026-03-20 10:29:35.566723169 +0000 UTC m=+13197.546597317" observedRunningTime="2026-03-20 10:29:36.208466092 +0000 UTC m=+13198.188340240" watchObservedRunningTime="2026-03-20 10:29:36.215832826 +0000 UTC m=+13198.195706964" Mar 20 10:29:40 crc kubenswrapper[4971]: I0320 10:29:40.382933 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:40 crc kubenswrapper[4971]: I0320 10:29:40.383544 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:40 crc kubenswrapper[4971]: I0320 10:29:40.426987 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:41 crc kubenswrapper[4971]: I0320 10:29:41.290962 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jfcj6" Mar 20 10:29:41 crc kubenswrapper[4971]: I0320 
10:29:41.351910 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jfcj6"] Mar 20 10:29:43 crc kubenswrapper[4971]: I0320 10:29:43.242265 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jfcj6" podUID="b50d5536-d2cc-4a14-b9f1-2ba27e47a546" containerName="registry-server" containerID="cri-o://019a65858278fcfd231d49618a4ecb4fe3c2b1ee51b69d8789d02a6db28e8e14" gracePeriod=2